Foam-extend crashes with custom BC, but works fine in Debug mode
March 6, 2020, 19:47
#1
New Member
Arman N
Join Date: Mar 2019
Posts: 13
Rep Power: 7
Hello everyone,
I am using foam-extend 4.0 and have created a custom boundary condition based on constantAlphaContactAngle; it works fine with the standard interFoam solver. When I use this BC with a custom solver based on interFoam, it also works with some simple mesh geometries, but crashes with some slightly more complex meshes and gives the error:
Code:
Segmentation fault (core dumped)
I later recompiled in Debug mode to find out where the crash happens and, surprisingly, the code runs with no errors, yet it still crashes when I switch back to Opt mode. So I couldn't find where the problem stems from. Now the problem is this: running in Debug mode is very slow, and I also couldn't manage to run it in parallel, where I get the following error:
Code:
--------------------------------------------------------------------------
It looks like opal_init failed for some reason; your parallel process is
likely to abort. There are many reasons that a parallel process can
fail during opal_init; some of which are due to configuration or
environment problems. This failure appears to be an internal failure;
here's some additional information (which may only be relevant to an
Open MPI developer):

  opal_shmem_base_select failed
  --> Returned value -1 instead of OPAL_SUCCESS
--------------------------------------------------------------------------
Does anyone have any idea where the problem could be, so that I could run it successfully in Opt mode, or at least parallelize it in Debug mode?
Many thanks,
Arman
March 12, 2020, 16:41
#2
New Member
Arman N
Join Date: Mar 2019
Posts: 13
Rep Power: 7
OK, the problem with the parallel run was caused by an incompatibility between the OpenMPI installed with foam-extend and the system's own OpenMPI. It was solved when I changed:
Code:
${WM_MPLIB:=OPENMPI}
to
Code:
${WM_MPLIB:=SYSTEMOPENMPI}
However, the problem with Opt mode still remains, and I have to run my simulations in Debug mode (which is much slower). I would appreciate it if anyone could help me resolve this issue. Thanks.
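Edit: for reference, this variable is set in foam-extend's etc/bashrc (the exact path and line syntax below are from my install and may differ on yours); after changing it, re-source the environment and rebuild the MPI-dependent libraries (e.g. Pstream) so they link against the system OpenMPI:
Code:
# etc/bashrc (foam-extend 4.0) -- use the system OpenMPI instead of
# the bundled one
#: ${WM_MPLIB:=OPENMPI}; export WM_MPLIB
: ${WM_MPLIB:=SYSTEMOPENMPI}; export WM_MPLIB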
March 14, 2020, 09:20
#3
Senior Member
Daniel
Join Date: Mar 2013
Location: Noshahr, Iran
Posts: 348
Rep Power: 21
Hello,
"segmentation fault" error means that you are accessing a memory location that does not belong to the variable you are working with. Can you describe how do you implement the looping over boundaries? You are probably accessing an element beyond the array. Regarding the BC seemingly working fine in debug mode and failing in Opt, does it produce reasonable result? I don't have enough information about the differences between debug and Opt modes except for some basic aspects, but there is definitely something wrong in your code. In debug mode everything runs slower and probably your variable is still in memory and hence you don't get "segmentation fault" error. Regards, D. Khazaei |
March 14, 2020, 13:58
#4
New Member
Arman N
Join Date: Mar 2019
Posts: 13
Rep Power: 7
Hi Daniel,
Thanks for your response. The part of my code that loops over the boundary in AlphaContactAngleFvPatchScalarField looks like this:
Code:
{
    // Look up the temperature field T from the object registry
    const volScalarField& T =
        this->db().objectRegistry::lookupObject<volScalarField>("T");

    // T on this boundary patch
    const fvPatchScalarField& boundaryT =
        T.boundaryField()[patch().index()];

    // Contact angle (theta0) on the patch
    scalarField theta0(size(), scalar(0));

    // Loop over theta0
    forAll(theta0, i)   // equivalent to: for (int i = 0; i < theta0.size(); i++)
    {
        // theta0 as a linear function of T on the boundary patch
        scalar omega = (boundaryT[i] - lowT_)/(highT_ - lowT_);
        theta0[i] =
            min(max(omega*thetaHighT_ + (1 - omega)*thetaLowT_, scalar(0)),
                scalar(180));
    }

    return theta0;
}
The simulation results (produced in Debug mode) "seem" reasonable, and the BC "apparently" does what it is supposed to do, but I can't verify exactly how reliable the results are. Moreover, I don't think it is only a matter of Debug mode keeping the variables in memory, since the code also works fine in Opt mode with some simple meshes (e.g. a 2D tube).
March 15, 2020, 08:51
#5
Senior Member
Daniel
Join Date: Mar 2013
Location: Noshahr, Iran
Posts: 348
Rep Power: 21
Two questions:
1- How is theta0 returned?
2- On the more complex cases, boundaryT and theta0 somehow don't have the same size. Can you attach a simple reproducible case?
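For 2-, a quick guard dropped in just before the loop would tell you immediately; a sketch using the standard FatalErrorIn/abort idiom, with the surrounding names taken from your snippet:
Code:
// Verify the patch field and theta0 agree in size before indexing
// boundaryT with theta0's loop index.
if (boundaryT.size() != theta0.size())
{
    FatalErrorIn("theta0 evaluation")
        << "size mismatch on patch " << patch().name()
        << ": boundaryT = " << boundaryT.size()
        << ", theta0 = " << theta0.size()
        << abort(FatalError);
}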
March 18, 2020, 10:27
#6
New Member
Arman N
Join Date: Mar 2019
Posts: 13
Rep Power: 7
As I said, I have checked their sizes before. The code loops fine through the first boundary patch (i.e. upperWall), and their sizes are printed out as equal. But then it won't even get to the next boundary (i.e. lowerWall): it crashes before the size check there is reached (the Info statement just before it is never printed). Furthermore, I don't think this is something that Debug mode would simply tolerate and let the program run without any errors.
Personally, I think this might be a bug in foam-extend's Opt mode that does not manifest in Debug mode. Anyway, I'm not a pro at this. Thanks again.
March 22, 2020, 04:51
#7
Senior Member
Gerry Kan
Join Date: May 2016
Posts: 376
Rep Power: 11
Dear Arman:
When a code is compiled with the Debug option, the optimization options are turned off. This is most likely why your code works in Debug mode. You could try recompiling the code in Release (Opt) mode with -O0 first, and then increase the optimization level until the seg fault reappears. I also don't know whether OpenFOAM introduces more robust exception handling in Debug mode, which would make the compiled code more tolerant of, say, coding errors. A quick (but not completely certain) way of checking is to see whether your results in those problematic cases still make sense. For anything deeper you might need a debugger.
Gerry.
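P.S. If it helps, the Opt-build flags live in the wmake rules; a sketch of where to lower the level (the path below is an assumption, pick the rules directory matching your architecture and compiler):
Code:
# wmake/rules/linux64Gcc/c++Opt   (path assumed; adjust to your
# $WM_ARCH/$WM_COMPILER rules directory)
c++DBUG =
c++OPT  = -O0   # start unoptimized; step up to -O1, -O2, -O3,
                # rebuilding each time, until the segfault reappears
Remember to wclean and wmake your library/solver after each change so the new flags are actually picked up.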
September 26, 2020, 02:51
Solved!
#8
New Member
Arman N
Join Date: Mar 2019
Posts: 13
Rep Power: 7
After months, I looked into this issue again and this time I was finally able to figure out what was wrong with my case, with the help of the hints given in these links: here and here
I found out that if I set nOuterCorrectors to 0 in the PIMPLE settings, the code works fine with no errors in Opt mode too. Therefore, the issue must lie in the re-discretization of the momentum equation. The problem was here:
Code:
const volScalarField& T =
    this->db().objectRegistry::lookupObject<volScalarField>("T");
When the outer corrector loop runs, either the indexing sequences or the whole fields change, while the reference to T keeps pointing to something that is no longer there. So when I defined T without the reference (by simply removing the &), the problem went away! I think that in Debug mode the variables are kept in memory during the outer corrector loop, which is why the reference to T still meant something and the code appeared to work; in Opt mode this doesn't happen. Thanks to Daniel and Gerry for their help and insights. Cheers.
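Edit: for anyone hitting the same thing, the before/after in one place (taking a copy instead of holding a reference into the object registry; it costs one field copy per evaluation but cannot dangle):
Code:
// before: a reference into the registry -- it dangles if the
// registered field is replaced during the PIMPLE outer loop
//const volScalarField& T =
//    this->db().objectRegistry::lookupObject<volScalarField>("T");

// after: a copy of the field, taken at lookup time
const volScalarField T =
    this->db().objectRegistry::lookupObject<volScalarField>("T");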
Tags |
constantalphacontactangle, custom boundary condition, debug mode, foam-extend 4.0, segmentation fault |