stop when I run in parallel

Old   June 3, 2010, 14:30
Default DamBreak tutorial
  #21
Member
 
Join Date: Dec 2009
Posts: 39
Rep Power: 17
marval is on a distinguished road
Hi all!

I'm running through the tutorials and have problems with parallel running in the dam break tutorial.

This is the error I get:

Quote:
marco@marco-laptop:~/OpenFOAM/marco-1.6.x/run/tutorials/multiphase/interFoam/laminar/damBreakFine/system$ mpirun -np 4 interFoam -parallel > log &
[1] 27989
marco@marco-laptop:~/OpenFOAM/marco-1.6.x/run/tutorials/multiphase/interFoam/laminar/damBreakFine/system$ [0]
[0]
[0] --> FOAM FATAL ERROR:
[0] Cannot read "/home/marco/OpenFOAM/marco-1.6.x/run/tutorials/multiphase/interFoam/laminar/damBreakFine/system/system/decomposeParDict"
[0]
FOAM parallel run exiting
[0]
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun has exited due to process rank 0 with PID 27990 on
node marco-laptop exiting without calling "finalize". This may
have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------

[1]+ Exit 1 mpirun -np 4 interFoam -parallel > log
And as I understand it, I should change the 'decomposeParDict' file; currently it looks like this:

Quote:
/*--------------------------------*- C++ -*----------------------------------*\
| =========                 |                                                 |
| \\      /  F ield         | OpenFOAM: The Open Source CFD Toolbox           |
|  \\    /   O peration     | Version:  1.6                                   |
|   \\  /    A nd           | Web:      www.OpenFOAM.org                      |
|    \\/     M anipulation  |                                                 |
\*---------------------------------------------------------------------------*/
FoamFile
{
    version     2.0;
    format      ascii;
    class       dictionary;
    location    "system";
    object      decomposeParDict;
}
// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //

numberOfSubdomains 4;

method          simple;

simpleCoeffs
{
    n           ( 2 2 1 );
    delta       0.001;
}

hierarchicalCoeffs
{
    n           ( 1 1 1 );
    delta       0.001;
    order       xyz;
}

metisCoeffs
{
    processorWeights ( 1 1 1 1 );
}

manualCoeffs
{
    dataFile    "";
}

distributed     no;

roots           ( );


// ************************************************************************* //
I think I'm trying to run with more processors (4 processors in the manual) than I have available (2 processors), but I don't know exactly how to fix it (probably in the file).
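If the processor count really is the issue, I guess the change would look something like this in decomposeParDict (just a sketch, splitting into two subdomains along x), followed by re-running decomposePar and using -np 2:
Code:
numberOfSubdomains 2;

method          simple;

simpleCoeffs
{
    n           ( 2 1 1 );
    delta       0.001;
}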

Regards
Marco
marval is offline   Reply With Quote

Old   June 3, 2010, 15:55
Default
  #22
Retired Super Moderator
 
Bruno Santos
Join Date: Mar 2009
Location: Lisbon, Portugal
Posts: 10,981
Blog Entries: 45
Rep Power: 128
wyldckat is a name known to all
Hi Marco!

This has happened to me more than once.
Quote:
Originally Posted by marval View Post
Cannot read "/home/marco/OpenFOAM/marco-1.6.x/run/tutorials/multiphase/interFoam/laminar/damBreakFine/system/system/decomposeParDict"
Just do:
Code:
cd ..
and try again
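That is, from the case directory the full sequence would be something like this (a sketch using the paths from your quote):
Code:
cd ~/OpenFOAM/marco-1.6.x/run/tutorials/multiphase/interFoam/laminar/damBreakFine
decomposePar    # only if the case has not been decomposed yet
mpirun -np 4 interFoam -parallel > log &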


By the way, I've personally grown used to using the foamJob script, so in your case I would use:
Code:
foamJob -s -p interFoam
The advantage of foamJob is that it launches foamExec prior to running the desired solver/utility, thus activating the OpenFOAM environment on the remote machine/node and then going on with the usual OpenFOAM business.

Best regards,
Bruno
__________________
wyldckat is offline   Reply With Quote

Old   June 4, 2010, 17:24
Default
  #23
Member
 
Join Date: Mar 2010
Posts: 31
Rep Power: 16
bunni is on a distinguished road
OK, I'm back. After having installed OpenFOAM 1.6.x, I'm having exactly the same problem running in parallel as I was before. It runs fine on a single processor.

I've tried to attach the output from the screen, which is what I posted above.
I'll try to post the details of the case in another message.
Attached Files
File Type: gz outputlog.gz (997 Bytes, 14 views)
bunni is offline   Reply With Quote

Old   June 4, 2010, 17:41
Default gtz with the file stuff
  #24
Member
 
Join Date: Mar 2010
Posts: 31
Rep Power: 16
bunni is on a distinguished road
Here should be the data to recreate the case. You'll need to run blockMesh on it, but hopefully the rest of the files are there. I've saved it as quart.tgz.gz because the uploader would not take quart.tgz. Therefore:
Code:
mv quart.tgz.gz quart.tgz
tar xvfz quart.tgz

A directory tree called qcyl should then be created; you can descend into it to run blockMesh, etc.

It has been running for days with 1 proc, but crashes immediately with 2 or more. I've got a simple decomposition through the z-plane with 2 procs in the decomposeParDict.
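In decomposeParDict terms, that's something like this (sketched here for reference, in the same format as the file quoted earlier in the thread):
Code:
numberOfSubdomains 2;

method          simple;

simpleCoeffs
{
    n           ( 1 1 2 );
    delta       0.001;
}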

Anyway, thanks for any ideas.
Attached Files
File Type: gz quart.tgz.gz (5.1 KB, 6 views)
bunni is offline   Reply With Quote

Old   June 6, 2010, 07:56
Default
  #25
Retired Super Moderator
 
Bruno Santos
Join Date: Mar 2009
Location: Lisbon, Portugal
Posts: 10,981
Blog Entries: 45
Rep Power: 128
wyldckat is a name known to all
Hi Bunni,

Well, the same things that happened to you have happened to me too. I've confirmed that my OpenMPI is working with OpenFOAM by testing parallelTest and the interFoam/laminar/damBreak case with dual-core parallel execution.
edit: I forgot to mention that I used Ubuntu 8.04 i686 and OpenFOAM 1.6.x

I've managed to solve in part the error you get. Just edit the file "OpenFOAM-1.6.x/etc/settings.sh", find the variable minBufferSize and increase the buffer size:
Code:
# Set the minimum MPI buffer size (used by all platforms except SGI MPI)
# ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
minBufferSize=150000000
And don't forget to start a new terminal and use foamJob to ensure that the proper value is used.
But from the tests I've made, increasing the buffer size only seems to postpone the crash. interFoam always seems to crash during the preparation part of the case. Now I know why other users report that it freezes... in fact, it just spends a really long time playing around with meshes and memory and MPI messages, at least AFAIK, and sooner or later it will crash without doing anything useful.
edit2: yep, 400MB of buffer and it still crashes...
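For reference, if I remember correctly settings.sh only uses minBufferSize to export the MPI_BUFFER_SIZE environment variable, so the same experiment can be done per session without editing the file (a sketch; whether the value reaches remote nodes depends on the MPI setup):
Code:
export MPI_BUFFER_SIZE=400000000
foamJob -s -p interFoam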

So, Bunni, I suggest that you try increasing that buffer variable, in an attempt to avoid the crashing. But my best bet is what I've said to you previously about OpenFOAM 1.6: this seems to be a bug in OpenFOAM, which apparently is yet to be fixed! So please post a bug report on the Bug report part of the OpenFOAM forum: http://www.cfd-online.com/Forums/openfoam-bugs/
If you want to save some time, you can refer to your post #23 from here and onward!

Best regards,
Bruno
__________________

Last edited by wyldckat; June 6, 2010 at 09:27. Reason: missed a few words... and forgot to mention Linux version
wyldckat is offline   Reply With Quote

Old   June 6, 2010, 21:56
Default thanks
  #26
Member
 
Join Date: Mar 2010
Posts: 31
Rep Power: 16
bunni is on a distinguished road
Thanks for checking that. I will post a bug report. I'm running on Fedora and CentOS. I'll check out that variable change. As of right now, it's been running on a single processor without crashing for 4 days, so the run itself is stable.
bunni is offline   Reply With Quote

Old   September 20, 2011, 13:35
Default
  #27
New Member
 
Perry L. Johnson
Join Date: Feb 2011
Location: Orlando, FL, USA
Posts: 17
Rep Power: 15
PerryLJohnson is on a distinguished road
Hello,

I have recently encountered the same problem as Nolwenn and Gonzalo: the solver stops at the first time loop without any error message and without the job exiting the queue (procs still occupied at 100%). I am on OF 1.7.1 (with gcc 4.5.1) on a cluster with RHEL 5.4. The issue only occurs when running large cases; smaller cases work perfectly fine, and there is plenty of memory per node even for the large cases (~50 GB). The parallelTest utility reports fine, as suggested above. Is there any known fix for this issue besides switching compilers? If not, which compiler should I switch to for OF 1.7.1, since there is no default compiler?

Thanks in advance for any helpful advice,
Perry
PerryLJohnson is offline   Reply With Quote

Old   September 20, 2011, 17:36
Default
  #28
Retired Super Moderator
 
Bruno Santos
Join Date: Mar 2009
Location: Lisbon, Portugal
Posts: 10,981
Blog Entries: 45
Rep Power: 128
wyldckat is a name known to all
Greetings Perry,

Before I answer you, I just want to wrap up the solution to bunni's predicament - the thread with the solution is this one: http://www.cfd-online.com/Forums/ope...-parallel.html

Now back to you Perry: OK, when it comes to the issue of compiler version, there are two/three other libraries whose versions are also important, namely: GMP, MPFR and MPC. For example, from my experience, MPFR 3.0.0 doesn't work very well, so I still hang on to the older 2.4.2 version.
As for Gcc 4.5.1, it should work just fine with OpenFOAM 1.7.1. You might, on the other hand, be triggering a couple of old bugs that have been fixed since then. As I vaguely remember, they were related to some issue with cyclic or wedge or some other special type of patch that would crash the solver when used in parallel. Aside from such old bugs, one still needs to use (if I'm not mistaken) the "preservePatches" parameter in decomposeParDict.
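For illustration, that entry would look roughly like this in decomposeParDict (the patch names are just placeholders for the actual cyclic pair):
Code:
preservePatches ( cyclic_half0 cyclic_half1 );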

Either way, I've got a blog post where I'm gathering information on how to run in parallel with OpenFOAM (it's accessible from the link on my signature): Notes about running OpenFOAM in parallel
The ones that might interest you:
Knowing a bit more about the large case might help in trying to isolate the problem, namely:
  • Which decomposition method are you using?
  • What's the solver being used, or, if it's a customized version, which solver(s) is it based on?
  • What kinds of patches are involved? Any cyclic, wedge, baffle, etc...
  • What kind of turbulence models are being used, if any? RAS, LES, laminar or something else?
  • Have you tried gradually scaling up the size of your case? If so, did you take into account the respective calibration of the parameters in controlDict?
Last but not least, any chance of also trying OpenFOAM 2.0.1 or 2.0.x? Because if you are triggering a bug, it'll be easier to get help on this problem on the dedicated bug tracker.

Best regards,
Bruno
__________________
wyldckat is offline   Reply With Quote

Old   September 20, 2011, 18:56
Default
  #29
New Member
 
Perry L. Johnson
Join Date: Feb 2011
Location: Orlando, FL, USA
Posts: 17
Rep Power: 15
PerryLJohnson is on a distinguished road
Quote:
Originally Posted by wyldckat View Post
Greetings Perry,

Now back to you Perry: OK, when it comes to the issue of compiler version, there are two/three other libraries whose versions are also important, namely: GMP, MPFR and MPC. For example, from my experience, MPFR 3.0.0 doesn't work very well, so I still hang on to the older 2.4.2 version.
Can you elaborate concerning what you mean by MPFR 3.0.0 does not work well?

Quote:
Originally Posted by wyldckat View Post
As for Gcc 4.5.1, it should work just fine with OpenFOAM 1.7.1. I might on the other hand, be triggering a couple of old bugs that have been solved since then. As I vaguely remember, they were related to some issues with cyclic or wedge or some other special type of patch, that would crash the solver when used in parallel. Aside from such old bugs, one still needs to use (if I'm not mistaken) the "preservePatches" parameter in decomposeParDict.
There is a cyclic patch in my case; however, a coarser mesh of the same domain and B.C.s runs fine, and I have never run into cyclic problems in past cases with 1.7.1.

Quote:
Originally Posted by wyldckat View Post
Either way, I've got a blog post where I'm gathering information on how to run in parallel with OpenFOAM (it's accessible from the link on my signature): Notes about running OpenFOAM in parallel
The ones that might interest you:
Thanks for the links!

Quote:
Originally Posted by wyldckat View Post
Knowing a bit more about the large case might help in trying to isolate the problem, namely:
  • Which decomposition method are you using?
  • What's the solver being used, or, if it's a customized version, which solver(s) is it based on?
  • What kinds of patches are involved? Any cyclic, wedge, baffle, etc...
  • What kind of turbulence models are being used, if any? RAS, LES, laminar or something else?
  • Have you tried gradually scaling up the size of your case? If so, did you take into account the respective calibration of the parameters in controlDict?
1) I've tried both simple and metis, sometimes metis stops at "Creating mesh for time = 0".

2) I'm using a custom solver based on simpleFoam (with an extra equation for passive scalar transport), but have also tested on simpleFoam itself with no difference.

3) I have one cyclic patch, 3 directMapped patches, and a number of inlets, outlets, and walls.

4) Right now, I'm using RAS, k-w SST.

5) I have not tried scaling the geometry, but I have run the same geometry on a coarser mesh successfully with the same boundary conditions. I only experience this problem on my fine mesh.

Quote:
Originally Posted by wyldckat View Post
Last but not least, any chance of also trying OpenFOAM 2.0.1 or 2.0.x? Because if you are triggering a bug, it'll be easier to get help on this problem on the dedicated bug tracker.

Best regards,
Bruno
This is a possibility if no simpler solutions exist.

Thanks very much for your help,
Perry
PerryLJohnson is offline   Reply With Quote

Old   September 21, 2011, 16:11
Default
  #30
Retired Super Moderator
 
Bruno Santos
Join Date: Mar 2009
Location: Lisbon, Portugal
Posts: 10,981
Blog Entries: 45
Rep Power: 128
wyldckat is a name known to all
Hi Perry,

Quote:
Originally Posted by PerryLJohnson View Post
Can you elaborate concerning what you mean by MPFR 3.0.0 does not work well?
See here as an example: https://github.com/OpenCFD/OpenFOAM-...ttings.sh#L119 - there you can see reference to the gcc+mpfr+gmp+mpc versions defined by default for OpenFOAM 1.7.x, which were defined and tested when 1.7.1 was released. If the Gcc build you are using happens to be linked to MPFR 3.0.0/1, then this might be one of the reasons, since there are some problems with mathematical operations, if I remember correctly.
Oh, here is the link to the makeGcc file on ThirdParty 2.0.x: https://github.com/OpenFOAM/ThirdPar...master/makeGcc - as you can see, Gcc 4.5.x needs MPFR, GMP and MPC to build properly.
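A quick way to check which MPFR a given gcc build actually picked up (a sketch, assuming cc1 is dynamically linked against it):
Code:
ldd $(gcc -print-prog-name=cc1) | grep mpfr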


Quote:
Originally Posted by PerryLJohnson View Post
There is a cyclic patch in my case; however, a coarser mesh of the same domain and B.C.s runs fine, and I have never run into cyclic problems in past cases with 1.7.1.
When the domain is decomposed, one might get lucky and not get the cyclic patch split in half between sub-domains. When luck runs out, the preserve patches parameter is a must.


Quote:
Originally Posted by PerryLJohnson View Post
1) I've tried both simple and metis, sometimes metis stops at "Creating mesh for time = 0".

2) I'm using a custom solver based on simpleFoam (with an extra equation for passive scalar transport), but have also tested on simpleFoam itself with no difference.

3) I have one cyclic patch, 3 directMapped patches, and a number of inlets, outlets, and walls.

4) Right now, I'm using RAS, k-w SST.

5) I have not tried scaling the geometry, but I have run the same geometry on a coarser mesh successfully with the same boundary conditions. I only experience this problem on my fine mesh.
1) Please remind me: then what happens with simple decomposition? When does it stop?
2) OK, then the problem must be elsewhere...
3) Are the directMapped patches also protected by the preserve patches parameter?
4) OK, seems pretty standard...
5) Have you tried visualizing the sub-domains in ParaView, to check where things are being split?

Have you executed checkMesh on the fine resolution mesh before decomposing, to verify if the mesh is OK?

There is an environment variable that OpenMPI uses that is defined in settings.sh... ah, line 347: https://github.com/OpenCFD/OpenFOAM-...ttings.sh#L347 - try increasing that value, perhaps 10x. Although this is only a valid solution in some cases.

And I know I've seen more reports like this before... and if I'm not mistaken, most were related to the patches being split between sub-domains, but my memory hasn't been very trustworthy lately
If my memory gets better, I'll search for what I've read in the past and post here.

...Wait... maybe it's the nonBlocking flag: https://github.com/OpenCFD/OpenFOAM-...ntrolDict#L875 - have you tried with other possibilities for the parameter commsType? I know there was a bug report a while back that was fixed in 2.0.x... here we go, it's in fact related to "directMappedPatch", although it might not affect your case: http://www.openfoam.com/mantisbt/view.php?id=280
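For reference, commsType lives in the global etc/controlDict under OptimisationSwitches (if I remember the layout correctly), so trying one of the alternatives would look something like:
Code:
OptimisationSwitches
{
    commsType       scheduled; // or blocking / nonBlocking
}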

Best regards and good luck!
Bruno
__________________
wyldckat is offline   Reply With Quote

Old   September 21, 2011, 21:05
Default
  #31
New Member
 
Perry L. Johnson
Join Date: Feb 2011
Location: Orlando, FL, USA
Posts: 17
Rep Power: 15
PerryLJohnson is on a distinguished road
Quote:
Originally Posted by wyldckat View Post
Hi Perry,

See here as an example: https://github.com/OpenCFD/OpenFOAM-...ttings.sh#L119 - there you can see reference to the gcc+mpfr+gmp+mpc versions defined by default for OpenFOAM 1.7.x, which were defined and tested when 1.7.1 was released. If the Gcc build you are using happens to be linked to MPFR 3.0.0/1, then this might be one of the reasons, since there are some problems with mathematical operations, if I remember correctly.
Oh, here is the link to the makeGcc file on ThirdParty 2.0.x: https://github.com/OpenFOAM/ThirdPar...master/makeGcc - as you can see, Gcc 4.5.x needs MPFR, GMP and MPC to build properly.
This is a possibility, since the current setup uses MPFR 3.0.0 with Gcc 4.5.

Quote:
Originally Posted by wyldckat View Post
When the domain is decomposed, one might get lucky and not get the cyclic patch split in half between sub-domains. When luck runs out, the preserve patches parameter is a must.


1) Please remind me: then what happens with simple decomposition? When does it stop?
metis stops while building the mesh, which could be well explained by the lack of patch-preservation...thanks for that tip...

simple stops after building the mesh and fields, while starting the first time loop: "Time = 1", as if it is taking hours to complete the U-eqn; it also stops here when running serially (just tested)

Quote:
Originally Posted by wyldckat View Post
2) OK, then the problem must be elsewhere...
3) Are the directMapped patches also protected by the preserve patches parameter?
4) OK, seems pretty standard...
5) Have you tried visualizing the sub-domains in ParaView, to check where things are being split?
As for the preserve patches on the cyclic, I think that may be the issue with metis (it stops while building the mesh), but not for simple decomposition or serial runs (stops while performing U-Eqn at first time step).

Preserving patches for directMapped does not make sense to me, since the directMapped patch is not a shared boundary situation, but rather a case where the inlet looks to the nearest interior cell to a given offset location and finds the value there.

Is there a good way to visualize all of the sub-domains in one paraview session?

Quote:
Originally Posted by wyldckat View Post
Have you executed checkMesh on the fine resolution mesh before decomposing, to verify if the mesh is OK?
Three notifications from check mesh:
1) Two regions not connected by any faces (which is purposeful for my simulation, e.g. one region feeds the other via directMappedPatch).

2) 156 Non-orthogonal faces, but still says OK (max 86.4).

3) 3 skew faces, says that mesh fails, but this has not stopped me in the past. Would you think this problem could be related to 3 skewed faces (max skewness 4.66)? Doesn't seem like skew cells could prevent the solver from running, but I could be wrong...?

Quote:
Originally Posted by wyldckat View Post
There is an environment variable that OpenMPI uses that is defined in settings.sh... ah, line 347: https://github.com/OpenCFD/OpenFOAM-...ttings.sh#L347 - try increasing that value, perhaps 10x. Although this is only a valid solution in some cases.
Already did that

Quote:
Originally Posted by wyldckat View Post
And I know I've seen more reports like this before... and if I'm not mistaken, most were related to the patches being split between sub-domains, but my memory hasn't been very trustworthy lately
If my memory gets better, I'll search for what I've read in the past and post here.
Setting the cyclic patch to be preserved does not fix for simple decomposition (just attempted today).

Quote:
Originally Posted by wyldckat View Post
...Wait... maybe it's the nonBlocking flag: https://github.com/OpenCFD/OpenFOAM-...ntrolDict#L875 - have you tried with other possibilities for the parameter commsType? I know there was a bug report a while back that was fixed in 2.0.x... here we go, it's in fact related to "directMappedPatch", although it might not affect your case: http://www.openfoam.com/mantisbt/view.php?id=280

Best regard and good luck!
Bruno
The bug report states that the 'blocking' option is faulty but that the other two are ok. I have 'nonBlocking' enabled.

Thanks for your continued ideas,
Perry
PerryLJohnson is offline   Reply With Quote

Old   September 22, 2011, 15:41
Default
  #32
Retired Super Moderator
 
Bruno Santos
Join Date: Mar 2009
Location: Lisbon, Portugal
Posts: 10,981
Blog Entries: 45
Rep Power: 128
wyldckat is a name known to all
Hi Perry,

Quote:
Originally Posted by PerryLJohnson View Post
This is a possibility, since the current setup uses MPFR 3.0.0 with Gcc 4.5.
Well, AFAIK that isn't a supported combination of versions for building OpenFOAM, so my first bet would be to play it on the safe side. Any chance there is a gcc 4.4.x or 4.3.x lying around in the systems you have access to?


Quote:
Originally Posted by PerryLJohnson View Post
metis stops while building the mesh, which could be well explained by the lack of patch-preservation...thanks for that tip...
You're welcome
Quote:
Originally Posted by PerryLJohnson View Post
simple stops after building the mesh and fields, while starting the first time loop: "Time = 1", as if it is taking hours to complete the U-eqn; it also stops here when running serially (just tested)
Ah, now we are getting somewhere! If it stops when running in serial/single process, don't expect it to run in parallel! I had forgotten that this was one of the reasons why I asked about the turbulence model... I don't have much experience in this, but I do know that improper definitions for certain characteristic parameters or wrongly set boundary conditions will lead to simulations not running at all, or crashing sooner or later. One such example would be bad initial values for the turbulence models, or for the parameters themselves.

Quote:
Originally Posted by PerryLJohnson View Post
As for the preserve patches on the cyclic, I think that may be the issue with metis (it stops while building the mesh), but not for simple decomposition or serial runs (stops while performing U-Eqn at first time step).

Preserving patches for directMapped does not make sense to me, since the directMapped patch is not a shared boundary situation, but rather a case where the inlet looks to the nearest interior cell to a given offset location and finds the value there.
It's always good to test things, just in case...

Quote:
Originally Posted by PerryLJohnson View Post
Is there a good way to visualize all of the sub-domains in one paraview session?
There are at least two ways of doing this:
  1. Using the internal reader in ParaView 3.8.x or 3.10.x. The internal reader uses the file extension ".foam". Run:
    Code:
    touch case.foam
    and open this file with ParaView. There should be an option in the object inspector to see the decomposed mesh.
    The decomposed mesh will appear in a single mesh volume, as if it were the serial case, but with the exception that it will show the processor boundary surfaces. Using the filters:
    • Extract Surface
      • Extract cells by region
    You can then see only the surfaces between processors.
  2. Using the official reader, which uses the file extension ".OpenFOAM", you'll have to create a file for each processor and open each one manually. You can generate the files for each processor like this:
    Code:
    for a in processor*; do paraFoam -touch -case $a; done
Quote:
Originally Posted by PerryLJohnson View Post
Three notifications from check mesh:
1) Two regions not connected by any faces (which is purposeful for my simulation, e.g. one region feeds the other via directMappedPatch).

2) 156 Non-orthogonal faces, but still says OK (max 86.4).

3) 3 skew faces, says that mesh fails, but this has not stopped me in the past. Would you think this problem could be related to 3 skewed faces (max skewness 4.66)? Doesn't seem like skew cells could prevent the solver from running, but I could be wrong...?
You can try using setSet to remove the damaged cells associated with those faces: http://openfoamwiki.net/index.php/SetSet - the mesh will be missing a few cells, but at least you can verify whether these are the guilty party or not.
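A rough sketch of that idea, assuming checkMesh wrote a faceSet named skewFaces to constant/polyMesh/sets (the set and file names here are only examples): put the following in a batch file, say removeSkew.setSet:
Code:
cellSet keepCells new faceToCell skewFaces any
cellSet keepCells invert
quit
and then run setSet -batch removeSkew.setSet followed by subsetMesh keepCells -overwrite to write out the trimmed mesh.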

Quote:
Originally Posted by PerryLJohnson View Post
Setting the cyclic patch to be preserved does not fix for simple decomposition (just attempted today).
(...)
The bug report states that the 'blocking' option is faulty but that the other two are ok. I have 'nonBlocking' enabled.
As for these two points, my guess is that if it's not working in serial, then it's unlikely it will work in parallel...

Best regards and good luck!
Bruno
__________________
wyldckat is offline   Reply With Quote

Old   September 22, 2011, 23:40
Default
  #33
New Member
 
Perry L. Johnson
Join Date: Feb 2011
Location: Orlando, FL, USA
Posts: 17
Rep Power: 15
PerryLJohnson is on a distinguished road
Bruno,

Quote:
Originally Posted by wyldckat View Post
Well, AFAIK that isn't a supported combination of versions for building OpenFOAM, so my first bet would be to play it on the safe side. Any chance there is a gcc 4.4.x or 4.3.x lying around in the systems you have access to?
Tested on machine with gcc 4.4, problem remains...

Quote:
Originally Posted by wyldckat View Post
Ah, now we are getting somewhere! If it stops when running in serial/single process, don't expect it to run in parallel! I had forgotten that this was one of the reasons why I asked about the turbulence model... I don't have much experience in this, but I do know that improper definitions for certain characteristic parameters or wrongly set boundary conditions will lead to simulations not running at all, or crashing sooner or later. One such example would be bad initial values for the turbulence models, or for the parameters themselves.
It runs perfectly fine with the exact same setup on the coarser mesh (I literally replace the polyMesh directory and it changes from working to not working).

I have narrowed it down to one of the directMapped B.C.'s (the other two are fine). When I switch it to fixedValue, everything works fine. Switch it back to directMapped, and it stalls at the first time loop. The checkMesh utility gives me a cellToRegion file, suggesting that I should use the splitMeshRegions utility and use two different regions. The problematic directMapped boundary pulls its data from a separate domain of the flow that is not connected geometrically. Could this explain the problems I am having? (I am completely unfamiliar with multiple regions in OF, other than the little reading I have done today.)

Quote:
*Number of regions: 2
The mesh has multiple regions which are not connected by any face.
<<Writing region information to "0/cellToRegion"
As always, thanks for the insight you provide,

Regards,
Perry
PerryLJohnson is offline   Reply With Quote

Old   September 23, 2011, 15:36
Default
  #34
Retired Super Moderator
 
Bruno Santos
Join Date: Mar 2009
Location: Lisbon, Portugal
Posts: 10,981
Blog Entries: 45
Rep Power: 128
wyldckat is a name known to all
Hi Perry,

Mmm... well, at least MPFR doesn't seem to be the one to blame... for now

And this is getting further into details that I'm not very familiar with either. Did checkMesh on the coarser mesh give you the same information, that it should divide the mesh into two separate regions?

For multiple regions, I only know about two solvers that should support this (I don't know if all other solvers support this or not); they are (and respective tutorials):
If I'm not mistaken, a boundary condition of type "Fan" was introduced in OpenFOAM 2.0.x, which might be the kind of boundary you're looking for, although I'm not so sure... Here's a thread that asks about it, although no specific information is posted there: http://www.cfd-online.com/Forums/ope...enfoam200.html

Best regards and Good luck!
Bruno
__________________
wyldckat is offline   Reply With Quote

Old   September 24, 2011, 15:51
Default
  #35
New Member
 
Perry L. Johnson
Join Date: Feb 2011
Location: Orlando, FL, USA
Posts: 17
Rep Power: 15
PerryLJohnson is on a distinguished road
Bruno,

I really appreciate all your help with this issue, and the side-tips along the way. I narrowed it down to the influence of the directMapped boundary condition with the fine mesh I was using, so I played around with meshing until I got one to work. I seem to have resolved the issue just by using a different mesh.

Sincere regards,
Perry
PerryLJohnson is offline   Reply With Quote

Old   May 26, 2017, 18:36
Default non-interactive parallel run
  #36
New Member
 
elham usefi
Join Date: Apr 2016
Location: tabriz,iran
Posts: 13
Rep Power: 10
elham usefi is on a distinguished road
Greetings all!
I have installed OF-2.4.0 (with gcc-4.8.1, gmp-5.1.2, mpc-1.0.1, mpfr-3.1.2) on a cluster with CentOS 6.5, following these instructions:
HTML Code:
https://openfoamwiki.net/index.php/Installation/Linux/OpenFOAM-2.3.0/CentOS_SL_RHEL
I'm running the pitzDaily tutorial. Both serial and parallel runs go perfectly. The problem comes when I want to run non-interactively: serial runs keep going after I close the PuTTY window, but parallel runs don't (no matter how many processors I use).
I use this command:
Code:
$ nohup foamJob -s -p simpleFoam &
The system OpenMPI is 1.6.2, and I've installed OF with both OpenMPI 1.8.5 and 1.6.2, but the problem is the same!
nohup.out contains:
Code:
3 total processes killed (some possibly by mpirun during cleanup)".
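I have also seen suggestions to detach the job completely from the terminal (redirect stdin and disown it); something like this, sketched here and not yet tested, with the processor count only an example:
Code:
nohup mpirun -np 4 simpleFoam -parallel < /dev/null > log 2>&1 &
disown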
Does anybody have any idea what's happening?
elham usefi is offline   Reply With Quote

Old   March 21, 2021, 05:56
Wink
  #37
New Member
 
Febriyan Prayoga
Join Date: Apr 2016
Location: Seoul
Posts: 21
Rep Power: 10
febriyan91 is on a distinguished road
Quote:
Originally Posted by wyldckat View Post
Hi Bunni,

Well, the same things that happened to you have happened to me too. I've confirmed that my OpenMPI is working with OpenFOAM by testing parallelTest and the interFoam/laminar/damBreak case with dual-core parallel execution.
edit: I forgot to mention that I used Ubuntu 8.04 i686 and OpenFOAM 1.6.x

I've managed to solve in part the error you get. Just edit the file "OpenFOAM-1.6.x/etc/settings.sh", find the variable minBufferSize and increase the buffer size:
Code:
# Set the minimum MPI buffer size (used by all platforms except SGI MPI)
# ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
minBufferSize=150000000
And don't forget to start a new terminal and use foamJob to ensure that the proper value is used.
But from the tests I've made, increasing the buffer size only seems to postpone the crash. interFoam always seems to crash during the preparation part of the case. Now I know why other users report that it freezes... in fact, it just spends a really long time playing around with meshes and memory and MPI messages, at least AFAIK, and sooner or later it will crash without doing anything useful.
edit2: yep, 400MB of buffer and it still crashes...

So, Bunni, I suggest that you try increasing that buffer variable, in an attempt to avoid the crashing. But my best bet is what I've said to you previously about OpenFOAM 1.6: this seems to be a bug in OpenFOAM, which apparently is yet to be fixed! So please post a bug report on the Bug report part of the OpenFOAM forum: http://www.cfd-online.com/Forums/openfoam-bugs/
If you want to save some time, you can refer to your post #23 from here and onward!

Best regards,
Bruno

Hi Bruno, I am using OF v2012 on Ubuntu 20.04 Focal Fossa.
I could not find a file "settings.sh" in the "OpenFOAM-v2012/etc" folder. I found a "settings" file inside the "OpenFOAM-v2012/etc/config.sh" and "OpenFOAM-v2012/etc/config.csh" folders, but unfortunately I could not find the string "minBufferSize" in either.
I wonder whether my OpenFOAM installation is correct or not. I followed this instruction: http://openfoamwiki.net/index.php/In...M-v1806/Ubuntu
I changed the version string v1806 to v2012. I attach the settings files.

I would greatly appreciate it if you could give me some hints.

Thank you in advance..
Attached Files
File Type: txt settings from folder "config.csh".txt (10.1 KB, 0 views)
File Type: txt settings from folder "config.sh".txt (9.4 KB, 0 views)
febriyan91 is offline   Reply With Quote
