Home > Forums > Software User Forums > OpenFOAM > OpenFOAM Community Contributions

[PyFoam] running pyFoam(Plot)Runner.py in parallel



Old   April 10, 2013, 04:44
Default running pyFoam(Plot)Runner.py in parallel
  #1
New Member
 
Join Date: Sep 2012
Location: Germany
Posts: 25
Hello everybody!

I am having trouble starting a case in parallel (only locally, on one machine with multiple cores). If I type
Code:
pyFoamRunner.py --procnr=2 simpleFoam
within my case I receive this error message:
Code:
PyFoam WARNING on line 144 of file /home/fem/OpenFOAM/PyFOAM-0.6.0/lib/python2.7/site-packages/PyFoam/Execution/ParallelExecution.py : which can not find a match for simpleFoam . Hoping for the best 
[M21556:06485] Warning: could not find environment variable "FOAM_MPI_LIBBIN"
[M21556:06485] Warning: could not find environment variable "MPI_B"
--------------------------------------------------------------------------
mpirun was unable to launch the specified application as it could not find an executable:

Executable: UFFER_SIZE
Node: M21556

while attempting to start process rank 0.
--------------------------------------------------------------------------
2 total processes failed to start
Killing PID 6485
It doesn't work with any of the parallel options (--autosense-parallel, --procnr=N).

The weird thing is:
Without the parallel options (thus on a single core) pyFoamRunner.py works properly!
And on top:
If I start a parallel run "manually" with
Code:
mpirun -np 2 simpleFoam -parallel
it works without any problems, too!

Does anyone know what to do? I really like PyFoam and would like to keep using it. Thanks in advance.


Regards
Sebastian

Old   April 10, 2013, 05:45
Default
  #2
Assistant Moderator
 
Bernhard Gschaider
Join Date: Mar 2009
Posts: 4,225
Quote:
Originally Posted by Studi View Post
I am having trouble starting a case in parallel (only locally, on one machine with multiple cores). If I type pyFoamRunner.py --procnr=2 simpleFoam within my case I receive this error message: […]

The weird thing is: without the parallel options (thus on a single core) pyFoamRunner.py works properly! And if I start a parallel run "manually" with mpirun -np 2 simpleFoam -parallel, it works without any problems, too!
Hm. That is strange. The first warning says that "which simpleFoam" doesn't find an executable. Then it says that there is also no environment variable FOAM_MPI_LIBBIN (which may be OK for newer OF-installations). But then comes the weird part: the missing MPI_B and UFFER_SIZE (which it thinks is the executable) should be one string: MPI_BUFFER_SIZE.

I think there is something problematic with the settings:
http://openfoamwiki.net/index.php/Co...yFoam#Settings

Check with pyFoamDumpConfiguration.py and look for the [MPI]-section. It should look something like this:
Code:
options_openmpi_post: ["-x","PATH","-x","LD_LIBRARY_PATH","-x","WM_PROJECT_DIR","-x","PYTHONPATH","-x","FOAM_MPI_LIBBIN","-x","MPI_BUFFER_SIZE","-x","MPI_ARCH_PATH"]
openmpi_add_prefix: False
options_openmpi_pre: ["--mca","pls","rsh","--mca","pls_rsh_agent","rsh"]
(especially MPI_BUFFER_SIZE should be one string)
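As a purely hypothetical illustration (this is not actual PyFoam code, and the exact rc-edit that caused the problem is not shown in the thread): the option lists in the [MPI] section are written as Python-style lists of strings, so a single stray quote or comma is enough to split the one argument MPI_BUFFER_SIZE into the two fragments seen in the error, where mpirun receives "-x MPI_B" and then treats the orphaned "UFFER_SIZE" as the executable:

```python
# Hypothetical sketch: how one broken entry in the [MPI] option list
# turns a single "-x MPI_BUFFER_SIZE" argument into two argv entries.
import ast

good = '["-x", "MPI_BUFFER_SIZE", "-x", "MPI_ARCH_PATH"]'
# A stray "," inside the string splits MPI_BUFFER_SIZE in two:
bad = '["-x", "MPI_B","UFFER_SIZE", "-x", "MPI_ARCH_PATH"]'

print(ast.literal_eval(good))  # four entries, as intended
print(ast.literal_eval(bad))   # five entries: "MPI_B" and "UFFER_SIZE" separate
```

With the bad list, mpirun would see "-x MPI_B" (hence the warning about a missing MPI_B variable) followed by a bare UFFER_SIZE, which it interprets as the program to launch.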

To check which call to mpirun is actually used, you can add this configuration option (it will also print a lot of other things):
Code:
[Debug]
ParallelExecution: True
If MPI_BUFFER_SIZE looks OK in your configuration then I'm a bit surprised. What shell ("echo $SHELL") do you use?
__________________
Note: I don't use "Friend"-feature on this forum out of principle. Ah. And by the way: I'm not on Facebook either. So don't be offended if I don't accept your invitation/friend request

Old   April 10, 2013, 07:59
Default
  #3
Senior Member
 
Jose Rey
Join Date: Oct 2012
Posts: 134
This is working for me with OF-2.2 and PyFoam-0.6.0:
Code:
pyFoamPlotRunner.py mpirun -np 12 simpleFoam -parallel | tee log/simpleFoam.log
I even have the pipe-tee in there working to get the log. The 12 is because I am hyperthreading on an i7-3930k, and for some reason (maybe my specific setup) it works better than just 6 cores.

Old   April 10, 2013, 08:39
Default
  #4
Assistant Moderator
 
Bernhard Gschaider
Join Date: Mar 2009
Posts: 4,225
Quote:
Originally Posted by JR22 View Post
This is working for me with OF-2.2 and PyFoam-0.6.0:
Code:
pyFoamPlotRunner.py mpirun -np 12 simpleFoam -parallel | tee log/simpleFoam.log
I even have the pipe-tee in there working to get the log. The 12 is because I am hyperthreading on an i7-3930k, and for some reason (maybe my specific setting) it works better than just 6 cores.
The tee is superfluous: pyFoamPlotRunner.py automatically generates a file PyFoamRunner.mpirun.logfile. And usually the --procnr=X options work quite fine and can easily be adapted via the configuration options if they're not working on your system. The problem Studi has is that one of the options automatically passed to mpirun is "broken" (these options can be quite useful when mpirun starts the run on two different physical machines, as some environment variables are otherwise not passed to "the other side").

Old   April 10, 2013, 10:18
Default
  #5
New Member
 
Join Date: Sep 2012
Location: Germany
Posts: 25
Hello gschaider!

Direct hit on the first shot! Yesterday I edited the pyfoamrc file to get things working on different machines over a network. I reverted the changes and now it works.
There are two warnings again (see appended log), but the solver starts nonetheless.
I've read about the deprecated parameter on the OpenMPI homepage, but as long as it is working, it won't be a concern of mine. The same applies to FOAM_MPI_LIBBIN.
Of course any suggestions for improvement are very welcome nevertheless!

Thanks a lot for helping me with this issue!

Speaking of parallel computing with different machines: is it necessary to distribute the corresponding data to every node (by copying data to every node, or via NFS)? I've read that MPI can't "push" the data to every node automatically...


Regards
Sebastian

Code:
PyFoam WARNING on line 144 of file /home/fem/OpenFOAM/PyFOAM-0.6.0/lib/python2.7/site-packages/PyFoam/Execution/ParallelExecution.py : which can not find a match for simpleFoam . Hoping for the best 
--------------------------------------------------------------------------
A deprecated MCA parameter value was specified in the environment or
on the command line.  Deprecated MCA parameters should be avoided;
they may disappear in future releases.

Deprecated parameter: pls_rsh_agent
--------------------------------------------------------------------------
[M21556:04881] Warning: could not find environment variable "FOAM_MPI_LIBBIN"
--------------------------------------------------------------------------
A deprecated MCA parameter value was specified in the environment or
on the command line.  Deprecated MCA parameters should be avoided;
they may disappear in future releases.

Deprecated parameter: pls_rsh_agent
--------------------------------------------------------------------------
--------------------------------------------------------------------------
A deprecated MCA parameter value was specified in the environment or
on the command line.  Deprecated MCA parameters should be avoided;
they may disappear in future releases.

Deprecated parameter: pls_rsh_agent
--------------------------------------------------------------------------
/*---------------------------------------------------------------------------*\
| =========                 |                                                 |
| \\      /  F ield         | OpenFOAM: The Open Source CFD Toolbox           |
|  \\    /   O peration     | Version:  2.1.1                                 |
|   \\  /    A nd           | Web:      www.OpenFOAM.org                      |
|    \\/     M anipulation  |                                                 |
\*---------------------------------------------------------------------------*/
Build  : 2.1.1-221db2718bbb
Exec   : simpleFoam -parallel
Date   : Apr 10 2013
Time   : 14:55:32
Host   : "M21556"
PID    : 4884
Case   : /home/fem/Berechnung/tetraMesh/tetraMesh10pyF
nProcs : 2
Slaves :
1
(
"M21556.4885"
)

Pstream initialized with:
floatTransfer     : 0
nProcsSimpleSum   : 0
commsType         : nonBlocking
sigFpe : Enabling floating point exception trapping (FOAM_SIGFPE).
fileModificationChecking : Monitoring run-time modified files using timeStampMaster
allowSystemOperations : Disallowing user-supplied system call operations

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time

Create mesh for time = 0

Reading field p

Reading field U

Reading/calculating face flux field phi

Selecting incompressible transport model powerLaw
Selecting RAS turbulence model laminar
No field sources present


Starting time loop

Old   April 10, 2013, 13:20
Default
  #6
Assistant Moderator
 
Bernhard Gschaider
Join Date: Mar 2009
Posts: 4,225
Quote:
Originally Posted by Studi View Post
Hello gschaider!

Direct hit on the first shot! Yesterday I edited the pyfoamrc file to get things working on different machines over a network. I reverted the changes and now it works. […]
The hardcoded options are not necessarily the best, only the ones that worked for me on most machines. But you can easily override them (even on a per-OF-version basis if you have different MPIs for different versions)

Quote:
Originally Posted by Studi View Post
Is it necessary to distribute the corresponding data to every node (by copying data to every node, or via NFS)? I've read that MPI can't "push" the data to every node automatically... […]
Every processor has to be able to "see" its processorX-directory. If they're all in the same NFS-directory and every node can access it, then you're fine. The problem is that for a large number of processors NFS might be the bottleneck, and you'll want to distribute these directories onto multiple machines. But I haven't done that and would suggest you ask elsewhere on the board.

Old   March 31, 2014, 15:15
Default which can't find my solver
  #7
Member
 
Ripudaman Manchanda
Join Date: May 2013
Posts: 55
I have created my own solver, based on solidDisplacementFoam in 2.3.x. I get the first warning shown above:
Code:
 PyFoam WARNING on line 144 of file /usr/local/lib/python2.7/dist-packages/PyFoam/Execution/ParallelExecution.py : which can not find a match for convergeFracWidthFoam . Hoping for the best
I am concerned about the implications of this warning.

Thanks in advance.

Old   March 31, 2014, 20:51
Default
  #8
Assistant Moderator
 
Bernhard Gschaider
Join Date: Mar 2009
Posts: 4,225
Quote:
Originally Posted by ripudaman View Post
I have created my own solver which is based on solidDisplacementFoam in 2.3.x. I get the first warning that has been shown above:
Code:
 PyFoam WARNING on line 144 of file /usr/local/lib/python2.7/dist-packages/PyFoam/Execution/ParallelExecution.py : which can not find a match for convergeFracWidthFoam . Hoping for the best
I am concerned regarding the implications of this warning.

Thanks in advance.
If it runs, then everything is fine. The main purpose of this warning is to give a hint in case the solver really is not found.

Old   March 31, 2014, 23:56
Default
  #9
Member
 
Ripudaman Manchanda
Join Date: May 2013
Posts: 55
The code is able to find my solver. However, it does not run as it should. I tried replacing my modified solver (convergeFracWidthFoam) with solidDisplacementFoam, and the code worked using this command:
Code:
run=BasicRunner(argv=["solidDisplacementFoam","-case",work.name],lam=machine)
run.start()
as well as this code:
Code:
run=AnalyzedRunner(CONVERGED,silent=True,argv=["solidDisplacementFoam","-case",work.name],lam=machine)
run.start()
where the object CONVERGED is a custom LogAnalyzer object.

However, when I replace solidDisplacementFoam with convergeFracWidthFoam in either of the above options, the code does not go through the iterations. In fact, for the BasicRunner case it gives me the following error:
Code:
 PyFoam WARNING on line 144 of file /usr/local/lib/python2.7/dist-packages/PyFoam/Execution/ParallelExecution.py : which can not find a match for convergeFracWidthFoam . Hoping for the best 
--------------------------------------------------------------------------
A deprecated MCA parameter value was specified in the environment or
on the command line.  Deprecated MCA parameters should be avoided;
they may disappear in future releases.

Deprecated parameter: pls_rsh_agent
--------------------------------------------------------------------------
[ubuntu:08918] Warning: could not find environment variable "PYTHONPATH"
[ubuntu:08918] Warning: could not find environment variable "FOAM_MPI_LIBBIN"
--------------------------------------------------------------------------
A deprecated MCA parameter value was specified in the environment or
on the command line.  Deprecated MCA parameters should be avoided;
they may disappear in future releases.

Deprecated parameter: pls_rsh_agent
--------------------------------------------------------------------------
--------------------------------------------------------------------------
A deprecated MCA parameter value was specified in the environment or
on the command line.  Deprecated MCA parameters should be avoided;
they may disappear in future releases.

Deprecated parameter: pls_rsh_agent
--------------------------------------------------------------------------
--------------------------------------------------------------------------
A deprecated MCA parameter value was specified in the environment or
on the command line.  Deprecated MCA parameters should be avoided;
they may disappear in future releases.

Deprecated parameter: pls_rsh_agent
--------------------------------------------------------------------------
--------------------------------------------------------------------------
A deprecated MCA parameter value was specified in the environment or
on the command line.  Deprecated MCA parameters should be avoided;
they may disappear in future releases.

Deprecated parameter: pls_rsh_agent
--------------------------------------------------------------------------
/*---------------------------------------------------------------------------*\
| =========                 |                                                 |
| \\      /  F ield         | OpenFOAM: The Open Source CFD Toolbox           |
|  \\    /   O peration     | Version:  2.3.x                                 |
|   \\  /    A nd           | Web:      www.OpenFOAM.org                      |
|    \\/     M anipulation  |                                                 |
\*---------------------------------------------------------------------------*/
Build  : 2.3.x-e0d5f5a218ab
Exec   : convergeFracWidthFoam -case /home/ripuvm/OpenFOAM/ripuvm-2.3.x/multiFrac/cases/noFracTraj/try2 -parallel
Date   : Mar 31 2014
Time   : 21:47:12
Host   : "ubuntu"
PID    : 8921
Case   : /home/ripuvm/OpenFOAM/ripuvm-2.3.x/multiFrac/cases/noFracTraj/try2
nProcs : 4
Slaves :
3
(
"ubuntu.8922"
"ubuntu.8923"
"ubuntu.8924"
)

Pstream initialized with:
floatTransfer      : 0
nProcsSimpleSum    : 0
commsType          : nonBlocking
polling iterations : 0
sigFpe : Enabling floating point exception trapping (FOAM_SIGFPE).
fileModificationChecking : Monitoring run-time modified files using timeStampMaster
allowSystemOperations : Disallowing user-supplied system call operations

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time

Create mesh for time = 0

Reading mechanical properties

Normalising E : E/rho

Calculating Lame's coefficients

Plane Strain

Reading thermal properties

Reading field D

Calculating stress field sigmaDex

Calculating stress field sigmaD

Calculating explicit part of div(sigma) divSigmaExp


Calculating displacement field

Iteration: 1

Time = 1


One = 0  Two = 0

One = 0  Two = 0

One = 0  Two = 0
[1] #0  [2] #0  Foam::error::printStack(Foam::Ostream&)Foam::error::printStack(Foam::Ostream&)[3] #0  Foam::error::printStack(Foam::Ostream&) in "/opt/OpenFOAM/OpenFOAM-2.3.x/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[1] #1  Foam::sigFpe::sigHandler(int) in "/opt/OpenFOAM/OpenFOAM-2.3.x/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[2] #1  Foam::sigFpe::sigHandler(int) in "/opt/OpenFOAM/OpenFOAM-2.3.x/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[3] #1  Foam::sigFpe::sigHandler(int) in "/opt/OpenFOAM/OpenFOAM-2.3.x/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[1] # in "/opt/OpenFOAM/OpenFOAM-2.3.x/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[3] #2  2   in "/opt/OpenFOAM/OpenFOAM-2.3.x/platforms/linux64GccDPOpt/lib/libOpenFOAM.so"
[2] #2   in "/lib/x86_64-linux-gnu/libc.so.6"
[1] #3   in "/lib/x86_64-linux-gnu/libc.so.6"
[3] #3   in "/lib/x86_64-linux-gnu/libc.so.6"
[2] #3


[3]  in "/home/ripuvm/OpenFOAM/ripuvm-2.3.x/platforms/linux64GccDPOpt/bin/convergeFracWidthFoam"
[3] #4  __libc_start_main[1]  in "/home/ripuvm/OpenFOAM/ripuvm-2.3.x/platforms/linux64GccDPOpt/bin/convergeFracWidthFoam"
[1] #4  __libc_start_main[2]  in "/home/ripuvm/OpenFOAM/ripuvm-2.3.x/platforms/linux64GccDPOpt/bin/convergeFracWidthFoam"
[2] #4  __libc_start_main in "/lib/x86_64-linux-gnu/libc.so.6"
[3] #5   in "/lib/x86_64-linux-gnu/libc.so.6"
[2] #5   in "/lib/x86_64-linux-gnu/libc.so.6"
[1] #5


[1] [3]  in "/home/ripuvm/OpenFOAM/ripuvm-2.3.x/platforms/linux64GccDPOpt/bin/convergeFracWidthFoam"
[ubuntu:08924] *** Process received signal ***
[ubuntu:08924] Signal: Floating point exception (8)
[ubuntu:08924] Signal code:  (-6)
[ubuntu:08924] Failing at address: 0x3e8000022dc
in "/home/ripuvm/OpenFOAM/ripuvm-2.3.x/platforms/linux64GccDPOpt/bin/convergeFr[ubuntu:08924] [ 0] /lib/x86_64-linux-gnu/libc.so.6(+0x364a0) [0x7f5ee6fb44a0]
[ubuntu:08924] [ 1] /lib/x86_64-linux-gnu/libc.so.6(gsignal+0x35) [0x7f5ee6fb4425]
[ubuntu:08924] [ 2] /lib/x86_64-linux-gnu/libc.so.6(+0x364a0) [0x7f5ee6fb44a0]
[ubuntu:08924] [ 3] convergeFracWidthFoam() [0x42b989]
[ubuntu:08924] [ 4] /lib/x86_64-linux-gnu/libc.so.6(__libc_start_main+0xed) [0x7f5ee6f9f76d]
[ubuntu:08924] [ 5] convergeFracWidthFoam() [0x43523d]
[ubuntu:08924] *** End of error message ***
acWidthFoam"
[ubuntu:08922] *** Process received signal ***
[ubuntu:08922] Signal: Floating point exception (8)
[ubuntu:08922] Signal code:  (-6)
[ubuntu:08922] Failing at address: 0x3e8000022da
[ubuntu:08922] [ 0] /lib/x86_64-linux-gnu/libc.so.6(+0x364a0) [0x7f7c0749b4a0]
[ubuntu:08922] [ 1] /lib/x86_64-linux-gnu/libc.so.6(gsignal+0x35) [0x7f7c0749b425]
[ubuntu:08922] [ 2] /lib/x86_64-linux-gnu/libc.so.6(+0x364a0) [0x7f7c0749b4a0]
[ubuntu:08922] [ 3] convergeFracWidthFoam() [0x42b989]
[ubuntu:08922] [ 4] /lib/x86_64-linux-gnu/libc.so.6(__libc_start_main+0xed) [0x7f7c0748676d]
[ubuntu:08922] [ 5] convergeFracWidthFoam() [0x43523d]
[ubuntu:08922] *** End of error message ***
[2]  in "/home/ripuvm/OpenFOAM/ripuvm-2.3.x/platforms/linux64GccDPOpt/bin/convergeFracWidthFoam"
[ubuntu:08923] *** Process received signal ***
[ubuntu:08923] Signal: Floating point exception (8)
[ubuntu:08923] Signal code:  (-6)
[ubuntu:08923] Failing at address: 0x3e8000022db
[ubuntu:08923] [ 0] /lib/x86_64-linux-gnu/libc.so.6(+0x364a0) [0x7f04fcf574a0]
[ubuntu:08923] [ 1] /lib/x86_64-linux-gnu/libc.so.6(gsignal+0x35) [0x7f04fcf57425]
[ubuntu:08923] [ 2] /lib/x86_64-linux-gnu/libc.so.6(+0x364a0) [0x7f04fcf574a0]
[ubuntu:08923] [ 3] convergeFracWidthFoam() [0x42b989]
[ubuntu:08923] [ 4] /lib/x86_64-linux-gnu/libc.so.6(__libc_start_main+0xed) [0x7f04fcf4276d]
[ubuntu:08923] [ 5] convergeFracWidthFoam() [0x43523d]
[ubuntu:08923] *** End of error message ***
--------------------------------------------------------------------------
mpirun noticed that process rank 1 with PID 8922 on node ubuntu exited on signal 8 (Floating point exception).
--------------------------------------------------------------------------
3 total processes killed (some possibly by mpirun during cleanup)
The code works without the decomposition, though.

I understand that this is a problem with creating a solver that can run in parallel. Can you help me out here?

Old   April 1, 2014, 05:29
Default
  #10
Assistant Moderator
 
Bernhard Gschaider
Join Date: Mar 2009
Posts: 4,225
Quote:
Originally Posted by ripudaman View Post
The code is able to find my solver. However, it does not run as it should. […] The code works without the decomposition, though.

I understand that this is a problem with creating a solver that can run in parallel. Can you help me out here?
I'm pretty sure that PyFoam is not the problem here: it just starts the OpenFOAM solver in parallel (which, according to your output, it obviously did) and then waits for the program's output. If you run your solver without PyFoam (mpirun -n 3 convergeFracWidthFoam -parallel) you'll see the same behaviour.

My guess is that it is the common DidASumCalculationAndDividedByItWhichFailsInParallelBecauseOnOneProcessorTheSumIsZeroAndIDidntDoAReduce bug. But it is hard to tell from your output because it comes from a Release version: before you do anything else, compile yourself a Debug version. Stack traces are much clearer (they even include the line numbers where the problem occurred) and a lot of common errors are uncovered by the bounds-checking.
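To make that bug class concrete, here is a toy single-process illustration (plain Python, not OpenFOAM code, with made-up helper names): each "rank" divides by its own local sum, and on the rank whose local sum happens to be zero the division blows up, which is exactly a floating point exception on just one process.

```python
def normalize_local(values):
    # Buggy in parallel: 'values' is only this rank's share of the field,
    # so sum(values) can be zero even though the global sum is not.
    s = sum(values)
    return [v / s for v in values]   # ZeroDivisionError ~ signal 8 on one rank

def normalize_global(rank_values):
    # Fixed: reduce the sum over all ranks first (what OpenFOAM's gSum /
    # an MPI reduce does), then divide everywhere by the same global value.
    total = sum(sum(r) for r in rank_values)
    return [[v / total for v in r] for r in rank_values]

# Three "ranks" after decomposition; rank 2's cells happen to be all zero.
ranks = [[1.0, 2.0], [3.0], [0.0]]
```

Calling normalize_local on rank 2's data raises ZeroDivisionError (the serial analogue of the FPE above), while normalize_global succeeds on every rank because all of them divide by the same global sum of 6.0.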
__________________
Note: I don't use "Friend"-feature on this forum out of principle. Ah. And by the way: I'm not on Facebook either. So don't be offended if I don't accept your invitation/friend request
gschaider is offline   Reply With Quote

Old   September 6, 2015, 17:38
Default similar error message
  #11
Member
 
Eric Bryant
Join Date: Sep 2013
Location: Texas
Posts: 44
Rep Power: 13
codder is on a distinguished road
Hi, I use PyFoam for parametric evaluation (it's an amazing tool that I'm only just figuring out).

So I have scripted a series of templated cases for my custom solver. Every time, the output throws a "hoping for the best" warning:

Code:
 PyFoam WARNING on line 144 of file /usr/local/lib/python2.7/dist-packages/PyFoam/Execution/ParallelExecution.py : which can not find a match for apfSolidFoam . Hoping for the best 
^C

 Interrupted by the Keyboard
Killing PID 5965
New case1
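That warning comes from PyFoam's `which`-style lookup: before launching, it tries to resolve the solver name on the PATH, and if the lookup fails it warns and launches anyway. A hypothetical sketch of that behaviour (check_solver is a made-up name, not PyFoam's actual API):

```python
import shutil

def check_solver(name):
    # shutil.which returns None when 'name' is not found on the PATH,
    # mirroring the `which` lookup that triggers PyFoam's warning.
    path = shutil.which(name)
    if path is None:
        print("PyFoam WARNING : which can not find a match for %s . "
              "Hoping for the best" % name)
    return path
```

The run can still succeed afterwards because mpirun does its own executable lookup, which is presumably why the warning is harmless in your case.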
But the solver executes correctly anyway. However (as you can see above), I just returned to the office to find that PyFoam had hung at the end of execution. My log looks like this:

Code:
--------------------------------------------------------------------------
A deprecated MCA parameter value was specified in the environment or
on the command line.  Deprecated MCA parameters should be avoided;
they may disappear in future releases.

Deprecated parameter: pls_rsh_agent
--------------------------------------------------------------------------
[namaste:05967] Warning: could not find environment variable "PYTHONPATH"
(the same deprecated-parameter warning is repeated once per MPI process)
/*---------------------------------------------------------------------------*\
| =========                 |                                                 |
| \\      /  F ield         | foam-extend: Open Source CFD                    |
|  \\    /   O peration     | Version:     3.1                                |
|   \\  /    A nd           | Web:         http://www.extend-project.de       |
|    \\/     M anipulation  |                                                 |
\*---------------------------------------------------------------------------*/
Build    : 3.1-f77b4801a214
Exec     : apfSolidFoam -case /home/eric/sharma/geomechanics/tutorials/apfSolidFoam/paper/caseTens0 -parallel
Date     : Sep 05 2015
Time     : 17:50:43
Host     : namaste
PID      : 5971
CtrlDict : /home/eric/foam/foam-extend-3.1/etc/controlDict
Case     : /home/eric/sharma/geomechanics/tutorials/apfSolidFoam/paper/caseTens0
nProcs   : 12
Slaves :
11
(
namaste.5972
namaste.5973
namaste.5974
namaste.5975
namaste.5976
namaste.5977
namaste.5978
namaste.5979
namaste.5980
namaste.5981
namaste.5982
)

Pstream initialized with:
floatTransfer     : 0
nProcsSimpleSum   : 0
commsType         : blocking
SigFpe   : Enabling floating point exception trapping (FOAM_SIGFPE).

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time
...


Code:
End

Finalising parallel run
[namaste:05971] *** Process received signal ***
[namaste:05971] Signal: Segmentation fault (11)
[namaste:05971] Signal code: Invalid permissions (2)
[namaste:05971] Failing at address: 0x7f9c851eae50
I think the seg fault causes PyFoam to hang.

So is there any option I can pass to PlotRunner to just continue on to the next case after that seg fault?

Because the seg fault is at the end of the run, I don't care about it. That would be hecka useful. (I was hoping this would be the default behaviour, but somehow it doesn't happen for me. Like the story of my life.)

Thanks, Eric

PS I use this config:

Code:
    # procnrs[i] and work (the cloned case directory) are defined earlier in the script
    print("Running solver")
    machine = LAMMachine(nr=procnrs[i])
    PlotRunner(args=["--proc=%d" % procnrs[i],
                     "--progress",
                     "--no-continuity",
                     "--hardcopy",
                     "--non-persist",
                     "apfSolidFoam",
                     "-case", work.name])
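Until someone confirms a PlotRunner option for this, a generic batch-driver pattern (not PyFoam's own API; run_case and run_all are hypothetical helper names) can record each case's exit status and move on even when a run dies with a signal at the very end:

```python
import subprocess

def run_case(case_dir, nproc=12, solver="apfSolidFoam"):
    # subprocess returncode is negative when the solver dies on a signal
    # (e.g. -11 for SIGSEGV), so any nonzero value marks a failure.
    cmd = ["mpirun", "-np", str(nproc), solver, "-case", case_dir, "-parallel"]
    return subprocess.call(cmd)

def run_all(cases, runner=run_case):
    # A failing case is logged but never aborts the batch.
    results = {}
    for case in cases:
        rc = runner(case)
        results[case] = rc
        if rc != 0:
            print("case %s exited with %d, continuing" % (case, rc))
    return results
```

The same idea should work around the hang in a PyFoam script: wrap each PlotRunner call in a try/except so that a crash at the end of case N does not stop case N+1.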
codder is offline   Reply With Quote
