July 7, 2010, 06:19 | #1
Problem with mpirun with OpenFOAM
Senior Member
Jie
Join Date: Jan 2010
Location: Australia
Posts: 134
Rep Power: 16
Recently I have been having a strange problem with mpirun in OpenFOAM 1.6. I am running a 3D LES of flow over a cylinder, with 17 nodes (16 cells) in the spanwise (z) direction.
With a very coarse mesh (7,500 cells per plane) in the xy plane, I can decompose the domain into 8 portions in the z-direction and run without any problem. With a fine mesh (120,000 cells per plane) in the xy plane, also decomposed into 8 portions in the z-direction, running mpirun -np 8 pisoFoam -parallel fails with the following:
Code:
// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time

Create mesh for time = 0

[shuang:8030] *** An error occurred in MPI_Bsend
[shuang:8030] *** on communicator MPI_COMM_WORLD
[shuang:8030] *** MPI_ERR_BUFFER: invalid buffer pointer
[shuang:8030] *** MPI_ERRORS_ARE_FATAL (your MPI job will now abort)
--------------------------------------------------------------------------
mpirun has exited due to process rank 0 with PID 8030 on
node shuang exiting without calling "finalize". This may
have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
[shuang:08029] 1 more process has sent help message help-mpi-errors.txt / mpi_errors_are_fatal
[shuang:08029] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
If I instead decompose the domain into 2 portions along each of the x, y and z directions (2 2 2), it runs without this problem. If I want to run the simulation with 8 portions along the z-direction, what should I fix? Has anyone else encountered this problem?
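For reference, a minimal sketch of the system/decomposeParDict for the failing case described above. The post does not show this file, so the method and coefficients here are assumptions based on the description (FoamFile header omitted for brevity):
Code:
// system/decomposeParDict -- hedged sketch, not the poster's actual file

numberOfSubdomains 8;

method          simple;

simpleCoeffs
{
    n               (1 1 8);   // 8 portions along z (the failing case);
                               // the working case would use (2 2 2)
    delta           0.001;
}
After editing this file, the usual workflow is decomposePar followed by mpirun -np 8 pisoFoam -parallel.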
July 7, 2010, 06:37 | #2
New Member
Sebastian Bartscher
Join Date: Jul 2010
Location: Münster, Germany
Posts: 6
Rep Power: 16
Hi,
I have the same problem. I'm running a simpleFoam simulation on an 8-core workstation, with the mesh decomposed into 8 parts in the x direction. The error doesn't appear every time. I read in another forum that there was (or is) a bug in Open MPI that should be fixed, something about restarting from checkpoints. It would be great if someone could help.
July 7, 2010, 10:17 | #3
Member
You need to raise the MPI buffer size. OpenFOAM reads it from the MPI_BUFFER_SIZE environment variable, so this can be done by running something like:
Code:
export MPI_BUFFER_SIZE=150000000
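A quick sketch of how that fits into a run, assuming a bash shell; the variable must be exported before mpirun is launched so every rank inherits it, and 150000000 is just the example value from above:
Code:
# raise the attached MPI_Bsend buffer, then launch the parallel run
export MPI_BUFFER_SIZE=150000000
mpirun -np 8 pisoFoam -parallel
To make the change permanent, the same export can go in your shell startup file; in OpenFOAM 1.6 the default is set where MPI_BUFFER_SIZE is exported in OpenFOAM's etc/settings.sh, which can be edited instead.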
July 7, 2010, 20:30 | #4
Senior Member
Jie
Join Date: Jan 2010
Location: Australia
Posts: 134
Rep Power: 16
It works well after I increased the buffer size.