|
July 22, 2016, 05:39 |
Error running in parallel
|
#1 |
New Member
K.H
Join Date: Feb 2016
Posts: 15
Rep Power: 10 |
Hi foamers
I am running a DNS of turbulent channel flow on block-structured meshes. When I run the case in parallel on only 4 processors everything works fine, but as soon as I increase the number of processors with decomposePar I get the following error message:

Code:
/*---------------------------------------------------------------------------*\
| =========                 |                                                 |
| \\      /  F ield         | foam-extend: Open Source CFD                    |
|  \\    /   O peration     | Version:     3.2                                |
|   \\  /    A nd           | Web:         http://www.foam-extend.org         |
|    \\/     M anipulation  | For copyright notice see file Copyright         |
\*---------------------------------------------------------------------------*/
Build    : 3.2-334ba0562a2c
Exec     : PFoam -case ./ -parallel
Date     : Jul 22 2016
Time     : 10:10:34
Host     : knoten-13
PID      : 7408
CtrlDict : "/home/studenten/stud-konhat/RZZN/Betrieb_Retau_180/Y+04_15/system/controlDict"
Case     : /home/studenten/stud-konhat/RZZN/Betrieb_Retau_180/Y+04_15
nProcs   : 15
Slaves   : 14
(
knoten-13.7409
knoten-13.7410
knoten-13.7411
knoten-13.7412
knoten-13.7413
knoten-13.7414
knoten-13.7415
knoten-13.7416
knoten-13.7417
knoten-13.7418
knoten-13.7419
knoten-13.7420
knoten-13.7421
knoten-13.7422
)

Pstream initialized with:
    nProcsSimpleSum : 16
    commsType       : blocking
SigFpe : Enabling floating point exception trapping (FOAM_SIGFPE).

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time

Create mesh for time = 0.002

[knoten-13:7419] *** An error occurred in MPI_Bsend
[knoten-13:7419] *** on communicator MPI_COMM_WORLD
[knoten-13:7419] *** MPI_ERR_BUFFER: invalid buffer pointer
[knoten-13:7419] *** MPI_ERRORS_ARE_FATAL: your MPI job will now abort
--------------------------------------------------------------------------
mpirun has exited due to process rank 11 with PID 7419 on
node knoten-13 exiting improperly. There are two reasons this could occur:

1. this process did not call "init" before exiting, but others in
the job did. This can cause a job to hang indefinitely while it waits
for all processes to call "init". By rule, if one process calls "init",
then ALL processes must call "init" prior to termination.

2. this process called "init", but exited without calling "finalize".
By rule, all processes that call "init" MUST call "finalize" prior to
exiting or it will be considered an "abnormal termination"

This may have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
[knoten-13:07407] 2 more processes have sent help message help-mpi-errors.txt / mpi_errors_are_fatal
[knoten-13:07407] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages

Did anyone have the same trouble? |
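For context, here is a rough sketch of the decompose-and-run sequence behind the log above; the subdomain count of 15 and the solver name PFoam are taken from the log, everything else is an assumption about the setup.

Code:
# decompose according to system/decomposeParDict (numberOfSubdomains 15 in this run)
decomposePar -case .

# launch the solver shown in the log on all subdomains
mpirun -np 15 PFoam -case . -parallel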
|
July 24, 2016, 15:21 |
|
#2 |
Senior Member
Join Date: Sep 2010
Posts: 226
Rep Power: 17 |
Hi,
What happens if you change the domain decomposition method? Did you try that?

Regards,
T.D. |
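For anyone comparing notes, a minimal sketch of what that change could look like in system/decomposeParDict; the method and coefficients are illustrative assumptions, not taken from the poster's case, and the usual FoamFile header is omitted.

Code:
// system/decomposeParDict (illustrative values only)
numberOfSubdomains  15;

method              scotch;   // e.g. scotch or metis instead of simple/hierarchical

// coefficients are only needed for methods that require them, e.g. for simple:
// simpleCoeffs
// {
//     n       (3 5 1);
//     delta   0.001;
// }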
|
July 25, 2016, 06:54 |
|
#3 |
New Member
K.H
Join Date: Feb 2016
Posts: 15
Rep Power: 10 |
Thanks for your reply. I tried changing the domain decomposition method, but the same error still appears. However, I managed to avoid the error message

Code:
[knoten-13:7419] *** An error occurred in MPI_Bsend
[knoten-13:7419] *** on communicator MPI_COMM_WORLD
[knoten-13:7419] *** MPI_ERR_BUFFER: invalid buffer pointer
[knoten-13:7419] *** MPI_ERRORS_ARE_FATAL: your MPI job will now abort

by increasing the MPI buffer size. |
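For others finding this thread: a minimal sketch of that buffer-size workaround, assuming an Open MPI build of foam-extend where Pstream attaches the MPI_Bsend buffer sized by the MPI_BUFFER_SIZE environment variable; the value shown is only an example, not the one used in the original case.

Code:
# raise the buffer used for MPI_Bsend before launching the run
export MPI_BUFFER_SIZE=200000000      # bytes; example value only
mpirun -np 15 PFoam -case . -parallel

# alternatively, increase minBufferSize in $WM_PROJECT_DIR/etc/settings.sh
# and re-source the foam-extend environment before launching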
|
July 25, 2016, 07:42 |
|
#4 |
Senior Member
Derek Mitchell
Join Date: Mar 2014
Location: UK, Reading
Posts: 172
Rep Power: 13 |
Has this ever worked?
When did it stop working? What changed between then and now? Have you tried a simple parallel tutorial case?
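If it helps, a hedged sketch of such a sanity check, assuming the standard damBreak tutorial is present in your installation (tutorial paths can differ between foam-extend versions):

Code:
# copy a standard tutorial and run it in parallel as an MPI sanity check
cp -r $FOAM_TUTORIALS/multiphase/interFoam/laminar/damBreak /tmp/damBreakTest
cd /tmp/damBreakTest
blockMesh
setFields
decomposePar            # the tutorial's decomposeParDict typically uses 4 subdomains
mpirun -np 4 interFoam -parallel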
__________________
A CHEERING BAND OF FRIENDLY ELVES CARRY THE CONQUERING ADVENTURER OFF INTO THE SUNSET |
|
|
|