February 8, 2011, 13:00 |
new Solver won't run parallel
#1 |
Senior Member
Christian Lucas
Join Date: Aug 2009
Location: Braunschweig, Germany
Posts: 202
Rep Power: 18 |
Hi,
I have modified rhoPisoFoam a bit, e.g. changed the energy equation (see link below) so that the variable htot is used instead of h: http://www.cfd-online.com/Forums/ope...-equation.html

The solver runs fine when I use one processor with one core, but when I run a case in parallel, the simulation crashes after the htot iteration with the following error:

[morgoth:15364] *** on communicator MPI_COMM_WORLD
[morgoth:15364] *** MPI_ERR_TRUNCATE: message truncated
[morgoth:15364] *** MPI_ERRORS_ARE_FATAL (your MPI job will now abort)
--------------------------------------------------------------------------
mpirun has exited due to process rank 1 with PID 15364 on node morgoth exiting without calling "finalize". This may have caused other processes in the application to be terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------

Do I have to make any changes to OpenMPI or OpenFOAM to make the new solver run in parallel?

Thanks for the help,
Christian
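For context, a rough sketch of what such an htot equation can look like in an hEqn.H-style include is shown below. This is only an illustration, not necessarily the exact modification from the linked thread; it assumes htot is declared in createFields.H as total enthalpy (htot = h + 0.5*magSqr(U)) and that the usual rhoPisoFoam objects rho, U, phi, p, thermo and turbulence exist.

Code:
// Sketch only -- not necessarily the modification from the linked thread.
// Assumes createFields.H declares volScalarField htot and the standard
// rhoPisoFoam objects rho, U, phi, p, thermo and turbulence.
{
    // kinetic energy per unit mass
    volScalarField K("K", 0.5*magSqr(U));

    fvScalarMatrix htotEqn
    (
        fvm::ddt(rho, htot)
      + fvm::div(phi, htot)
      - fvm::laplacian(turbulence->alphaEff(), htot)
     ==
        fvc::ddt(p)            // pressure-work term for total enthalpy
    );

    htotEqn.relax();
    htotEqn.solve();

    // recover static enthalpy and update the thermodynamic state
    thermo.h() = htot - K;
    thermo.correct();
}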
February 8, 2011, 19:26 |
#2 |
Retired Super Moderator
Bruno Santos
Join Date: Mar 2009
Location: Lisbon, Portugal
Posts: 10,982
Blog Entries: 45
Rep Power: 128 |
Greetings Christian,
Note to other readers: if you know better than what I wrote, please also respond! I'm not completely familiar with the way OpenFOAM handles MPI, but I believe that it should be as easy as:
Best regards,
Bruno
April 16, 2011, 11:29 |
#3 |
Senior Member
Fabian Braennstroem
Join Date: Mar 2009
Posts: 407
Rep Power: 19 |
Hello,
were you able to fix this problem? I sometimes get this error message for standard solvers as well, but do not know the reason for it. I do not think that you need to update some list of fields...

Regards,
Fabian

Btw, off-topic: the other problem I had with the NFS stales is fixed now; the RAID controller was the bad guy.
April 18, 2011, 03:49 |
#4 |
Senior Member
Christian Lucas
Join Date: Aug 2009
Location: Braunschweig, Germany
Posts: 202
Rep Power: 18 |
Hi,
if I remember correctly, the problem was related to the call "correctBoundaryConditions()" that I used on a dummy field in my solver.

Regards,
Christian
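The original dummy-field code was not posted, but the kind of construct involved might look roughly like the sketch below (the field name hDummy is made up for illustration). The relevant point for parallel runs is that correctBoundaryConditions() also evaluates processor patches, which exchange data over MPI; if the call is reached on some processors but not on others, the messages can get out of step and OpenMPI may abort with an error such as MPI_ERR_TRUNCATE.

Code:
// Sketch only -- not the original poster's code; "hDummy" is a made-up name.
// A scratch copy of an existing field, e.g. for storing intermediate values:
volScalarField hDummy
(
    IOobject
    (
        "hDummy",
        runTime.timeName(),
        mesh,
        IOobject::NO_READ,
        IOobject::NO_WRITE
    ),
    htot                      // initialise from an existing field
);

// Evaluating the boundary conditions also evaluates processor patches,
// i.e. it triggers inter-processor communication. Make sure every
// processor executes this call the same number of times (never inside
// a branch that only some ranks take), otherwise the MPI exchange can
// get out of sync.
hDummy.correctBoundaryConditions();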
January 10, 2012, 11:30 |
#5 |
Senior Member
|
Quote:
You mentioned a dummy field in your solver, and at the moment I am looking for some kind of a dummy field or a dummy function myself. Can you share the code for that specific field with me? Maybe it already is the solution to my problems...

Cheers,
Bernhard
Similar Threads
Thread | Thread Starter | Forum | Replies | Last Post |
"Define_Profile" UDF for parallel solver | Antoine | Fluent UDF and Scheme Programming | 9 | February 29, 2016 07:09 |
RBF motion solver does not work well in parallel | lakeat | OpenFOAM Bugs | 3 | August 8, 2013 06:50 |
UDF problem with Parallel Solver | manu | FLUENT | 0 | January 24, 2008 15:31 |
Run in parallel a 2mesh case | cosimobianchini | OpenFOAM Running, Solving & CFD | 2 | January 11, 2007 07:33 |
MPICH Parallel Run Error on the Windows Server2003 | Saturn | CFX | 3 | August 29, 2006 09:42 |