|
OF-3.0.1 parallel unstable but works well in single processor |
|
January 16, 2017, 06:30 |
OF-3.0.1 parallel unstable but works well in single processor
#1 |
Member
Join Date: Feb 2014
Posts: 32
Rep Power: 12 |
Hi,
I posted in a different thread, but that thread's title (and OpenFOAM version) were not informative, so I decided to open a new one. The domain is a simple box and I use a hierarchical decomposition onto 32 processors. I use my own solver, based on the buoyant Boussinesq solver. The simulation runs for a while with an adaptive time step; the average time step is approximately 0.23 s. At ~50 s the parallel run breaks down: the Courant number jumps to 1e11, and so does the pressure. Once it has broken I also tried running in parallel with a small fixed time step (1e-3 s), but to no avail. If I reconstruct the case and run it in serial, it passes that point. If I then decompose again and continue in parallel, it runs for a while before breaking again (it reaches ~70 s). I have tried increasing the diffusion and changing the decomposition, but that did not solve the problem (it just moved it to different times). I can post the configuration files (and the solver) if it helps.
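For reference, the decomposition setup looks roughly like the sketch below (a minimal system/decomposeParDict; the 4 4 2 split and the delta value are example numbers, not necessarily the exact settings of this case):

Code:
numberOfSubdomains  32;

method              hierarchical;

hierarchicalCoeffs
{
    n               (4 4 2);   // example split giving 4*4*2 = 32 subdomains
    delta           0.001;     // cell skew factor (default value)
    order           xyz;       // order in which the directions are split
}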
|
January 24, 2017, 04:48 |
#2 |
Member
Jean bon
Join Date: Sep 2016
Posts: 43
Rep Power: 10 |
Hello,
I have the same problem (I think) with OpenFOAM-dev and OpenFOAM-4.0, and I believe it is related to the decomposition: the default decomposition does not seem very robust. On a small case, the calculation was actually slower with more processors (which can happen when the mesh is small). I changed the decomposition method to scotch and it then ran very well, faster than on one processor, so the behaviour does depend on the decomposition method set in decomposeParDict. Now I have the same problem on a bigger mesh even with the scotch method: it is faster on one processor than on 16. Try changing the method; the problem could come from there.
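In case it helps, switching to scotch only needs a couple of entries in system/decomposeParDict (a minimal sketch; the subdomain count is just an example):

Code:
numberOfSubdomains  16;

method              scotch;   // scotch needs no coeffs sub-dictionary for basic use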
|
January 24, 2017, 11:06 |
#3 |
Member
Join Date: Feb 2014
Posts: 32
Rep Power: 12 |
Hi,
Thanks, yes, that is a good idea. My investigations so far have led me to think that it is the mesh quality, but it could be the mesh decomposition as well. When I refine my mesh and limit Co to 0.05 (with the adaptive time step), things work, but very slowly.
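For completeness, the Courant limit and the adaptive time step I mean are set in system/controlDict; a minimal sketch (the maxDeltaT value is only an assumed example):

Code:
adjustTimeStep  yes;      // let the solver adapt deltaT to the Courant limit
maxCo           0.05;     // the limit that keeps the run stable here
maxDeltaT       1;        // assumed upper bound on the time step (example value)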
|
January 24, 2017, 12:12 |
#4 |
Member
Jean bon
Join Date: Sep 2016
Posts: 43
Rep Power: 10 |
Yes, perhaps it is both: the decomposition may be bad because of the mesh quality. But in my case I already have 700k cells in a structured grid, so I do not think the mesh quality is the issue (at worst it could be the refinement, but I will see).
|
|
January 25, 2017, 02:39 |
#5 |
Member
Join Date: Feb 2014
Posts: 32
Rep Power: 12 |
Did you get this behaviour with a native OF solver (which one)?
Did you use swak4foam?
|
January 25, 2017, 03:49 |
#6 |
Member
Jean bon
Join Date: Sep 2016
Posts: 43
Rep Power: 10 |
Yes, it is with fireFoam. And no, I do not use swak4foam.
|
|
January 25, 2017, 10:31 |
#7 |
Member
Jean bon
Join Date: Sep 2016
Posts: 43
Rep Power: 10 |
See this thread: Parallel Performance of Large Case
Do you use the GAMG solver? I will try to run my case without this solver and see the results.
EDIT: I changed the solver but it is the same, still very slow in parallel. I have also tried a lot of decomposition methods and that does not work either.
Last edited by FlyingCat; January 25, 2017 at 12:26.
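For anyone comparing pressure solvers, the relevant entries live in system/fvSolution. Below is a rough sketch of a GAMG setup and a PCG alternative for p_rgh (the smoother, preconditioner and tolerances are typical example values, not necessarily those of the original case, and the two entries are alternatives, not used together):

Code:
p_rgh
{
    solver          GAMG;
    smoother        GaussSeidel;
    tolerance       1e-7;
    relTol          0.01;
}

// alternative tried instead of GAMG
p_rgh
{
    solver          PCG;
    preconditioner  DIC;
    tolerance       1e-7;
    relTol          0.01;
}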
|
|
|