OpenFoam crash with dynamicMesh, cyclicAMI and MPI DAPL UD |
March 13, 2018, 11:15
OpenFoam crash with dynamicMesh, cyclicAMI and MPI DAPL UD
#1
New Member
Aaron Endres
Join Date: Jun 2016
Posts: 13
Rep Power: 10
Dear OpenFOAM users,
I have noticed a problem with the annularThermalMixer tutorial for rhoPimpleDyMFoam in newer OpenFOAM versions (3.0 to 5.0 and also v1606+ and v1706+, OpenFOAM 2.3.1 worked fine). When running rhoPimpleDyMFoam on 2 Haswell nodes (56 cores) with MPI DAPL enabled, the solver crashes. Usually this happens during the update of the cyclic AMI boundary condition. The error message is related to the communication between the nodes, as it does not appear when running on just one node with the same decomposition. The error occurrence is independent of the composition method (scotch or hierarchical). It is always something related to IO operations: [34] --> FOAM FATAL IO ERROR: [34] error in IOstream "IOstream" for operation operator>>(Istream&, List<T>&) : reading first token [34] file: IOstream at line 0. [34] From function void Foam::IOstream::fatalCheck(const char*) const [34] in file db/IOstreams/IOstreams/IOstream.C at line 109. FOAM parallel run exiting or [29] --> FOAM FATAL IO ERROR: [29] Expected a ')' or a '}' while reading List, found on line 0 the label 0 [29] file: IOstream at line 0. [29] From function char Foam::Istream::readEndList(const char*) [29] in file db/IOstreams/IOstreams/Istream.C at line 155. FOAM parallel run exiting With these MPI settings, the error occurs: export I_MPI_DAPL_UD=enableWhen switching to ofa instead of dapl, no errors occur: unset I_MPI_DAPL_UDHas anyone of you experienced similar problems? Do you think this is an OpenFOAM bug or do you think this has something to do with the MPI setup of the cluster I am running OpenFOAM on? The strange thing is that everything without dynamic mesh and cyclicAMI works fine with dapl. Thanks in advance for your help! Aaron |
February 12, 2019, 05:41
#2
New Member
Join Date: Jun 2018
Posts: 11
Rep Power: 8
Hi aendres,
I was wondering if you had solved your issue? I am currently trying to implement a dynamic mesh for dsmcFoamPlus and get the same error.
February 12, 2019, 07:34
#3
New Member
Aaron Endres
Join Date: Jun 2016
Posts: 13
Rep Power: 10
Hi xshmuel,
I still don't know what exactly causes the errors, but by changing the MPI settings as described above I was able to run the code without errors...
February 24, 2019, 23:03
#4
Member
Dongxu Wang
Join Date: Sep 2018
Location: China
Posts: 33
Rep Power: 8
Quote:
I am facing a very similar problem, which confuses me a lot. My program runs in serial mode, but it crashes when run in parallel. I want to try your settings, but I don't know how: where should I modify these settings? Here is my thread about the problem: "Why my program cannot ran in parallel???" Thank you very much!
February 25, 2019, 03:48
#5
New Member
Aaron Endres
Join Date: Jun 2016
Posts: 13
Rep Power: 10
Hi Dongxu,
the MPI settings are modified with simple bash commands. Just copy the 'unset' and 'export' commands quoted above into the bash script you are running on the cluster, or into the terminal from which you are starting OpenFOAM. You could also add them to the etc/bashrc file in your OpenFOAM source directory after making sure that they fix your problem.
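In a cluster job script this could look as follows. This is only a minimal sketch: I_MPI_FABRICS=shm:ofa is one common way to select the ofa fabric with Intel MPI, but that exact line is an assumption here, so check your cluster's documentation; the source path and core count are illustrative.

Code:
#!/bin/bash
# Switch from DAPL UD to the ofa fabric before launching the solver
unset I_MPI_DAPL_UD
export I_MPI_FABRICS=shm:ofa             # assumption: adjust to your cluster's fabric setup

source /path/to/OpenFOAM-5.x/etc/bashrc  # illustrative install path
mpirun -np 56 rhoPimpleDyMFoam -parallel > log.rhoPimpleDyMFoam 2>&1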
February 25, 2019, 06:18
#6
Member
Dongxu Wang
Join Date: Sep 2018
Location: China
Posts: 33
Rep Power: 8
Quote:
Thank you for your reply! I've tried the commands in my terminal, but they don't help. I think this may be a bug or something similar, because my program runs fine in serial mode. The critical question is how to end the PIMPLE outer loop: once my newly added convergence criterion is used, the problem occurs. So I think I should pay more attention to the residual control of the PIMPLE algorithm. GL
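For reference, the stock outer-loop termination is configured through the residualControl subdictionary of PIMPLE in system/fvSolution; a minimal sketch with placeholder field names and tolerances (adjust to your case):

Code:
PIMPLE
{
    nOuterCorrectors    50;     // upper bound; the loop exits earlier on convergence
    nCorrectors         2;
    nNonOrthogonalCorrectors 1;

    // Outer-loop convergence criteria (placeholder values)
    residualControl
    {
        p
        {
            tolerance   1e-4;   // absolute residual target
            relTol      0;      // 0 disables the relative criterion
        }
        U
        {
            tolerance   1e-5;
            relTol      0;
        }
    }
}

With this, the outer loop exits as soon as all listed residuals drop below their tolerances, or after nOuterCorrectors iterations, without any custom criterion in the solver code.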
Tags |
cyclicami, dapl, dynamic mesh, openfoam 5.x