[ImmersedBoundary] Immersed Boundary Method: Error Occurs in parallelization of icoIbFoam |
October 29, 2016, 09:52
Immersed Boundary Method: Error Occurs in parallelization of icoIbFoam
#1
New Member
Di Wu
Join Date: May 2016
Location: Australia
Posts: 5
Rep Power: 10
Hi Foamers:
As you know, the immersed boundary method has been implemented in OpenFOAM-extend; I am currently using foam-extend-4.0. I tried to run the cylinderInChannelFineIcoIbFoam simulation from the tutorial folder, and it runs fine in serial. I then tested it in parallel. Following the description in the "Allrun" file, I ran:

1. blockMesh
2. cp save/boundary constant/polyMesh/
3. mkdir 0
4. cp 0_org/* 0/
5. decomposePar (I modified the decomposeParDict file in the system folder to specify 4 processors)
6. mpirun -np 4 potentialIbFoam -parallel

Step 6 fails with the following output:

Create time
Create mesh for time = 0
SIMPLE: no convergence criteria found. Calculations will run for 50 steps.
Create immersed boundary cell mask
Create immersed boundary face mask
Found immersed boundary patch 0 named ibCylinder
[3] Number of IB cells: 0
External flow
[0] Number of IB cells: 72
[1] Number of IB cells: 0
[2] Number of IB cells: 72
Reading field p
Reading field U
Calculating potential flow
[wudi-HOME:19883] *** An error occurred in MPI_Recv
[wudi-HOME:19883] *** reported by process [3012755457,0]
[wudi-HOME:19883] *** on communicator MPI_COMM_WORLD
[wudi-HOME:19883] *** MPI_ERR_TRUNCATE: message truncated
[wudi-HOME:19883] *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
[wudi-HOME:19883] *** and potentially your MPI job)

I searched online and found that someone reported the same error back in 2014: https://sourceforge.net/p/openfoam-e...ndrelease/260/

I have no idea whether this error or bug has been solved. Please advise! Thank you in advance.
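For reference, the decomposeParDict change in step 5 amounts to something like the sketch below. Only numberOfSubdomains 4 is what I actually set; the simple method and its coefficients are illustrative placeholders, not necessarily what the tutorial ships with:

// system/decomposeParDict -- minimal sketch; only numberOfSubdomains
// is essential here, the method and coefficients are illustrative
FoamFile
{
    version     2.0;
    format      ascii;
    class       dictionary;
    object      decomposeParDict;
}

numberOfSubdomains  4;

method              simple;

simpleCoeffs
{
    n           (2 2 1);   // 2 x 2 x 1 split gives 4 subdomains
    delta       0.001;
}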
October 29, 2016, 13:09
Update
#2
New Member
Di Wu
Join Date: May 2016
Location: Australia
Posts: 5
Rep Power: 10
Hi,
It's me again. After a few hours of investigation, I am posting an update before going to sleep. The error message is caused by the MPI communication type: specifically, commsType should be set to nonBlocking in $WM_PROJECT_DIR/etc/controlDict (the default is blocking). However, this method only applies to versions before foam-extend-3.2. From foam-extend-3.2 on, $WM_PROJECT_DIR/etc/controlDict has been removed and $WM_PROJECT_DIR/etc/controlDict-SAMPLE added in its place, and simply renaming controlDict-SAMPLE to controlDict does not work. The problem is now clearer: the key is to change commsType in OptimisationSwitches to nonBlocking. I will post the solution later; if everything fails, I may consider downgrading to foam-extend-3.1.
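For anyone following along, the entry in question looks like this (a sketch of just the relevant subdictionary of $WM_PROJECT_DIR/etc/controlDict; the other switches in that file are omitted):

// $WM_PROJECT_DIR/etc/controlDict -- relevant subdictionary only
OptimisationSwitches
{
    // default is blocking; the alternatives are scheduled and nonBlocking
    commsType       nonBlocking;
}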
October 31, 2016, 00:48
Still confused
#3
New Member
Di Wu
Join Date: May 2016
Location: Australia
Posts: 5
Rep Power: 10
Still confused. I have re-compiled the whole library with commsType set to "nonBlocking", but the problem still exists.

My MPI version is OpenMPI 1.10.2, and parallel computation works fine with standard solvers such as simpleFoam and icoFoam; it only fails for the solvers in the "ImmersedBoundary" folder. I hope someone can give me a hint. Thanks in advance.

Best regards

Below is the error message:

[wudi-HOME:19883] *** An error occurred in MPI_Recv
[wudi-HOME:19883] *** reported by process [3012755457,0]
[wudi-HOME:19883] *** on communicator MPI_COMM_WORLD
[wudi-HOME:19883] *** MPI_ERR_TRUNCATE: message truncated
[wudi-HOME:19883] *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
[wudi-HOME:19883] *** and potentially your MPI job)
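For completeness, the sanity check that tells me MPI itself is fine went roughly like this (paths and case names are illustrative; any standard tutorial case with a decomposeParDict added will do):

# a standard solver runs in parallel without problems...
cd $FOAM_TUTORIALS/incompressible/icoFoam/cavity
decomposePar
mpirun -np 4 icoFoam -parallel            # completes normally

# ...while the immersed boundary tutorial fails at the same step
cd cylinderInChannelFineIcoIbFoam
mpirun -np 4 potentialIbFoam -parallel    # MPI_ERR_TRUNCATE as above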
October 31, 2016, 03:20
#4
New Member
Di Wu
Join Date: May 2016
Location: Australia
Posts: 5
Rep Power: 10
Got it working in parallel eventually!
Will update later. Thanks for watching.
January 24, 2017, 09:55
#5
New Member
Join Date: Oct 2016
Posts: 4
Rep Power: 10
Hi wudi,
I have the same problem... how did you fix that?
January 25, 2017, 04:57
#6
New Member
Di Wu
Join Date: May 2016
Location: Australia
Posts: 5
Rep Power: 10
Hi GFarello,
It has been a few months since I posted this thread. As far as I can remember, potentialIbFoam does NOT support parallel computing, so you have to run it sequentially. Once you have done that, modify the initial conditions in the 0 folder accordingly, and then you can run icoIbFoam in parallel. A sketch of the whole sequence is below.

Regards
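Reconstructed from the tutorial's Allrun and the steps discussed above, the sequence that worked goes roughly like this (treat it as a sketch rather than the exact script):

blockMesh
cp save/boundary constant/polyMesh/
mkdir 0
cp 0_org/* 0/

# initialise the fields in serial -- potentialIbFoam does not run in parallel
potentialIbFoam

# decompose the case, including the updated 0 fields, then run the
# transient solver in parallel
decomposePar
mpirun -np 4 icoIbFoam -parallel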
Tags
ibm, immersed boundary method, openfoam extend |