|
A fatal problem with the immersed boundary method!!!
|
November 12, 2018, 05:29 |
A fatal problem with the immersed boundary method!!!
|
#1 |
New Member
kinakamichibun
Join Date: Nov 2018
Posts: 5
Rep Power: 8 |
When I tried to run the refiningMovingCylinderInChannelIco tutorial of foam-extend 4.1 in parallel, it first aborted because of two undefined keywords, numberOfSubdomains and method. After adding these two keywords it still aborts after several iterations with the following message:

Fatal error in MPI_Recv: Message truncated, error stack:
MPI_Recv(200).....................: MPI_Recv(buf=0x114f910, count=49, MPI_BYTE, src=0, tag=1, MPI_COMM_WORLD, status=0x7ffd2df53490) failed
MPIDI_CH3U_Receive_data_found(131): Message from rank 0 and tag 1 truncated; 51208 bytes received but buffer size is 49

The tutorial runs fine in serial. Now I don't know what to do with it. Can anybody give some advice? Thank you!
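For reference, the two keywords belong in system/decomposeParDict; a minimal sketch of what was added (the subdomain count and method here are only examples, not necessarily the tutorial's actual values):

Code:
// system/decomposeParDict (FoamFile header omitted)
numberOfSubdomains  4;

method              simple;

simpleCoeffs
{
    n       (2 2 1);   // 2 x 2 x 1 split of the domain
    delta   0.001;
}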
|
November 12, 2018, 06:38 |
|
#2 | |
Senior Member
Join Date: Dec 2017
Posts: 153
Rep Power: 8 |
Quote:
If you are not familiar with MPI, the error means that proc 0 is sending a message of a certain size, but for some reason the prescribed receiver is not able to get it correctly. This can be caused by an incorrect tag, an incorrect receiver id in the MPI_Recv call, or an incorrect memory allocation of the buffer array. Since the code comes from the extended version and the serial version runs well, there is a good chance you have hit a bug in the sources... Sorry, but it is difficult to be more helpful than this.
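To make the failure mode concrete, here is a minimal standalone sketch (nothing to do with the foam-extend sources, just an illustration): rank 0 sends a large message, but the receiver posts a buffer that is too small, so MPI aborts with the same "Message truncated" error as in the log above.

Code:
/* truncated_recv.c -- illustration only, not foam-extend code.
   Compile: mpicc truncated_recv.c -o truncated_recv
   Run:     mpirun -np 2 ./truncated_recv                            */
#include <mpi.h>
#include <string.h>

int main(int argc, char **argv)
{
    int rank;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    if (rank == 0)
    {
        /* rank 0 sends 51208 bytes ...                               */
        static char big[51208];
        memset(big, 0, sizeof(big));
        MPI_Send(big, sizeof(big), MPI_BYTE, 1, 1, MPI_COMM_WORLD);
    }
    else if (rank == 1)
    {
        /* ... but rank 1 only posts a 49-byte buffer, so the receive
           fails with "51208 bytes received but buffer size is 49",
           exactly like the error reported above.                     */
        char small[49];
        MPI_Recv(small, sizeof(small), MPI_BYTE, 0, 1, MPI_COMM_WORLD,
                 MPI_STATUS_IGNORE);
    }

    MPI_Finalize();
    return 0;
}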
||
November 12, 2018, 10:57 |
|
#3 |
Senior Member
Lucky
Join Date: Apr 2011
Location: Orlando, FL USA
Posts: 5,750
Rep Power: 66 |
This isn't a problem with the immersed boundary method but a how-to-run-OpenFOAM-in-parallel question. Posting the MPI error message alone is not very helpful; what we need is the actual OpenFOAM log from the host node (the one that looks like the output you would see in a serial run).
Are you able to run any OpenFOAM case in parallel? I suspect the answer is no. In decomposeParDict you have to specify numberOfSubdomains and method. Then you need to run decomposePar. After that you will find directories like processor0, processor1, and so on, matching the number you specified in numberOfSubdomains. Are you there yet? Then you need to launch OpenFOAM in parallel mode; how to do this depends on your environment. For example, in my case I have to invoke mpirun.
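Roughly like this, from the case directory (a sketch only; the process count must match numberOfSubdomains, and the solver name is a placeholder, use whatever the tutorial's Allrun script calls):

Code:
# decompose the case into processor0 .. processor3
decomposePar

# run the solver on 4 processes; replace <solver> with the tutorial's solver
mpirun -np 4 <solver> -parallel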
|
November 12, 2018, 10:59 |
|
#4 | |
Senior Member
Join Date: Dec 2017
Posts: 153
Rep Power: 8 |
Quote:
|
||
November 12, 2018, 22:40 |
|
#5 | |
New Member
kinakamichibun
Join Date: Nov 2018
Posts: 5
Rep Power: 8 |
Quote:
|
||
November 13, 2018, 05:32 |
|
#6 | |
Member
Join Date: Aug 2018
Posts: 77
Rep Power: 8 |
Quote:
well, if you rule out any input errors on your part, it is likely a bug in the code. Just work it out like you would any other bug... If I had a dollar for every time I ran into an MPI issue, I would be rich. Just dig in and get your hands dirty, or talk to support if you are not a programmer.
||
November 13, 2018, 11:05 |
|
#7 |
Senior Member
Lucky
Join Date: Apr 2011
Location: Orlando, FL USA
Posts: 5,750
Rep Power: 66 |
What even is foam-extend 4.1? Can you send me the GitHub link?
|
|
November 13, 2018, 11:48 |
|
#8 |
Senior Member
Anton Kidess
Join Date: May 2009
Location: Germany
Posts: 1,377
Rep Power: 30 |
[removed...]
|
|
|
|