Parallel computation using periodic boundary conditions

#1 | March 13, 2014, 18:28
AEMD (New Member, Joined Feb 2013, Posts: 7)
Hello,
I have been using SU2 with periodic boundary conditions and have run into a couple of problems. Can SU2 be used in parallel with periodic boundary conditions? A case that runs fine in serial produces an error in parallel when the restart file is written. It also seems that the partitioned grid did not keep the SEND_RECEIVE markers that handle the communication between the periodic boundaries, and the PERIODIC_INDEX= 1 and PERIODIC_INDEX= 2 sections are all set to 0. Could you tell me whether I missed a step for running such a case in parallel?
Also, the restart file written by SU2 skips the halo cells and all nodes from one of the periodic boundaries, which leads to an invalid Tecplot file: the grid structure (which includes the halo cells and both periodic boundaries) and the number of nodes (the original nodes minus those on one periodic boundary) do not match.
I have used SU2 to solve other problems and it has worked well otherwise. Thank you!
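
For context, periodic boundaries in SU2 are declared as marker pairs in the configuration file. A minimal sketch of such a setup follows; the marker names and the translation vector here are placeholders for illustration, not values from the actual case:

    % Periodic marker pair: ( periodic marker, donor marker,
    %   rotation center x,y,z, rotation angles x,y,z, translation x,y,z )
    MARKER_PERIODIC= ( periodic_1, periodic_2, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0 )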

#2 | March 17, 2014, 10:19
AEMD (New Member, Joined Feb 2013, Posts: 7)
After many tests, it appears that the problem comes from the SU2_DDC program, which overwrites the SEND_RECEIVE markers related to the periodic boundary conditions. Older versions of SU2, such as v2.0.3, correctly partition the grid with the periodic BC information, and partitions generated with SU2 v2.0.3 can then be used with later versions. However, when running a computation in parallel with SU2 v3.0 and a periodic BC, there is no problem on 2 cores, but on more cores the following error occurs:
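
For anyone reproducing this workaround, the sequence is roughly the following (a sketch only; the config file name is a placeholder, and the exact tool invocation may differ between SU2 versions):

    # Partition the mesh with the SU2_DDC executable from v2.0.3,
    # which preserves the periodic SEND_RECEIVE markers:
    SU2_DDC periodic_case.cfg

    # Run the flow solver from the newer installation on the
    # partitions produced above:
    mpirun -np 2 SU2_CFD periodic_case.cfg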

Fatal error in MPI_Sendrecv: Message truncated, error stack:
MPI_Sendrecv(230).................: MPI_Sendrecv(sbuf=0x10a5fd40, scount=2050, MPI_DOUBLE, dest=2, stag=0, rbuf=0x10a5ad30, rcount=2560, MPI_DOUBLE, src=5, rtag=0, MPI_COMM_WORLD, status=0x1) failed
MPIDI_CH3U_Receive_data_found(129): Message from rank 5 and tag 0 truncated; 21080 bytes received but buffer size is 20480
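
(For reference, this class of MPI error means the receive buffer posted by MPI_Sendrecv is smaller than the message actually delivered: here 21080 bytes, i.e. 2635 doubles, arrive in a buffer sized for 2560 doubles, which suggests the partitions disagree about the halo sizes. A minimal standalone reproduction of the failure mode, independent of SU2:)

    // truncate_demo.cpp -- build with "mpicxx truncate_demo.cpp" and
    // run with "mpirun -np 2 ./truncate_demo". Each rank sends more
    // doubles than its partner has posted room for, reproducing the
    // "Message truncated" fatal error shown above.
    #include <mpi.h>
    #include <vector>

    int main(int argc, char** argv) {
        MPI_Init(&argc, &argv);
        int rank;
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        int other = 1 - rank;  // assumes exactly 2 ranks

        std::vector<double> sbuf(2635, 1.0);  // 2635 * 8 = 21080 bytes
        std::vector<double> rbuf(2560, 0.0);  // 2560 * 8 = 20480 bytes

        // The incoming message (2635 doubles) exceeds the posted
        // receive count (2560), so MPI aborts with MPI_ERR_TRUNCATE.
        MPI_Status status;
        MPI_Sendrecv(sbuf.data(), 2635, MPI_DOUBLE, other, 0,
                     rbuf.data(), 2560, MPI_DOUBLE, other, 0,
                     MPI_COMM_WORLD, &status);

        MPI_Finalize();
        return 0;
    }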


Does anybody have an idea of how to solve this problem?
Thank you; it would be great if this feature were fixed in future versions, as it is a commonly used boundary condition.

#3 | March 27, 2014, 00:41
Thomas D. Economon (Super Moderator, Stanford, CA, Joined Jan 2013, Posts: 271)
Hi,

Thanks for the feedback on the periodic BCs. We are currently reconsidering the implementation of the periodic BCs as we begin tackling some issues related to a new high-performance strategy in the code.

In the meantime, if you are interested in modifying the code to alleviate the issue, please let us know, and we can point you in the right direction. The decomposition is controlled by the CDomainGeometry class within geometry_structure.cpp.
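
(To illustrate the kind of bookkeeping involved, here is a schematic sketch only, not actual SU2 code: during decomposition, both points of every periodic pair have to be recorded in the SEND_RECEIVE lists of the partitions that own them, otherwise the exchange buffers end up with mismatched sizes like the truncation error above.)

    // Schematic: distribute periodic point pairs into per-partition
    // send/receive lists so halo buffer sizes stay consistent.
    #include <cstdio>
    #include <map>
    #include <vector>

    struct PeriodicPair { long donor, receptor; };  // global point ids

    int main() {
        // Hypothetical inputs: owning partition of each global point,
        // and the pairs read from the PERIODIC_INDEX sections.
        std::vector<int> owner = {0, 0, 1, 1, 2, 2};
        std::vector<PeriodicPair> pairs = {{0, 4}, {1, 5}, {2, 3}};

        std::map<int, std::vector<long>> sendList, recvList;
        for (const PeriodicPair& pr : pairs) {
            sendList[owner[pr.donor]].push_back(pr.donor);       // sender side
            recvList[owner[pr.receptor]].push_back(pr.receptor); // receiver side
        }

        // Every pair contributes to both a send and a receive list, so
        // the buffer sizes agree across partitions; a dropped marker
        // breaks this invariant and yields the truncated message.
        for (const auto& [part, pts] : sendList)
            std::printf("partition %d sends %zu periodic points\n",
                        part, pts.size());
        for (const auto& [part, pts] : recvList)
            std::printf("partition %d receives %zu periodic points\n",
                        part, pts.size());
        return 0;
    }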

Cheers,
Tom

