FE-3.1 Parallel issues with BCs (gMax, reduce...) |
July 24, 2014, 16:40 | #1
Senior Member
Pablo Higuera
Join Date: Jan 2011
Location: Auckland
Posts: 627
Rep Power: 19
Dear all,
I have submitted a new bug report in FOAM-extend: https://sourceforge.net/p/openfoam-e...ndrelease/251/

After releasing IHFOAM I realized that there is some trouble when running in parallel: MPI appears to run out of buffer memory (MPI_ERR_TRUNCATE: message truncated). This is not actually the case, as MPI_BUFFER_SIZE is 20000000, the same value used with other versions in which no problem arises. I narrowed the problem down and found that it is triggered when calling the parallel functions gMax, gMin, reduce... However, the problem is not always present; it depends on which field the boundary condition is applied to. The same BC applied to pressure works, but applied to alpha1 it crashes.

I have created a small example that reproduces the problem:
- customZeroGrad: a custom zero-gradient BC with some parallel operations and Info statements. (Sorry for the large number of defined-but-unused variables; I created the BC from the wave generation one.)
- cavity_new: a very simple case derived from the cavity tutorial, ready to run with icoFoam and interFoam.

The way to reproduce is as follows:
- Compile customZeroGrad:
Code:
cd customZeroGrad
./localMake
- Run the cases:
Code:
cd cavity_new
./runParallelIcoFoam
./cleanCase
./runParallelInterFoam_OK
./cleanCase
./runParallelInterFoam_FAIL
The first case (icoFoam) runs OK. The second case is also OK; there the BC is applied to pd. The third case fails; there the BC is applied to alpha1. The error message is:
Code:
[user:PID] *** An error occurred in MPI_Recv
[user:PID] *** on communicator MPI_COMM_WORLD
[user:PID] *** MPI_ERR_TRUNCATE: message truncated
[user:PID] *** MPI_ERRORS_ARE_FATAL: your MPI job will now abort
----------------------------------------------------------
First, when the case does run, gMin and gMax do not work as expected. The minimum and maximum x-coordinates of the patch are calculated in two different ways. With a loop and a reduce statement the result is right (minX = 0.1 and maxX = 0.2). With gMin and gMax a warning appears:
Code:
--> FOAM Warning :
    From function max(const UList<Type>&)
    in file ~/foam/foam-extend-3.1/src/foam/lnInclude/FieldFunctions.C at line 321
    empty field, returning zero
and the result is wrong (the minimum should be 0.1):
Code:
GM Procesador 1: xMin 0, xMax 0.2
GM Procesador 0: xMin 0, xMax 0.2
GM Procesador 2: xMin 0, xMax 0.2
GM Procesador 3: xMin 0, xMax 0.2
----------------------------------------------------------
The second bug is the crash when the parallel functions are called for the alpha1 field, even though they have proven to work elsewhere...

Best regards,
Pablo
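P.S. For reference, here is a minimal sketch of the two approaches described above. This is an illustrative reconstruction, not the actual customZeroGrad code; it assumes the snippet sits inside a member function of an fvPatchField-derived BC, with patch().Cf() giving the patch face centres:
Code:
// Illustrative sketch only -- assumed to live inside e.g. updateCoeffs() of a
// custom fvPatchScalarField; not the code attached to the bug report.

const vectorField& faceCentres = this->patch().Cf();

// 1) Explicit loop + reduce: each processor starts from values that can never
//    win the reduction, so a processor owning no faces of this patch cannot
//    corrupt the global result
scalar xMin = GREAT;
scalar xMax = -GREAT;

forAll(faceCentres, faceI)
{
    xMin = min(xMin, faceCentres[faceI].x());
    xMax = max(xMax, faceCentres[faceI].x());
}

reduce(xMin, minOp<scalar>());
reduce(xMax, maxOp<scalar>());

// 2) gMin/gMax: on a processor whose share of the patch is empty, the local
//    min()/max() hit the "empty field, returning zero" branch, and that zero
//    then enters the global reduction
const scalarField xCoords(faceCentres.component(vector::X));
scalar xMinG = gMin(xCoords);
scalar xMaxG = gMax(xCoords);

Pout<< "loop+reduce: xMin " << xMin  << ", xMax " << xMax  << nl
    << "gMin/gMax:   xMin " << xMinG << ", xMax " << xMaxG << endl;
The first version reports the expected 0.1 and 0.2; the second gives the zero minimum shown in the output above, because the spurious zero from the empty-field branch wins the min reduction.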
August 1, 2014, 04:29 | #2
Senior Member
Kyle Mooney
Join Date: Jul 2009
Location: San Francisco, CA USA
Posts: 323
Rep Power: 18
At OFW9 Hrv mentioned spotting a rather drastic bug in a few parallel aspects of the VOF code. He didn't go into much detail, but perhaps you two have discovered the same problem! Regardless, this should be something we can sort out; a proper MPI reduce isn't the most complicated thing out there.
Cheers,
Kyle
August 1, 2014, 07:22 | #3
Senior Member
Pablo Higuera
Join Date: Jan 2011
Location: Auckland
Posts: 627
Rep Power: 19
Hi Kyle,
I also think the issue I reported is pretty straightforward to fix. I have been doing some tests, and apparently only two minor changes are needed in /foam-extend-3.1/src/foam/fields/Fields/Field/FieldFunctions.C, starting at line 310:
Code:
template<class Type>
Type max(const UList<Type>& f)
{
    if (f.size())
    {
        Type Max(f[0]);
        TFOR_ALL_S_OP_FUNC_F_S(Type, Max, =, max, Type, f, Type, Max)
        return Max;
    }
    else
    {
        /*
        WarningIn("max(const UList<Type>&)")
            << "empty field, returning zero" << endl;

        return pTraits<Type>::zero;
        */
        return pTraits<Type>::min;
    }
}

TMP_UNARY_FUNCTION(Type, max)

template<class Type>
Type min(const UList<Type>& f)
{
    if (f.size())
    {
        Type Min(f[0]);
        TFOR_ALL_S_OP_FUNC_F_S(Type, Min, =, min, Type, f, Type, Min)
        return Min;
    }
    else
    {
        /*
        WarningIn("min(const UList<Type>&)")
            << "empty field, returning zero" << endl;

        return pTraits<Type>::zero;
        */
        return pTraits<Type>::max;
    }
}

TMP_UNARY_FUNCTION(Type, min)
These modifications work for me. However, I don't know whether they have any side effects, as Bernhard pointed out here: http://www.cfd-online.com/Forums/ope...llel-runs.html , but I believe they shouldn't. If this is the solution, that's one bug less!

Best,
Pablo
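P.S. A short, hedged illustration of why returning the identity elements fixes the global functions (patchField below is just a stand-in for any local scalarField; gMax behaves roughly like "take the local maximum, then reduce across processors with maxOp"):
Code:
// Hedged illustration, not the library implementation.

scalar localMax = patchField.size()
  ? max(patchField)   // genuine local maximum on this processor
  : -GREAT;           // empty list: a value that can never win the reduction,
                      // playing the role of the patched pTraits<Type>::min

reduce(localMax, maxOp<scalar>());

// With the old "return pTraits<Type>::zero" behaviour, a processor holding no
// patch faces injects 0 into the reduction, which is why the gMin result in
// the first post comes out as 0 instead of 0.1.
The same reasoning applies to min with pTraits<Type>::max: an empty contribution can never beat a genuine minimum.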
Tags |
bug, foam extend 3.1, gmax, gmin, parallel |