[swak4Foam] Bug in groovyBC in parallel computation |
February 11, 2012, 08:15
Bug in groovyBC in parallel computation
#1
Member
Greetings, dear Foamers.
I have found a bug that seems to be connected with groovyBC: if the "max" and "min" functions are used in groovyBC expressions, parallel runs of simpleFoam and pisoFoam fail. The bug is specific to OF-1.6-ext; I tried the same case in OF-2.1.0 (with the changes required for that version) and everything worked fine. I attach an example case: simpleFoam runs it without problems on a single core but fails when run in parallel.
Best regards, Aleksey.
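The kind of groovyBC entry that triggers this is one that reduces over the whole patch with min()/max(). Below is a minimal sketch of a parabolic inlet in 0/U; the values are illustrative only and not the exact expression from the attached case:

Code:
inlet
{
    type            groovyBC;
    // yp holds the patch face-centre heights; min()/max() reduce over the whole patch
    variables       "yp=pos().y;minY=min(yp);maxY=max(yp);";
    // parabolic profile with a peak of 1 m/s, directed into the domain
    valueExpression "-normal()*4*(maxY-pos().y)*(pos().y-minY)/pow(maxY-minY,2)";
    value           uniform (0 0 0);
}

In serial the reductions see the whole patch; in a decomposed case each processor only sees its own share of the patch faces, which is where the parallel behaviour can differ.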
February 11, 2012, 09:51
#2
Retired Super Moderator
Bruno Santos
Join Date: Mar 2009
Location: Lisbon, Portugal
Posts: 10,981
Blog Entries: 45
Rep Power: 128
Greetings Aleksey,
Bernhard might not fix this if it's not reported on the dedicated bug tracker for swak4Foam: http://sourceforge.net/apps/mantisbt...?project_id=10
Make sure that you pick swak4Foam as the project when you report it, since it is embedded in Extend's main project.
On another note: I think I saw swak4Foam being integrated directly into 1.6-ext's git repo... have you seen, or are you using, that version?
Best regards, Bruno
__________________
Last edited by wyldckat; February 12, 2012 at 14:44. Reason: sorry, misspelled your name...
February 11, 2012, 10:30
#3
Member
Thank you for your reply, Bruno.
I've submitted bug report #123: https://sourceforge.net/apps/mantisb...iew.php?id=123
swak4Foam was checked out with:
svn checkout https://openfoam-extend.svn.sourcefo...ies/swak4Foam/
and OF-1.6-ext was cloned with:
git clone git://openfoam-extend.git.sourceforge.net/gitroot/openfoam-extend/OpenFOAM-1.6-ext
but it contained neither swak4Foam nor groovyBC.
Best regards, Aleksey.
February 11, 2012, 10:41
#4
Retired Super Moderator
Bruno Santos
Join Date: Mar 2009
Location: Lisbon, Portugal
Posts: 10,981
Blog Entries: 45
Rep Power: 128
Hi Aleksey,
The swak4Foam integration is on a feature branch that is not merged into the default branch, so it does not show up after a plain clone. You can merge it into your clone and then build it with:

Code:
git merge origin/bgschaid/feature/swak4Foam

Code:
wmake all src/swak4Foam
wmake all applications/utilities/swak4Foam

Bruno
__________________
February 11, 2012, 11:20
#5
Member
I've checked the attached case with the branched swak4Foam. The behaviour is the same: simpleFoam works fine on a single core and fails in a parallel run.
Best regards, Aleksey.
February 11, 2012, 15:52
#6
Senior Member
Philippose Rajan
Join Date: Mar 2009
Location: Germany
Posts: 552
Rep Power: 25
Hi and a Good Evening :-)!
The issue you are seeing has to do with the OpenFOAM-1.6-ext version of OpenFOAM. I posted it as a bug report quite a while ago under the groovyBC topic, but after discussing it with Bernhard it turned out to be a problem in the "-ext" version itself, specifically to do with "pTraits".
However, can you give me some more details about the crash you are getting with simpleFoam? I use groovyBC with simpleFoam for parallel simulations quite often. Although I get a warning that "min" or "max" returned a zero and that the average will be taken instead (I can't remember the exact wording), the simulations run fine and the results are good.
By the way, I am using the latest version of OpenFOAM-1.6-ext from the Git repository.
Have a nice day!
Philippose
February 12, 2012, 08:26
#7
Member
Thank you for your reply.
Honestly, the logs are not informative. I attach the log and the terminal output. By the way, does the case attached in this thread run on your system?
Best regards, Aleksey.
February 12, 2012, 14:41
#8
Senior Member
Philippose Rajan
Join Date: Mar 2009
Location: Germany
Posts: 552
Rep Power: 25
Hi again,
I just tried the case you posted in this thread with OpenFOAM-1.6-ext, and you are right: it does not work.
Initially I thought it might be because the decomposition also cut the patches into different domains, but I don't think that is the case. At least the decomposition I got from "decomposePar" kept each patch intact in one of the two parts.
I hope this post catches Bernhard's attention; he may have something more to say about the issue. I think he is already aware of it.
Sorry I could not help further. Have a nice day ahead :-)!
Philippose
February 13, 2012, 20:01
#9
Assistant Moderator
Bernhard Gschaider
Join Date: Mar 2009
Posts: 4,225
Rep Power: 51
There are good reasons why this branch is currently not merged into the default branch (I only put it there for discussion). The swak sources on that branch don't differ from regular swak4Foam, so using it would not fix the problem discussed here.
February 13, 2012, 20:04
#10
Assistant Moderator
Bernhard Gschaider
Join Date: Mar 2009
Posts: 4,225
Rep Power: 51
Quote:
September 13, 2012, 03:40
#11
Member
Join Date: Apr 2012
Location: Trivandrum
Posts: 37
Rep Power: 14
Hi!
I am getting the same error with the simpleFoam solver of OpenFOAM-1.6-ext when using the parabolicVelocity boundary condition. My case is a simple laminar pipe flow with a constriction, and I am trying to run it on an 8-core CPU.
The case was tested on single and multiple cores with a surfaceNormalFixedValue BC and works. With the parabolicVelocity boundary condition the undecomposed run also works fine, but when I decompose it and run in parallel it ends with errors similar to the ones mentioned above:

Time = 1

DILUPBiCG: Solving for Ux, Initial residual = 1, Final residual = 0.0105602, No Iterations 2
DILUPBiCG: Solving for Uy, Initial residual = 1, Final residual = 0.0903589, No Iterations 1
DILUPBiCG: Solving for Uz, Initial residual = 1, Final residual = 0.0875971, No Iterations 1
[entropy:2277] *** An error occurred in MPI_Recv
[entropy:2277] *** on communicator MPI_COMM_WORLD
[entropy:2277] *** MPI_ERR_TRUNCATE: message truncated
[entropy:2277] *** MPI_ERRORS_ARE_FATAL (your MPI job will now abort)
--------------------------------------------------------------------------
mpirun has exited due to process rank 0 with PID 2277 on node entropy exiting without calling "finalize". This may have caused other processes in the application to be terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
[entropy:02276] 1 more process has sent help message help-mpi-errors.txt / mpi_errors_are_fatal
[entropy:02276] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages

I have swak4Foam installed, but it is not used in this particular case, so I think this error is not related to groovyBC.
Any advice would be greatly appreciated, and thanks a lot for your time.
Regards, Jabir
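For reference, the inlet entry in 0/U for this BC looks roughly like the sketch below. It is written from memory of the parabolicVelocity condition shipped with 1.6-ext, and the direction vectors and peak value are only placeholders, not the actual settings of this case:

Code:
inlet
{
    type            parabolicVelocity;
    n               (0 0 1);          // flow direction
    y               (0 1 0);          // coordinate across which the profile varies
    maxValue        0.1;              // peak (centreline) velocity
    value           uniform (0 0 0);  // initial placeholder value
}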
September 13, 2012, 05:36
#12
Assistant Moderator
Bernhard Gschaider
Join Date: Mar 2009
Posts: 4,225
Rep Power: 51
Quote:
I don't think it can be the parabolic BC, because the velocity has already been solved successfully; the problem seems to occur during the pressure solution. Are you using an AMG-type solver for that? Just for testing, replace it with one of the CG solvers. Other than that, in my experience this kind of error usually comes from an inconsistently compiled OF version (for instance after an update where only some libraries were recompiled). It sounds like snake oil, but sometimes a complete recompilation helps.
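To make the suggested test concrete, it amounts to editing the p entry in system/fvSolution. A sketch with typical values (the tolerances are illustrative only, not prescribed anywhere in this thread):

Code:
p
{
    // if an AMG-type solver such as GAMG is configured here,
    // switch temporarily to a CG solver for the test:
    solver          PCG;
    preconditioner  DIC;
    tolerance       1e-06;
    relTol          0.01;
}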
__________________
Note: I don't use "Friend"-feature on this forum out of principle. Ah. And by the way: I'm not on Facebook either. So don't be offended if I don't accept your invitation/friend request
September 13, 2012, 08:15
#13
Member
Join Date: Apr 2012
Location: Trivandrum
Posts: 37
Rep Power: 14
Thanks for your quick response.
Here are the solver settings from my fvSolution:

Code:
solvers
{
    p
    {
        solver          PCG;
        preconditioner  DIC;
        tolerance       1e-06;
        relTol          0.01;
    };

    U
    {
        solver          PBiCG;
        preconditioner  DILU;
        tolerance       1e-05;
        relTol          0.1;
    };
}

Thanks a lot for your time.
Regards, Jabir

Last edited by toolpost; September 14, 2012 at 00:01. Reason: typos
September 13, 2012, 11:56
#14
Assistant Moderator
Bernhard Gschaider
Join Date: Mar 2009
Posts: 4,225
Rep Power: 51
Quote:
You mentioned the parabolic BC before; that is the last thing I can think of. Replace it with a normal fixedValue. If the simulation then runs OK, the min/max in parallel is probably the problem.
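A sketch of that test swap in 0/U (the velocity value is only a placeholder):

Code:
inlet
{
    // temporary replacement for the parabolicVelocity entry
    type            fixedValue;
    value           uniform (0 0 0.1);
}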
__________________
Note: I don't use "Friend"-feature on this forum out of principle. Ah. And by the way: I'm not on Facebook either. So don't be offended if I don't accept your invitation/friend request
September 14, 2012, 00:01
#15
Member
Join Date: Apr 2012
Location: Trivandrum
Posts: 37
Rep Power: 14
Good morning!
Quote:
Thanks for your time and consideration.
Regards, Jabir
September 15, 2012, 10:05
#16
Retired Super Moderator
Bruno Santos
Join Date: Mar 2009
Location: Lisbon, Portugal
Posts: 10,981
Blog Entries: 45
Rep Power: 128
Greetings to all!
@Jabir: Can you provide an example based on one of the tutorial cases? This way we can more easily try and replicate the problem you're getting. Best regards, Bruno
__________________
September 15, 2012, 12:45
#17
Member
Join Date: Apr 2012
Location: Trivandrum
Posts: 37
Rep Power: 14
Good evening!
Yes. I tried the pitzDaily case with a laminar flow assumption, and the same errors appeared again. The case file is attached; please have a look at the 0/U file. Both inlet BCs are fine in a single-core simulation, but in a parallel run the parabolicVelocity one crashes as described above.
Thanks for your help.
Regards, Jabir
September 16, 2012, 05:22
#18
Retired Super Moderator
Bruno Santos
Join Date: Mar 2009
Location: Lisbon, Portugal
Posts: 10,981
Blog Entries: 45
Rep Power: 128
Hi Jabir,
OK, only now did I come to the conclusion that the problem is with the "parabolicVelocity" BC itself, not with groovyBC. Therefore, I think this should be reported at http://sourceforge.net/apps/mantisbt...?project_id=11, namely under the project "OpenFOAM-ext release", along with the test case you provided!
As for a quick fix, have a look at the tutorial "incompressible/simpleFoam/pitzDailyExptInlet", where the folder "constant/boundaryData/inlet/" defines the values at the "inlet" for each point on the patch. It may be a bit annoying to define them manually or with the help of another utility, but for now this would be the quickest solution.
Best regards, Bruno
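For reference, the pitzDailyExptInlet approach uses the timeVaryingMappedFixedValue BC, which reads per-point values from constant/boundaryData. A rough sketch of what that looks like (the directory names follow the tutorial; the setAverage entry is listed here from memory):

Code:
// 0/U
inlet
{
    type            timeVaryingMappedFixedValue;
    setAverage      off;
    // values are read from constant/boundaryData/inlet/
}

// constant/boundaryData/inlet/points : list of point locations on the patch
// constant/boundaryData/inlet/0/U    : velocity value for each of those points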
__________________
September 18, 2012, 00:40
#19
Member
Join Date: Apr 2012
Location: Trivandrum
Posts: 37
Rep Power: 14
Good morning Bruno!
Thanks a lot for your help and suggestions. So the problem comes from the parabolicVelocity BC, not from groovyBC. I shall post a bug report at the link you provided.
Thanks again for your time and help.
Regards, Jabir
September 18, 2012, 08:50
#20
Assistant Moderator
Bernhard Gschaider
Join Date: Mar 2009
Posts: 4,225
Rep Power: 51
Quote:
There is an example dictionary that comes with swak4Foam.
__________________
Note: I don't use "Friend"-feature on this forum out of principle. Ah. And by the way: I'm not on Facebook either. So don't be offended if I don't accept your invitation/friend request
Tags
bug, groovybc, parallel, pisofoam, simplefoam