November 28, 2012, 13:40
running viewFactor model in parallel

#1
Member
Yuri Feldman
Join Date: Mar 2011
Posts: 30
Rep Power: 15
Hi Foamers,
Can anybody guide me through the setup needed to run the viewFactor radiation model in parallel?

Many thanks,
Yuri
April 29, 2014, 12:18
viewfactors in parallel?

#2
Senior Member
Thomas Jung
Join Date: Mar 2009
Posts: 102
Rep Power: 17
First: sorry for the duplicate posting - I accidentally posted the same question as a reply to an older thread in the OpenFOAM forum instead of Running, Solving & CFD.

My question is: has anybody successfully run S2S radiation problems using the viewFactor method in parallel? I am failing because the file "finalAgglom" is missing in the processor subdirectories, and I have no idea how to decompose or distribute it. Thanks a lot for any hint!

Edit - found out myself, but since I have seen others asking the same: the trick is to also run faceAgglomerate and viewFactorsGen in parallel, after decomposing the case, and with the same number of processes.

Last edited by tehache; April 30, 2014 at 05:05.
April 30, 2014, 14:47

#3
Retired Super Moderator
Bruno Santos
Join Date: Mar 2009
Location: Lisbon, Portugal
Posts: 10,981
Blog Entries: 45
Rep Power: 128
Greetings Thomas,

[quoted text not preserved in this archive]

Best regards,
Bruno
June 19, 2021, 14:05

#4
New Member
Arash
Join Date: Dec 2015
Location: Vienna
Posts: 15
Rep Power: 11
Dear Thomas @tehache, dear Yuri @feldy77,

I have the same problem while running chtMultiRegionSimpleFoam with radiation in parallel. After decomposing the domain with, for instance, 2 cores, I launched faceAgglomerate and viewFactorsGen in parallel like this:

mpirun -np 2 faceAgglomerate -region air -dict constant/viewFactorsDict -parallel
mpirun -np 2 viewFactorsGen -region air -parallel

but I got the following error. The message is not clear to me and I cannot tell where the problem is. Do you have any idea how to solve it?

#0  Foam::error::printStack(Foam::Ostream&) at ??:?
[1] #1  Foam::IOerror::exitOrAbort(int, bool) at ??:?
[0] #1  Foam::IOerror::exitOrAbort(int, bool) at ??:?
[1] #2  Foam::radiation::solidAbsorption::solidAbsorption(Foam::dictionary const&, Foam::polyPatch const&) at ??:?
[0] #2  Foam::radiation::solidAbsorption::solidAbsorption(Foam::dictionary const&, Foam::polyPatch const&) at ??:?
[0] #3  Foam::radiation::wallAbsorptionEmissionModel::adddictionaryConstructorToTable<Foam::radiation::solidAbsorption>::New(Foam::dictionary const&, Foam::polyPatch const&) at ??:?
[0] #4  Foam::radiation::wallAbsorptionEmissionModel::New(Foam::dictionary const&, Foam::polyPatch const&) at ??:?
[0] #5  Foam::radiation::opaqueDiffusive::opaqueDiffusive(Foam::dictionary const&, Foam::polyPatch const&) at ??:?
[0] #6  Foam::radiation::boundaryRadiationPropertiesPatch::adddictionaryConstructorToTable<Foam::radiation::opaqueDiffusive>::New(Foam::dictionary const&, Foam::polyPatch const&) at ??:?
[0] #7  Foam::radiation::boundaryRadiationPropertiesPatch::New(Foam::dictionary const&, Foam::polyPatch const&) at ??:?
[0] #8  Foam::radiation::boundaryRadiationProperties::boundaryRadiationProperties(Foam::fvMesh const&) at ??:?
[0] #9  Foam::radiation::boundaryRadiationProperties const& Foam::MeshObject<Foam::fvMesh, Foam::GeometricMeshObject, Foam::radiation::boundaryRadiationProperties>::New<>(Foam::fvMesh const&) at ??:?
[0] #10 Foam::radiation::viewFactor::calculate() at ??:?
[0] #11 Foam::radiation::radiationModel::correct() at ??:?
[0] #12 ? at ??:?
[1] #13 __libc_start_main in /lib/x86_64-linux-gnu/libc.so.6
[1] #14 ? at ??:?
[0] #13 __libc_start_main in /lib/x86_64-linux-gnu/libc.so.6
[0] #14 ? at ??:?
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 1 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
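Reading the backtrace: the failure is an IOerror raised while constructing the solidAbsorption model from the boundary radiation dictionary (frames #1-#8), not in the MPI layer itself, which usually points at a missing or incomplete entry in the region's constant/<region>/boundaryRadiationProperties file. A heavily hedged sketch of what such an entry can look like - the patch selector, model names, and nesting here are assumptions based on the class names in the trace, and the exact keywords vary between OpenFOAM versions, so check the dictionary against a working chtMultiRegion tutorial of your own version:

```
// constant/air/boundaryRadiationProperties -- ILLUSTRATIVE ONLY,
// keywords and structure vary between OpenFOAM versions
".*"                                  // patch selector (regex), hypothetical
{
    type            opaqueDiffusive;  // matches opaqueDiffusive in the trace

    wallAbsorptionEmissionModel
    {
        type        solidAbsorption;  // matches solidAbsorption in the trace
    };
}
```

If the entry exists in the serial case but the error only appears in parallel, it is also worth verifying that the file is actually visible from each processor directory after decomposition.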