Problem with parallelization and a constant field located at constant folder |
|
November 12, 2019, 22:41 |
Problem with parallelization and a constant field located at constant folder
|
#1 |
New Member
Join Date: Sep 2019
Posts: 18
Rep Power: 7 |
Hi Foamers,
I have the following problem: I am solving MHD with mhdFoam for a cylinder with an external constant magnetic field. I had to modify the solver to take in account this external and constant field. The idea was to split up B in one part fixed in time BFixed(r) and other variable in time and space BVble(r,t). As BFixed is the same every time step, I located it in constant folder. Otherwise it would be writing the same file always in every time directory. The solver works good but the problem is when I want to divide the work in several subdomains. I wrote decomposePar, foamJob mpirun -np 2 cylinderMhdFoam -parallel &. But, the BFixed is not parallelized for some reason: Code:
[0]
[0]
[0] --> FOAM FATAL ERROR:
[0] cannot find file "/home/foamusr/localfolder/tfg/cylinder/processor0/constant/BFixed"
[0]
[0]     From function virtual Foam::autoPtr<Foam::ISstream> Foam::fileOperations::uncollatedFileOperation::readStream(Foam::regIOobject&, const Foam::fileName&, const Foam::word&, bool) const
[0]     in file global/fileOperations/uncollatedFileOperation/uncollatedFileOperation.C at line 538.
[0]
FOAM parallel run exiting
[0]
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI COMMUNICATOR 3 SPLIT FROM 0
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
[1]
[1]
[1] --> FOAM FATAL ERROR:
[1] cannot find file "/home/foamusr/localfolder/tfg/cylinder/processor1/constant/BFixed"
[1]
[1]     From function virtual Foam::autoPtr<Foam::ISstream> Foam::fileOperations::uncollatedFileOperation::readStream(Foam::regIOobject&, const Foam::fileName&, const Foam::word&, bool) const
[1]     in file global/fileOperations/uncollatedFileOperation/uncollatedFileOperation.C at line 538.
[1]
FOAM parallel run exiting
[1]
[7cafdb3f0a70:00343] 1 more process has sent help message help-mpi-api.txt / mpi-abort
[7cafdb3f0a70:00343] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
Thanks in advance |
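For context, a minimal sketch of how such a fixed field is typically declared in a createFields.H-style file when it is read from the constant folder (this is an assumed illustration, not the actual cylinderMhdFoam code). The instance argument of the IOobject is what points the read at constant/, and decomposePar only decomposes fields found in time directories, so processorN/constant/BFixed is never created and MUST_READ fails in parallel. Code:
// Hypothetical declaration of a time-constant background field (illustrative only).
volVectorField BFixed
(
    IOobject
    (
        "BFixed",
        runTime.constant(),   // instance = constant/ -> NOT decomposed by decomposePar
        mesh,
        IOobject::MUST_READ,  // fails in parallel: processorN/constant/BFixed is missing
        IOobject::NO_WRITE    // avoids re-writing the same file into every time directory
    ),
    mesh
);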
|
November 17, 2019, 16:05 |
|
#2 |
New Member
Join Date: Sep 2019
Posts: 18
Rep Power: 7 |
The parallelization problem with a field located in the constant folder cannot be solved (at least not easily). So I found a much simpler workaround: just make the solver read the field located in time folder 0 every time step. Since decomposePar decomposes the fields in the 0 directory, the field is then available to every processor, and the result is the same.
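A minimal sketch of what this workaround could look like in createFields.H, assuming the run starts from time 0 (the declaration is illustrative; the poster's actual code is not shown in this thread). Code:
// Hypothetical workaround: keep BFixed in the 0/ directory, which decomposePar
// does decompose, and read it from there instead of from constant/.
volVectorField BFixed
(
    IOobject
    (
        "BFixed",
        runTime.timeName(),   // "0" at start-up -> processorN/0/BFixed exists after decomposePar
        mesh,
        IOobject::MUST_READ,
        IOobject::NO_WRITE    // keeps the field from being written into later time directories
    ),
    mesh
);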
|
|
July 17, 2020, 05:31 |
|
#3 |
Member
Join Date: Sep 2018
Location: France
Posts: 62
Rep Power: 8 |
Quote:
I am not sure I understand what you are saying. I have the same issue: the solver needs to read a field located in the constant folder, which is not included when I decompose the case. How do you force the solver to read this field? Cheers. |
|
January 26, 2021, 23:36 |
|
#4 |
New Member
Join Date: Jan 2021
Location: Edmonton
Posts: 4
Rep Power: 5 |
Hello, I am adapting an immiscible two-phase flow solver for porous media to my own case. The solver runs fine in serial but not in parallel.
Solver compilation and the preliminary steps do not show any errors, and no error appears for "setFields" or "decomposePar". The error appears only when running the solver in parallel (please see below). I believe the error has to do with the permeability "K" (a constant cell-centered scalar field) being in the "constant" folder: when I run "decomposePar", the application does not decompose "K" into the processor directories. Any feedback you could give me would be very much appreciated. Thank you very much for your attention and all the very best.
Sebastian

Reading porosity field eps (if present)
Reading permeability field K
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 3 in communicator MPI COMMUNICATOR 3 SPLIT FROM 0
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
[0]
[0]
[0] --> FOAM FATAL ERROR:
[0] cannot find file "/home/lopezsaa/OpenFOAM/lopezsaa-8/run/porousMultiphaseFoam-openfoam-v8/tutorials/lcl-tutorials/two_phase/paSaBCAnisob5PARALLEL/processor0/constant/K"
[0]
[0]     From function virtual Foam::autoPtr<Foam::ISstream> Foam::fileOperations::uncollatedFileOperation::readStream(Foam::regIOobject&, const Foam::fileName&, const Foam::word&, bool) const
[0]     in file global/fileOperations/uncollatedFileOperation/uncollatedFileOperation.C at line 538.
[0]
FOAM parallel run exiting
[0]
[1]
[1]
[1] --> FOAM FATAL ERROR:
[1] cannot find file "/home/lopezsaa/OpenFOAM/lopezsaa-8/run/porousMultiphaseFoam-openfoam-v8/tutorials/lcl-tutorials/two_phase/paSaBCAnisob5PARALLEL/processor1/constant/K"
[1]
[1]     From function virtual Foam::autoPtr<Foam::ISstream> Foam::fileOperations::uncollatedFileOperation::readStream(Foam::regIOobject&, const Foam::fileName&, const Foam::word&, bool) const
[1]     in file global/fileOperations/uncollatedFileOperation/uncollatedFileOperation.C at line 538.
[1]
FOAM parallel run exiting
[1]
[2]
[2]
[2] --> FOAM FATAL ERROR:
[2] cannot find file "/home/lopezsaa/OpenFOAM/lopezsaa-8/run/porousMultiphaseFoam-openfoam-v8/tutorials/lcl-tutorials/two_phase/paSaBCAnisob5PARALLEL/processor2/constant/K"
[2]
[2]     From function virtual Foam::autoPtr<Foam::ISstream> Foam::fileOperations::uncollatedFileOperation::readStream(Foam::regIOobject&, const Foam::fileName&, const Foam::word&, bool) const
[2]     in file global/fileOperations/uncollatedFileOperation/uncollatedFileOperation.C at line 538.
[2]
FOAM parallel run exiting
[2]
[3]
[3]
[3] --> FOAM FATAL ERROR:
[3] cannot find file "/home/lopezsaa/OpenFOAM/lopezsaa-8/run/porousMultiphaseFoam-openfoam-v8/tutorials/lcl-tutorials/two_phase/paSaBCAnisob5PARALLEL/processor3/constant/K"
[3]
[3]     From function virtual Foam::autoPtr<Foam::ISstream> Foam::fileOperations::uncollatedFileOperation::readStream(Foam::regIOobject&, const Foam::fileName&, const Foam::word&, bool) const
[3]     in file global/fileOperations/uncollatedFileOperation/uncollatedFileOperation.C at line 538.
[3]
FOAM parallel run exiting
[3]

Last edited by saavedra00; January 27, 2021 at 22:51. Reason: Better describe my problem. |
|
January 28, 2021, 05:55 |
|
#5 |
Senior Member
Join Date: Dec 2019
Location: Cologne, Germany
Posts: 369
Rep Power: 8 |
Did you modify anything in the solver code?
How do you create this K on the solver side? In the declaration volScalarField K ( IOobject ... ), where does the IOobject read from? More input is required ... |
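For reference, this is the kind of declaration being asked about, as a minimal sketch with illustrative names (the actual solver code may differ); the second argument of the IOobject, the so-called instance, is the directory the field is read from. Code:
volScalarField K
(
    IOobject
    (
        "K",                  // file name of the field
        runTime.constant(),   // instance: the directory to read from (constant/ here)
        mesh,
        IOobject::MUST_READ,  // abort if the file cannot be found
        IOobject::NO_WRITE
    ),
    mesh
);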
|
January 28, 2021, 12:12 |
|
#6 |
New Member
Join Date: Jan 2021
Location: Edmonton
Posts: 4
Rep Power: 5 |
Hello geth03.
After consulting some experts I was able to solve the problem. The issue was that the volScalarField K was being read from the "constant" folder. I learned that one shouldn't read fields from the constant folder - they belong in the time directory, which is what decomposePar decomposes. So I just had to modify the IOobject for K in "createFields.H" so that it is read from the time directory instead of from constant (see the sketch below).
Thanks a lot for offering help! |
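For later readers, a minimal sketch of the kind of change described above, assuming a standard createFields.H declaration; only the instance argument of the IOobject changes, and everything apart from the field name K is illustrative. Code:
// Before (assumed): K is read from constant/, which decomposePar does not decompose.
volScalarField K
(
    IOobject
    (
        "K",
        runTime.constant(),   // reads case/constant/K -> missing in processorN/constant/
        mesh,
        IOobject::MUST_READ,
        IOobject::NO_WRITE
    ),
    mesh
);

// After (assumed): K is read from the current time directory (0/ at start-up),
// which decomposePar copies into every processorN/0/ directory.
volScalarField K
(
    IOobject
    (
        "K",
        runTime.timeName(),   // reads case/0/K -> present in processorN/0/ after decomposePar
        mesh,
        IOobject::MUST_READ,
        IOobject::NO_WRITE
    ),
    mesh
);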
|
January 29, 2021, 02:35 |
|
#7 |
Senior Member
Join Date: Dec 2019
Location: Cologne, Germany
Posts: 369
Rep Power: 8 |
No problem,
I assumed that K was taken from the constant folder; that is why I asked how you read it and instantiate it. Anyway, glad you solved it already. |
|
January 29, 2021, 17:54 |
|
#8 |
Member
Ran
Join Date: Aug 2016
Posts: 69
Rep Power: 10 |
> runTime.constant(),
What if it were runTime.monkey() instead? What does this line of code really do?
__________________
Yours in CFD, Ran |
|
Tags |
cannot find, foam fatal error, magnetic field, mhdfoam, parallelization |
|
|