
Error forAll cycle in parallel

October 26, 2020, 10:50
Error forAll cycle in parallel (and interFoam diffusion correction)
#1
New Member
 
Giovanni Farina
Join Date: Nov 2019
Location: Trieste, Italy
Posts: 11
Summary of the thread:
Premise: I had unwanted diffusion of a concentration field in interFoam from one phase (water) into the other (air). There are different ways to address this; the one that worked best for me is to cycle over the cells at and above the interface and move the concentration back into the water phase (in my case along the z-direction).


Main problem of the thread: the code did not work in parallel, so the discussion is relevant in general for using forAll loops in parallel runs.


Solution:

  • I decomposed the mesh manually so that I could walk through the cells along the z-direction of the relevant region without crossing a processor boundary; it is important never to access cells outside your own subdomain. Cell indexing is local to the subdomain, so the maximum index is the number of cells of that subdomain minus one. More details here: Manual decomposition using setFields. A sketch of the corresponding decomposeParDict entries is shown below.
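A minimal sketch of the decomposeParDict entries for a manual decomposition (the number of subdomains and the file name are only examples; the dataFile holds one processor label per cell):
Code:
// system/decomposeParDict (sketch only, FoamFile header omitted)
numberOfSubdomains  20;

method              manual;

manualCoeffs
{
    // plain labelList with one processor number per cell
    dataFile        "cellDecomposition";
}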


  • I couldn't understand what was happening because only the master processor prints output during the run. The error stayed hidden and it took me a long time to realise it. So, for debugging, my suggestion is to have the master processor (processor0) subdomain in a region relevant for the forAll cycle, printing indices and Infos so you can actually see what is happening (or print from every rank with Pout, as in the sketch below).
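For this kind of debugging it also helps that Pout, unlike Info, prints from every processor (prefixed with its rank). A minimal sketch, with a placeholder condition to limit the output:
Code:
forAll(conc, i)
{
    if (mesh.C()[i][2] > 0.16) // example condition only, adapt to your region
    {
        Pout<< "cell " << i
            << " centre " << mesh.C()[i]
            << " conc "   << conc[i] << endl;
    }
}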


  • I used a Pstream condition to make sure that I am on the right processors, but I don't think this is strictly necessary. You can also send data to the master processor for global operations, although I didn't need to do that (a minimal sketch of such a reduction follows the code below).

Code:
if (Pstream::myProcNo() == 0 || Pstream::myProcNo() == 18) { /* ... */ }
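If you do need a global operation, here is a minimal sketch (not something I needed in the end) of summing a per-processor quantity over all ranks with reduce():
Code:
// total concentration "mass" on this processor ...
scalar localMass = 0.0;
forAll(conc, i)
{
    localMass += conc[i]*mesh.V()[i];
}

// ... summed over all processors; every rank ends up with the global value
reduce(localMass, sumOp<scalar>());
Info<< "Global concentration mass = " << localMass << endl;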

  • Then I changed my code to simply:

Code:
forAll(conc, i)
{
    // The region in which I want to make the correction: air and interface
    if
    (
        mesh.C()[i][0] > 1.805
     && mesh.C()[i][2] > 0.16
     && mesh.C()[i][2] < 0.19
     && alpha1[i] < 0.49
    )
    {
        const labelList& neighbours = mesh.cellCells()[i];  // list of neighbouring cells
        const label neighbourCount = neighbours.size();     // number of neighbours (kept for reference, not used below)

        // Can't get more than the initial concentration, which is one
        if ((conc[neighbours[0]]*mesh.V()[neighbours[0]] + conc[i]*mesh.V()[i])/mesh.V()[neighbours[0]] < 1)
        {
            // Add the concentration to the cell below ...
            conc[neighbours[0]] =
                (conc[neighbours[0]]*mesh.V()[neighbours[0]] + conc[i]*mesh.V()[i])/mesh.V()[neighbours[0]];

            // ... and remove it from the cell above
            conc[i] = 0;
        }
    }
}
I realised that the first neighbouring cell in the list (j = 0) is the one directly below, and knowing this I never need to index outside the processor subdomain. A more defensive variant that checks the neighbour really is the cell below is sketched here:
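For completeness, a sketch that does not assume the first entry of cellCells() is the cell below, but searches the neighbours for the one whose centre lies directly underneath (the tolerance is a placeholder):
Code:
const labelList& nbrs = mesh.cellCells()[i];

label below = -1;
forAll(nbrs, j)
{
    const vector d = mesh.C()[nbrs[j]] - mesh.C()[i];

    // same x and y (within a small tolerance), smaller z
    if (mag(d.x()) < SMALL && mag(d.y()) < SMALL && d.z() < 0)
    {
        below = nbrs[j];
        break;
    }
}

if (below != -1)
{
    // apply the same volume-weighted transfer as above, using 'below'
}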









--------------------------------------------------------------------------

Hi Foamers,
I have a diffusion problem in interFoam: my concentration field diffuses from the water into the air. I tried different remedies (the diffusion coefficient already depends on alpha, roughly as in the sketch below) and in the end I opted for a drastic one: reassigning the concentration from air cells to the cells immediately below them. I thought this would work in parallel (in serial it works perfectly fine), since subdomains are usually not split vertically. That was true for some decompositions but not for others, so I implemented a manual decomposition to avoid the issue.
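What I mean by an alpha-dependent diffusion coefficient, as a minimal sketch (the diffusivity names and values are placeholders, and conc is my passive concentration field):
Code:
// placeholder molecular diffusivities for the two phases
dimensionedScalar Dwater("Dwater", dimensionSet(0, 2, -1, 0, 0, 0, 0), 1e-9);
dimensionedScalar Dair  ("Dair",   dimensionSet(0, 2, -1, 0, 0, 0, 0), 1e-12);

// phase-weighted diffusivity: (almost) no diffusion on the air side
volScalarField Dconc("Dconc", alpha1*Dwater + (scalar(1) - alpha1)*Dair);

fvScalarMatrix concEqn
(
    fvm::ddt(conc)
  + fvm::div(phi, conc)
  - fvm::laplacian(Dconc, conc)
);
concEqn.solve();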


However, I do get an error when running in parallel, although not at the moment the correction is made but at the subsequent time step, as if reassigning the volScalarField made the data unreadable. Or maybe it's something else.

If I apply the correction at each writeTime (an integer number of seconds) I get strange errors such as "could not find nut" (the turbulent viscosity) in the processor18/1 folder, which look unrelated but are not. If I apply the correction to a portion of the domain that belongs to only one processor I get no errors, and the same holds if the portion spans two adjacent subdomains, or even four; it seems the problem only arises for some specific pairings of processor subdomains. Maybe there is an issue with the cells shared between subdomains and with writing/reading the fields there.
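One thing that might be related (this is only a guess, not something I have verified for this case): after modifying the internal field of conc by hand, the coupled processor patches still hold the old values until they are refreshed, which is normally done with:
Code:
// refresh processor (and other coupled) patch values after the manual edit
conc.correctBoundaryConditions();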


Edit: if I apply the correction to either half of the desired region it works, but applying it to the whole region gives the error, even if I do one half first and then the other.



In case anyone wants to suggest some other kind of correction: I also tried adding a fictitious downward "wind" to the concentration equation, but it simply destroyed concentration, whereas I want to conserve the concentration mass.


The error is the usual printStack backtrace:


Code:
[20] #0  Foam::error::printStack(Foam::Ostream&) at ??:?
[20] #1  Foam::sigFpe::sigHandler(int) at ??:?
[20] #2  ? in "/lib64/libc.so.6"
[20] #3  ? at ??:?
[20] #4  __libc_start_main at ??:? in "/lib64/libc.so.6"
[20] #5  ?
[28] #0  Foam::error::printStack(Foam::Ostream&) at ??:?
[28] #1  Foam::sigFpe::sigHandler(int) at ??:?
[28] #2  ? at ??:?
[28] #3  ? in "/lib64/libc.so.6"
[28] #4  __libc_start_main? in "/lib64/libc.so.6"
[28] #5  ? at ??:?
The correction code looks something like this (it works because I have a nicely structured mesh and I don't use AMR):
Code:
forAll(conc, i)
{
    // The region in which I want to make the correction: air and interface
    if
    (
        mesh.C()[i][0] > 1.80501
     && mesh.C()[i][2] > 0.161
     && mesh.C()[i][2] < 0.185
     && alpha1[i] < 0.49
    )
    {
        // Walk backwards through the cell labels until the cell directly
        // below is found (same x and y, smaller z)
        label index = 0;   // offset to the cell below (declared here for clarity)
        for (label j = 1; j <= 100000; j++)
        {
            if
            (
                mesh.C()[i-j][0] == mesh.C()[i][0]
             && mesh.C()[i-j][1] == mesh.C()[i][1]
             && mesh.C()[i-j][2] <  mesh.C()[i][2]
            )
            {
                index = j;
                break;
            }
        }

        // Can't get more than the initial concentration, which is one
        if ((conc[i-index]*mesh.V()[i-index] + conc[i]*mesh.V()[i])/mesh.V()[i-index] < 1)
        {
            conc[i-index] =
                (conc[i-index]*mesh.V()[i-index] + conc[i]*mesh.V()[i])/mesh.V()[i-index];  // add conc below
            conc[i] = 0;  // remove conc above
        }
    }
}

 

Info<< "Finished solving now for concentration\n" << endl;
        }
    runTime.write();

        Info<< "ExecutionTime = " << runTime.elapsedCpuTime() << " s"
            << "  ClockTime = " << runTime.elapsedClockTime() << " s"
            << nl << endl;
which is executed (I can see the Info output), and then at the beginning of the next cycle I have:
Code:
  #include "readDyMControls.H"

        if (LTS)
        {
            #include "setRDeltaT.H"
        }
        else
        {
            #include "CourantNo.H"
            #include "alphaCourantNo.H"
            #include "setDeltaT.H"
        }

        runTime++;

        Info<< "Time = " << runTime.timeName() << nl << endl;
I never see that last line printed, so I think the issue lies somewhere in here.

The general issue is also the one addressed in:
InterFoam: Add an equation to only one phase
Adding diffusion term to interFoam transport equation
InterfaceCompression interFoam
High diffusion in interFoam

I really appreciate any kind of help and feedback.


Giovanni

Last edited by the_ichthyologist; November 4, 2020 at 07:03. Reason: Update with my solution to the proposed issue for general utility

October 29, 2020, 06:53
#2
New Member
 
Giovanni Farina
Join Date: Nov 2019
Location: Trieste, Italy
Posts: 11
Some updates... After a closer analysis of the error I found that the run exits at the beginning of the cycle, during the calculation of the Courant number, specifically at this point:

Code:
scalarField sumPhi
    (
         fvc::surfaceSum(mag(phi))().primitiveField()
    );
I really don't see the connection between my correction of the volScalarField conc and the surfaceScalarField phi.

I also noticed that this is the most computationally costly operation in terms of execution time. This, together with the fact that the correction worked on either half but not on the whole region, suggests that my correction requires something more that the solver struggles to do. Maybe running through the cells in parallel invalidates something that then needs to be recalculated; I have no clue. It could also have something to do with the way Pstream handles the code.

More updates:
I tried correcting each half alternately, but now it does not run any more even on one half (I tried removing the processor folders and reinitialising the case).

I tried defining a sub-field, scalarField& concCells = const_cast<scalarField&>(conc().field());, but nothing changes.

I tried using the mesh.cellCells() method to find the neighbouring cells and loop over them, but it looks even more problematic.

I tried adding tolerances to the coordinate comparisons, looping in the opposite direction, and renumbering the mesh: nothing changes.

I tried running the correction without modifying conc (thus only accessing the cells) and I get the same error: just accessing the cells is enough to trigger the error involving phi.

I tried getting some text output (I wanted to see the indices), but I found that the solver does not even enter the first if. And yet it gives an error that depends on the if condition (that is where I set the boundaries of my "correction region").


I printed indices and coordinates: it looks as though the forAll only runs over the processor0 domain and therefore never satisfies the if condition. Or maybe that is just the output I am able to see. I think I can solve the whole issue with a proper use of the Pstream functions; if anyone knows them well, please help me. I could also look into them myself if the other alternatives don't work. A gather/scatter sketch follows the link below.

How to do communication across processors
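A minimal sketch of the kind of Pstream gather I have in mind (not something I have run; the gathered quantity is just an example):
Code:
// one slot per processor; each rank fills its own entry
List<scalar> procMax(Pstream::nProcs(), 0.0);
procMax[Pstream::myProcNo()] =
    conc.size() ? max(conc.primitiveField()) : 0.0;

Pstream::gatherList(procMax);   // master receives every processor's value
Pstream::scatterList(procMax);  // optional: broadcast the full list back

if (Pstream::master())
{
    Info<< "Per-processor max(conc) = " << procMax << endl;
}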


This all sounds very strange to me: it looks as if the forAll cycle does not run, and yet it gives an error that depends on its content. If anybody has even the slightest idea why this happens, please tell me. For now I am going back to the "wind" correction in the concentration equation.

Last edited by the_ichthyologist; November 1, 2020 at 05:11.


Tags
error, forall loop, parallel, printstack error



