|
August 15, 2018, 17:03 |
Wrong result for mass flux in parallel
|
#1 |
New Member
Join Date: Nov 2016
Posts: 15
Rep Power: 10 |
Hi folks,
my current case (channel flow, OF 2.1.x) works fine except for one point: I calculate the mass flux during run time via phi. It works fine on a single core, but in parallel the result is wrong. The code looks like this: Code:
scalar inletFlux = 0;
label inletPatch = mesh.boundaryMesh().findPatchID("inlet");
inletFlux = -1.0*gSum(phi.boundaryField()[inletPatch]);
Any guess or idea what could be the problem? Do you need more information? Last edited by mitti; August 29, 2018 at 16:43. |
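A minimal diagnostic sketch along these lines (an illustration only, assuming the usual fvCFD.H includes; not taken from the original post) prints each processor's local contribution, which makes it easier to see how the parallel sum splits up: Code:
label inletPatch = mesh.boundaryMesh().findPatchID("inlet");

// local (per-processor) contribution to the inlet flux
scalar localFlux = 0.0;
label localFaces = 0;

if (inletPatch != -1)
{
    localFlux  = sum(phi.boundaryField()[inletPatch]);
    localFaces = phi.boundaryField()[inletPatch].size();
}

// Pout writes from every processor, unlike Info (master only)
Pout<< "inlet faces: " << localFaces
    << ", local flux: " << localFlux << endl;

// global value: sum the local contributions over all processors
scalar inletFlux = localFlux;
reduce(inletFlux, sumOp<scalar>());
inletFlux *= -1.0;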
|
August 19, 2018, 10:02 |
|
#2 |
Senior Member
Michael Alletto
Join Date: Jun 2018
Location: Bremen
Posts: 616
Rep Power: 16 |
Can you provide a bit more information?
What mass flow is calculated on each processor, and what is calculated by summing it over the processors? findPatchID returns -1 if it does not find any patch with the specified name. I'm not sure what happens in a decomposed case when the patch you look for is not present on a processor and you do the gSum. Did you try to sum the mass flux phi with the function sum() only in case findPatchID returns a value different from -1, and to reduce the result over all processors afterwards? |
|
August 20, 2018, 18:05 |
|
#3 | ||
New Member
Join Date: Nov 2016
Posts: 15
Rep Power: 10 |
Hi Michael,
thank you for your reply. Quote:
The fun part is that the mass flux differs with the number of processors:
4 processors: . With values on three processors, one is zero.
8 processors: . (This is something I really don't understand.)
16 processors: . But only one processor has a value, all others are zero.
32 processors: . With different values on six processors.
Could these weird results be because I use scotch? I will try different decomposition methods. Quote:
Code:
scalar inletFlux = 0;
label inletPatch = mesh.boundaryMesh().findPatchID("inlet");

if (inletPatch == -1)
{
    Info << "Failure to find patch for mass flow calc" << endl;
}
else
{
    inletFlux = gSum(phi.boundaryField()[inletPatch]);
    inletFlux *= -1.0;
}
Code:
scalar inletFlux = 0;
label inletPatch = mesh.boundaryMesh().findPatchID("inlet");

if (inletPatch == -1)
{
    Info << "Failure to find patch for mass flow calc" << endl;
}
else
{
    inletFlux = sum(phi.boundaryField()[inletPatch]);
}

reduce(inletFlux, sumOp<scalar>());
inletFlux *= -1.0; |
|||
August 22, 2018, 04:42 |
|
#4 |
Senior Member
Michael Alletto
Join Date: Jun 2018
Location: Bremen
Posts: 616
Rep Power: 16 |
That's strange. What are the initial and boundary conditions? Do you have a constant pressure gradient which drives the flow? Did you try a newer version of OF?
|
|
August 22, 2018, 04:54 |
|
#5 |
Senior Member
Michael Alletto
Join Date: Jun 2018
Location: Bremen
Posts: 616
Rep Power: 16 |
By the way, there is a function object already available in OF which determines the mass flow at a specific patch. See the thread "how to calculate mass flow rate / simpleFoam".
Maybe you can compare your output with the output of the function object; this could give you a hint about what's going wrong. |
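A minimal sketch of such a function object entry for system/controlDict, assuming v1712-style syntax with the surfaceFieldValue function object (patch name "inlet" as in this case): Code:
functions
{
    inletMassFlow
    {
        type            surfaceFieldValue;
        libs            ("libfieldFunctionObjects.so");
        log             yes;
        writeControl    timeStep;
        writeInterval   1;
        writeFields     no;
        regionType      patch;
        name            inlet;
        operation       sum;
        fields          (phi);
    }
}
The reported sum of phi over the patch can then be compared directly against the run-time value computed in the solver.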
|
August 23, 2018, 13:31 |
|
#6 | |
New Member
Join Date: Nov 2016
Posts: 15
Rep Power: 10 |
Quote:
Anyway, calcMassFlow only works on a single core, and there it gives me the same result as my own calculation does on a single core. |
||
August 23, 2018, 13:36 |
|
#7 | |
New Member
Join Date: Nov 2016
Posts: 15
Rep Power: 10 |
Quote:
I do have a pressure gradient which drives the flow; it is added to the momentum equation. I have not yet tried a newer version, but I will do it today and keep you updated. |
||
August 28, 2018, 19:11 |
|
#8 |
New Member
Join Date: Nov 2016
Posts: 15
Rep Power: 10 |
Finally, I found enough time to give it a shot with OF v1606 and v1712. The result is the same for both versions; the only major difference is that it works for 2 and 4 processors, but as soon as I use 8 or more, it fails again, this time with different per-processor values than in OF 2.1.x.
However, while reading a bunch of stuff here, I came across this thread and the statement "The main problem could be that faces that are on a processor boundary could be counted twice (or not at all)". Also, the wiki for calcMassFlow states that the calculation of the mass flux in parallel is an issue. Nevertheless, all of these threads are relatively old. |
|
August 29, 2018, 07:44 |
|
#9 | |
Senior Member
Michael Alletto
Join Date: Jun 2018
Location: Bremen
Posts: 616
Rep Power: 16 |
Quote:
I just had a look at the initialization, and it seems you used fixedValue boundary conditions for nut, epsilon and k. I would rather advise you to use fixedValue for k and epsilon and calculated for nut, or to use wall functions for all three quantities. Otherwise you get some inconsistency in the boundary conditions. And how do you specify the pressure gradient which drives the flow? Is it specified in fvOptions? If so, can you post it too so I can have a look? |
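A minimal sketch of what the suggested nut boundary condition could look like in 0/nut (the patch name "bottomWall" is only an assumption, since the case files were not posted): Code:
boundaryField
{
    bottomWall
    {
        // consistent with fixedValue k and epsilon at the wall:
        // nut is computed from the turbulence model rather than prescribed
        type            calculated;
        value           uniform 0;
    }
}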
||
August 29, 2018, 13:30 |
|
#10 | ||
New Member
Join Date: Nov 2016
Posts: 15
Rep Power: 10 |
Quote:
Since I use a low-Re model, wall functions are not necessary. Quote:
Code:
fvVectorMatrix UEqn
(
    fvm::ddt(U)
  + fvm::div(phi, U)
  + turbulence->divDevReff(U)
 ==
    vector(1,0,0)*gradP
);

UEqn.relax();

if (momentumPredictor)
{
    solve(UEqn == -fvc::grad(p));
}
Code:
dimensionedScalar uTau = ReTau*nu/height;
dimensionedScalar gradP = pow(uTau,2)/height; |
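For reference, this forcing follows the usual channel-flow balance (assuming height is the channel half-height delta): Re_tau = uTau*delta/nu gives uTau = ReTau*nu/height, and balancing the wall shear stress against the driving force yields a mean pressure gradient per unit density of uTau^2/height, which is exactly what gradP encodes.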
|||
August 29, 2018, 16:39 |
|
#11 | |
New Member
Join Date: Nov 2016
Posts: 15
Rep Power: 10 |
Quote:
Nevertheless, after further and deeper digging in the forum, I finally found the solution to my problem in this thread. The problem is, as I had already figured out, the way OpenFOAM parallelizes: the simple fact that a patch can also lie on a processor boundary can cause problems when summing up values on this patch. See both links for further information. However, by "forcing" the decomposition so that the relevant boundary patches end up on one processor (with preservePatches in the decomposeParDict), this problem is avoided. The impact of a possibly bad decomposition, which could affect the runtime, is another matter for my case. |
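A minimal sketch of such a decomposeParDict (the keyword follows the standard top-level syntax; listing only the inlet patch is an assumption, and further patches that are summed over can be added the same way): Code:
numberOfSubdomains  8;

method              scotch;

// keep all faces of the listed patches together on one processor,
// so that patch sums are not split across processor boundaries
preservePatches     (inlet);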
||
|
|