|
September 9, 2009, 23:31 |
About chtMultiRegionFoam in parallel (v1.6)
|
#1 |
New Member
Zheng.Zhi
Join Date: Jul 2009
Location: LanZhou China
Posts: 10
Rep Power: 17 |
When I use chtMultiRegionFoam to solve the multiRegionHeater case in parallel, I want to set the kqRWallFunction and epsilonWallFunction boundary conditions on the topAir_to_heater patch. But I get an error saying that topAir_to_heater must be of type wall instead of directMappedWall. Is there any way to solve this problem?
Thanks! |
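(For context, the entries being attempted would look roughly like the following in the topAir region's k and epsilon field files; the patch name and boundary-condition types are taken from the post above, while the uniform values are illustrative placeholders only.)

    // in the topAir region's k field
    topAir_to_heater
    {
        type            kqRWallFunction;
        value           uniform 0.1;     // placeholder value
    }

    // in the topAir region's epsilon field
    topAir_to_heater
    {
        type            epsilonWallFunction;
        value           uniform 0.01;    // placeholder value
    }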
|
September 10, 2009, 05:37 |
|
#2 |
Senior Member
Mattijs Janssens
Join Date: Mar 2009
Posts: 1,419
Rep Power: 26 |
Do you get this error running non-parallel as well?
|
|
September 10, 2009, 06:14 |
|
#3 |
New Member
Zheng.Zhi
Join Date: Jul 2009
Location: LanZhou China
Posts: 10
Rep Power: 17 |
Yes. Maybe the problem is that the wall functions require a wallFvPatch, but directMappedWall is only a wallPolyPatch?
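(The wall-function boundary conditions do test the finite-volume patch type. A rough sketch of that check, reconstructed from memory of the 1.6-era sources rather than quoted verbatim:)

    // Sketch: constructing e.g. epsilonWallFunctionFvPatchScalarField runs a
    // check along these lines; if the fvPatch is not derived from wallFvPatch,
    // it aborts with the "must be wall" message reported above.
    void epsilonWallFunctionFvPatchScalarField::checkType()
    {
        if (!isA<wallFvPatch>(patch()))
        {
            FatalErrorIn("epsilonWallFunctionFvPatchScalarField::checkType()")
                << "Patch type for patch " << patch().name()
                << " must be wall, current patch type is " << patch().type()
                << abort(FatalError);
        }
    }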
|
|
September 10, 2009, 08:25 |
|
#4 |
Senior Member
Mattijs Janssens
Join Date: Mar 2009
Posts: 1,419
Rep Power: 26 |
directMappedWall is derived from wallPolyPatch. The problem was that the finite volume equivalent wasn't derived from wallFvPatch. I pushed a fix to 1.6.x. Give it a go.
Thanks for reporting. |
|
September 15, 2009, 07:00 |
|
#5 |
New Member
Xinyuan FAN
Join Date: Sep 2009
Location: Beijing
Posts: 13
Rep Power: 17 |
Here is another problem when I tried 1.6.x:

Solving for fluid region bottomAir
diagonal: Solving for rho, Initial residual = 0, Final residual = 0, No Iterations 0
DILUPBiCG: Solving for Ux, Initial residual = 2.211858e-16, Final residual = 2.211858e-16, No Iterations 0
DILUPBiCG: Solving for Uy, Initial residual = 1, Final residual = 6.771384e-09, No Iterations 22
DILUPBiCG: Solving for Uz, Initial residual = 1.036037e-15, Final residual = 1.036037e-15, No Iterations 0
DILUPBiCG: Solving for h, Initial residual = 0.6044857, Final residual = 8.30665e-09, No Iterations 40
Min/max T:300 300
GAMG: Solving for p, Initial residual = 1, Final residual = 0.005459487, No Iterations 2
diagonal: Solving for rho, Initial residual = 0, Final residual = 0, No Iterations 0
time step continuity errors (bottomAir): sum local = 0.001237545, global = 4.109187e-06, cumulative = 4.109187e-06
GAMG: Solving for p, Initial residual = 0.6006277, Final residual = 6.905954e-09, No Iterations 27
diagonal: Solving for rho, Initial residual = 0, Final residual = 0, No Iterations 0
time step continuity errors (bottomAir): sum local = 2.018032e-09, global = -4.377309e-11, cumulative = 4.109143e-06
#0 Foam::error::printStack(Foam::Ostream&) in "/home/fxy/OpenFOAM/OpenFOAM-1.6.x/lib/linux64GccDPOpt/libOpenFOAM.so"
#1 Foam::sigFpe::sigFpeHandler(int) in "/home/fxy/OpenFOAM/OpenFOAM-1.6.x/lib/linux64GccDPOpt/libOpenFOAM.so"
#2 ?? in "/lib64/libc.so.6"
#3 Foam::compressible::RASModels::epsilonWallFunctionFvPatchScalarField::updateCoeffs() in "/home/fxy/OpenFOAM/OpenFOAM-1.6.x/lib/linux64GccDPOpt/libcompressibleRASModels.so"
#4 Foam::compressible::RASModels::kEpsilon::correct() in "/home/fxy/OpenFOAM/OpenFOAM-1.6.x/lib/linux64GccDPOpt/libcompressibleRASModels.so"
#5 main in "/home/fxy/OpenFOAM/OpenFOAM-1.6.x/applications/bin/linux64GccDPOpt/chtMultiRegionFoam"
#6 __libc_start_main in "/lib64/libc.so.6"
#7 _start at /usr/src/packages/BUILD/glibc-2.9/csu/../sysdeps/x86_64/elf/start.S:116
Floating point exception

Then I checked epsilonWallFunctionFvPatchScalarField and found calculations like this:

    const scalarField& y = rasModel.y()[patch().index()];

    forAll(nutw, faceI)
    {
        ...
        epsilon[faceCellI] = Cmu75*pow(k[faceCellI], 1.5)/(kappa_*y[faceI]);
        ...
    }

So I modified chtMultiRegionFoam to print rasModel.y() for patches like bottomAir_to_heater and found that all the values are 0, which causes the floating point exception above. Is there any way to solve this problem?
Thanks. |
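(A minimal sketch of the kind of check described above: a fragment dropped into the solver's fluid-region loop, assuming the region mesh and its RAS model are available as mesh and rasModel; the variable names are illustrative.)

    // Print the near-wall distance the wall functions will divide by, per wall
    // patch, to spot patches where it comes out as zero.
    // (needs wallFvPatch.H included at the top of the solver file)
    forAll(mesh.boundary(), patchI)
    {
        const fvPatch& curPatch = mesh.boundary()[patchI];

        if (isA<wallFvPatch>(curPatch))
        {
            const scalarField& yw = rasModel.y()[patchI];

            Info<< "Patch " << curPatch.name()
                << " min/max y = " << min(yw) << " / " << max(yw) << endl;
        }
    }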
|
September 15, 2009, 11:47 |
|
#6 |
Senior Member
Mattijs Janssens
Join Date: Mar 2009
Posts: 1,419
Rep Power: 26 |
I've just pushed a lot of changes to 1.6.x to do with wall-type recognition. Could you try again?
Thanks, Mattijs |
|
September 15, 2009, 22:50 |
|
#7 |
New Member
Xinyuan FAN
Join Date: Sep 2009
Location: Beijing
Posts: 13
Rep Power: 17 |
OK, I will report back once I have tried it again.
|
|
September 16, 2009, 02:46 |
|
#8 |
New Member
Xinyuan FAN
Join Date: Sep 2009
Location: Beijing
Posts: 13
Rep Power: 17 |
The problem has been solved. The multiRegionHeater case now runs both in serial and in parallel. Thank you for the update.
|
|
October 14, 2009, 04:39 |
|
#9 |
New Member
Carel
Join Date: Mar 2009
Posts: 5
Rep Power: 17 |
Hi,
Can someone perhaps post a link to the tutorial for chtMultiRegionFoam in 1.6? I tried the one from 1.5 but get the following error when I run the solver:

Selecting thermodynamics package hPsiThermo<pureMixture<constTransport<specieThermo<hConstThermo<perfectGas>>>>>
Not Implemented
Trying to construct an genericFvPatchField on patch bottomAir_to_rightSolid of field h
#0 Foam::error::printStack(Foam::Ostream&) at /opt/OpenFOAM/r1.6/debug/OpenFOAM-1.6/src/OSspecific/POSIX/printStack.C:203

I read on another thread that this may be due to the boundary type. In this case the type is solidWallTemperatureCoupled. Was this changed in 1.6?

Regards
Carel |
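(For anyone hitting the same message: the coupled temperature boundary type was reworked between 1.5 and 1.6. A sketch of how such an entry appears in the 1.6-era multiRegionHeater setup, with type and keyword names reproduced from memory and the value purely illustrative; the tutorial's changeDictionaryDict files are the authoritative reference.)

    bottomAir_to_rightSolid
    {
        type                solidWallMixedTemperatureCoupled;
        neighbourFieldName  T;            // field to couple to in the neighbouring region
        K                   K;            // name of the conductivity field
        value               uniform 300;  // illustrative initial value
    }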
|
October 14, 2009, 09:20 |
|
#10 |
New Member
Carel
Join Date: Mar 2009
Posts: 5
Rep Power: 17 |
...never mind, I just did not read the instructions!
It is working now. |
|
November 16, 2009, 13:46 |
|
#11 |
Member
Tobias Holzinger
Join Date: Mar 2009
Location: Munich, Germany
Posts: 46
Rep Power: 17 |
Hello Mattijs,
Are there other BCs besides directMappedWall that make two regions interact?

Thanks
Tobias |
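(For reference, the coupling itself is declared at the mesh level: a sketch of a directMapped wall entry in the region's polyMesh/boundary file, with the face counts and sampling settings as illustrative placeholders only.)

    topAir_to_heater
    {
        type            directMappedWall;
        nFaces          96;                  // placeholder
        startFace       3036;                // placeholder
        sampleMode      nearestPatchFace;
        sampleRegion    heater;
        samplePatch     heater_to_topAir;
        offset          (0 0 0);
    }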
|
|
|