[snappyHexMesh] SnappyHex on a decomposed case |

March 13, 2013, 04:06 | #1
SnappyHex on a decomposed case
Member
Luca
Join Date: Mar 2013
Posts: 68
Rep Power: 13
Hi everybody,
this is my first post, so let me introduce myself: I'm an engineering student in the last year of the master's degree at the Politecnico di Milano. I am having some trouble running snappyHexMesh in parallel, since I need to use both snappyHexMesh and decomposePar.

If I run snappyHexMesh first and then decomposePar there is no problem: in the boundary conditions of p, U, etc. I have already inserted the XXX_patch0 entries that snappyHexMesh will create, so after the mesh is snapped I can decompose it and solve.

If instead I want to run snappyHexMesh on multiple cores, I have to decompose the case first, but then the p, U, etc. files in the processorX folders "lose" the XXX_patch0 boundary conditions. So when I run snappyHexMesh and then launch the case, it gives me the error

keyword spires_patch0 is undefined in dictionary "/[...]/B0fine/processor0/0/p::boundaryField"

even though spires_patch0 is defined in "/[...]/B0fine/0/p::boundaryField".

How can I tell decomposePar to keep those boundary conditions even if it doesn't see them in the blockMesh boundary file?

Thank you very much.

Last edited by Pj.; March 14, 2013 at 23:34.
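
For context, this is the kind of entry that goes missing from the processor field files. A minimal sketch of the 0/p boundaryField containing the spires_patch0 entry from the error message; the other patch names and the zeroGradient choice are only placeholders for illustration:

Code:
boundaryField
{
    inlet                               // placeholder blockMesh patch
    {
        type            fixedValue;
        value           uniform 0;
    }
    outlet                              // placeholder blockMesh patch
    {
        type            zeroGradient;
    }
    spires_patch0                       // patch that snappyHexMesh creates from the STL
    {
        type            zeroGradient;   // assumed wall-type condition for p
    }
}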

March 13, 2013, 06:47 | #2
New Member
Join Date: Feb 2012
Posts: 6
Rep Power: 14
Hi Luca,
the problem is that after meshing with snappyHexMesh there is no information about your STL patches in the processor directories (processor0, processor1, ...) yet. What you can do is reconstruct the mesh back into one master mesh, remove the processor folders and decompose again. With these commands it should run without any problems:

blockMesh
decomposePar
mpirun --hostfile YOURHOSTFILE -np ... snappyHexMesh -overwrite -parallel
reconstructParMesh -constant
rm -r processor*
decomposePar
mpirun --hostfile YOURHOSTFILE -np ... simpleFoam -parallel

Best regards,
Chris
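
For completeness, both decomposePar calls in that sequence read system/decomposeParDict. A minimal sketch, assuming four subdomains and the scotch method (adjust numberOfSubdomains to whatever you pass to mpirun -np):

Code:
FoamFile
{
    version     2.0;
    format      ascii;
    class       dictionary;
    object      decomposeParDict;
}

numberOfSubdomains  4;      // assumed here; must match the -np value given to mpirun

method              scotch; // needs no coefficient sub-dictionary; simple or hierarchical also work

Both decomposePar runs and both mpirun calls have to use the same number of subdomains.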

March 13, 2013, 09:02 | #3
Member
Luca
Join Date: Mar 2013
Posts: 68
Rep Power: 13
I'll try this tomorrow, but I don't understand: if the boundary conditions on XXX_patch0 get lost during the first decomposePar, at which point in the process you showed me do they get recovered? Is it the -constant option of reconstructParMesh?

March 13, 2013, 12:46 | #4
New Member
Join Date: Feb 2012
Posts: 6
Rep Power: 14
The mesh that gets decomposed first is the one created by blockMesh, so no patches exist other than those specified in your boundary dictionary. Consequently, during decomposePar on this background mesh there is no need to copy any initial boundary information for patches that do not exist yet.
After snappyHexMesh the STL surface is included as a patch, but still without boundary information in 0 for the new patch.
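
In other words, after snappyHexMesh the new patch shows up in constant/polyMesh/boundary (and in each processor*/constant/polyMesh/boundary) roughly like the sketch below, while 0/p and 0/U still have no matching entry, which is exactly what the solver then complains about. The patch type and the face counts here are placeholders:

Code:
spires_patch0
{
    type            wall;       // assumed; normally set via patchInfo in snappyHexMeshDict
    nFaces          1240;       // placeholder face count
    startFace       84210;      // placeholder start index
}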

March 13, 2013, 23:04 | #5
Member
Luca
Join Date: Mar 2013
Posts: 68
Rep Power: 13
Oh... so during decomposePar it checks whether the boundary conditions in 0/* are needed; since it doesn't see any XXX_patchX in the mesh, it discards those boundary conditions.
The second time it reads the boundary conditions in 0/* again, and since those patches now exist it keeps them. Am I correct? Anyway, I'm trying it right now. We'll see...
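
For illustration, this is roughly what processor0/0/p tends to look like after the first decomposePar (patch names and values are placeholders): only the original blockMesh patches plus the inter-processor boundaries are written, and spires_patch0 is simply absent, which is why the second decomposePar after reconstructParMesh is needed.

Code:
boundaryField
{
    inlet                               // placeholder blockMesh patch
    {
        type            fixedValue;
        value           uniform 0;
    }
    procBoundary0to1                    // inter-processor patch added by decomposePar
    {
        type            processor;
        value           uniform 0;
    }
    // no spires_patch0 entry: the patch does not exist in the blockMesh-only mesh
}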

March 14, 2013, 23:34 | #6
Member
Luca
Join Date: Mar 2013
Posts: 68
Rep Power: 13
I did what you said and it worked perfectly.
I have to say that I'm a bit surprised that decomposeParDict has no option to keep all the boundary conditions that appear in the fields of the 0 folder, even those that refer to patches that are not (yet) in the constant/polyMesh/boundary file. That would save all this kind of workaround. Anyway, the problem has been solved, so everything is fine.
Thank you very much for the very fast help,
Bye