|
August 23, 2009, 09:54 |
|
#21 | |
Senior Member
Alberto Passalacqua
Join Date: Mar 2009
Location: Ames, Iowa, United States
Posts: 1,912
Rep Power: 36 |
Quote:
Best,
__________________
Alberto Passalacqua GeekoCFD - A free distribution based on openSUSE 64 bit with CFD tools, including OpenFOAM. Available in both physical and virtual formats (current status: http://albertopassalacqua.com/?p=1541) OpenQBMM - An open-source implementation of quadrature-based moment methods. To obtain more accurate answers, please specify the version of OpenFOAM you are using. |
||
August 23, 2009, 10:15 |
|
#22 |
Senior Member
Sandy Lee
Join Date: Mar 2009
Posts: 213
Rep Power: 18 |
||
August 23, 2009, 10:30 |
|
#23 |
Senior Member
Alberto Passalacqua
Join Date: Mar 2009
Location: Ames, Iowa, United States
Posts: 1,912
Rep Power: 36 |
I'm not an advisor; that's not an easy job if you want to do it right.
I was very, very lucky to have good advisors during my studies, and now in my post-doc, though!
|
August 24, 2009, 07:45 |
|
#24 |
Senior Member
Sandy Lee
Join Date: Mar 2009
Posts: 213
Rep Power: 18 |
Hi lakeat, why do you need so many grid cells to simulate a cylinder flow? I once read a paper that used just 50,000 cells to simulate a hydrofoil with Kunz's cavitation model. How could they manage that? What do you think?
|
|
August 24, 2009, 07:54 |
|
#25 |
Senior Member
Alberto Passalacqua
Join Date: Mar 2009
Location: Ames, Iowa, United States
Posts: 1,912
Rep Power: 36 |
Hi Sandy,
I'm not sure how Lakeat is trying to simulate the flow, but if the Reynolds number is high enough (and it does not need to be very high), and he is doing LES, that number of cells does not seem huge to me. Keep in mind that in LES you must resolve scales until well inside the inertial subrange of the spectrum. Best,
|
August 24, 2009, 08:02 |
|
#26 |
Senior Member
Sandy Lee
Join Date: Mar 2009
Posts: 213
Rep Power: 18 |
WOW~, even so it still needs such a huge grid? If so, I should take a good rest rather than work so hard.
|
|
August 24, 2009, 08:08 |
|
#27 | |
Senior Member
Daniel WEI (老魏)
Join Date: Mar 2009
Location: Beijing, China
Posts: 689
Blog Entries: 9
Rep Power: 21 |
Quote:
I have never done an extensive survey, but I am not sure how well RANS would do for a cylinder flow. Will RANS give Cd and Cl accurately enough? In any case, I would not expect good results from a 2D simulation: the energy cascade is completely wrong in 2D. The Re (3900) is not high, but it already costs me a lot. This is LES. I am expecting a much larger mesh for my next case at Re = 140,000; the grid would be about 7 million cells. You know what, someone told me that for the "Bird's Nest" stadium in Beijing, a certain person used just 80,000 cells for the simulation, and I was shocked: how could he manage that? How could he capture the turbulence?
__________________
~ Daniel WEI ------------- Boeing Research & Technology - China Beijing, China |
||
August 24, 2009, 08:24 |
|
#28 | |||
Senior Member
Alberto Passalacqua
Join Date: Mar 2009
Location: Ames, Iowa, United States
Posts: 1,912
Rep Power: 36 |
Quote:
Quote:
Quote:
Best,
||||
August 24, 2009, 09:17 |
|
#29 |
Senior Member
Daniel WEI (老魏)
Join Date: Mar 2009
Location: Beijing, China
Posts: 689
Blog Entries: 9
Rep Power: 21 |
First, I want to say thank you, Alberto, and also thank you Sandy.
Okay, let me make myself clearer. The cylinder case mesh is an O-type mesh with Nx*Ny*Nz = 165*165*32, at Re = 3900, for which benchmark LES results have already been published and can easily be followed. To achieve Re = 3900 I simply set the inlet velocity to 1 m/s and the cylinder diameter to 1 m. The flow region is a circle 15 m in diameter, which means that, standing at the cylinder center, the distances to the inlet and to the outlet are both 7.5 m. The time step is set to 0.0025 s to keep the Courant number no larger than unity. Of course this is full LES: a wall-resolved, completely 3D simulation of a 2D geometry, as in a wind-tunnel section model. Periodic B.C.s are applied at the front and back; I do not use convectiveOutlet for the moment.

Then I need to consider:
1. To achieve the highest efficiency, how many CPUs do I need for this case?
2. To achieve the highest efficiency and accuracy, which solver should I use for p and which for U?

For the first, your experience is that 8-16 processors are enough, right? For the second, you recommend GAMG for p and smoothSolver for U, right?

Here are some findings, some of which give me a headache now.

No. 1: My teacher strongly doubts my simulation results; he says the simulation time is too long and unacceptable. Code:
Time = 397.925

Courant Number mean: 0.0201989 max: 0.821767
smoothSolver: Solving for Ux, Initial residual = 0.000263646, Final residual = 9.52965e-08, No Iterations 2
smoothSolver: Solving for Uy, Initial residual = 0.00125789, Final residual = 4.36729e-07, No Iterations 2
smoothSolver: Solving for Uz, Initial residual = 0.00347622, Final residual = 1.45703e-06, No Iterations 2
GAMG: Solving for p, Initial residual = 0.0120711, Final residual = 0.000402602, No Iterations 2
time step continuity errors : sum local = 2.96528e-09, global = 7.05016e-12, cumulative = -3.29373e-09
GAMG: Solving for p, Initial residual = 0.000504671, Final residual = 4.92814e-05, No Iterations 3
time step continuity errors : sum local = 3.62961e-10, global = 3.00537e-12, cumulative = -3.29072e-09
ExecutionTime = 164607 s  ClockTime = 167103 s

Time = 397.927

Courant Number mean: 0.0202005 max: 0.821096
smoothSolver: Solving for Ux, Initial residual = 0.000263663, Final residual = 9.53374e-08, No Iterations 2
smoothSolver: Solving for Uy, Initial residual = 0.00125653, Final residual = 4.36351e-07, No Iterations 2
smoothSolver: Solving for Uz, Initial residual = 0.00347678, Final residual = 1.45956e-06, No Iterations 2
GAMG: Solving for p, Initial residual = 0.0120541, Final residual = 0.000401538, No Iterations 2
time step continuity errors : sum local = 2.95737e-09, global = 6.11715e-12, cumulative = -3.2846e-09
GAMG: Solving for p, Initial residual = 0.000502906, Final residual = 4.82624e-05, No Iterations 3
time step continuity errors : sum local = 3.5545e-10, global = 2.07349e-12, cumulative = -3.28253e-09
ExecutionTime = 164610 s  ClockTime = 167106 s

No. 2: Based on your experience, can you judge whether these figures are correct? (I used PyFoam to produce them.) I mean the time spent: Graph3-m.jpg Graph4-m.jpg. Note also the question in the picture. Do you also run into unstable simulation time (i.e. ExecutionTime per step) since the flow is unsteady? Waiting for you,
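As a quick sanity check on the time-step choice described above, the Courant number can be estimated directly (a sketch; the cell size below is a hypothetical value chosen for illustration, not taken from the thread):

```python
# Courant number Co = U * dt / dx for the stated settings.
U = 1.0          # inlet velocity [m/s]
dt = 0.0025      # time step [s]
dx = 0.003       # HYPOTHETICAL smallest cell size [m], illustration only
Co = U * dt / dx
print(f"Co ~ {Co:.2f}")  # below unity, as the case setup requires
```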
|
August 24, 2009, 10:23 |
|
#30 | ||||
Senior Member
Alberto Passalacqua
Join Date: Mar 2009
Location: Ames, Iowa, United States
Posts: 1,912
Rep Power: 36 |
Quote:
In addition, what is your initial condition? If you use a uniform, unperturbed condition, it is going to take a long time to actually develop turbulence structures. Eugene de Villiers published a code to initialize a perturbed flow in a cylinder: search the forum for it. Moreover, the length of the system has to be large enough not to feel the effect of the periodic conditions (I would say this was checked in the original work). Quote:
Quote:
Quote:
In other words, you initialize a perturbed flow as said above, you run until the flow is completely developed (and this takes time, a lot!), and then you reset the averages and start averaging for a sufficient number of characteristic times to get your statistics. What does your teacher take as a reference to say the computational time is unacceptable? P.S. What are you using as pressure residual? Best, A.
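To put the averaging requirement in numbers, a common rule of thumb is to average over many shedding periods (a sketch; the Strouhal number is a typical literature value for a circular cylinder and the number of periods is an assumed figure, neither is from the thread):

```python
# Estimate of the averaging window for cylinder-flow statistics.
U, D = 1.0, 1.0        # inlet velocity [m/s] and diameter [m] from the case
St = 0.21              # typical Strouhal number for a circular cylinder
T_shed = D / (St * U)  # one vortex-shedding period [s]
n_periods = 50         # ASSUMED number of periods for converged statistics
print(f"shedding period ~ {T_shed:.2f} s, "
      f"average over ~ {T_shed * n_periods:.0f} s of simulated time")
```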
|||||
August 24, 2009, 10:26 |
|
#31 |
Senior Member
Sandy Lee
Join Date: Mar 2009
Posts: 213
Rep Power: 18 |
I would never choose GAMG in parallel, because it is so complex. The U equation does not cost much CPU time, whichever solver is chosen. The key is always the p equation.
|
|
August 24, 2009, 13:02 |
|
#32 |
Senior Member
Alberto Passalacqua
Join Date: Mar 2009
Location: Ames, Iowa, United States
Posts: 1,912
Rep Power: 36 |
Are you stating GAMG solvers are suitable only for serial calculations? I would not think so.
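For concreteness, a GAMG entry for p in fvSolution typically looks like the sketch below (illustrative values only, not taken from this thread; keyword set as in the OpenFOAM 1.x-era dictionary format):

```
p
{
    solver                 GAMG;
    smoother               GaussSeidel;
    tolerance              1e-06;   // illustrative tolerance
    relTol                 0.01;    // illustrative relative tolerance
    nPreSweeps             0;
    nPostSweeps            2;
    cacheAgglomeration     on;
    agglomerator           faceAreaPair;
    nCellsInCoarsestLevel  100;
    mergeLevels            1;
}
```

The agglomeration settings control how the coarse levels are built; the smoother is applied on each level, which is also why GAMG works in parallel (each sweep only needs processor-boundary exchanges, like any other iterative solver).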
|
August 24, 2009, 22:00 |
|
#33 |
Member
Simon Lapointe
Join Date: May 2009
Location: Québec, Qc, Canada
Posts: 33
Rep Power: 17 |
Hi guys,
I feel there is some miscommunication between you. From what I understood of Lakeat's problem (correct me if I'm wrong), he is simulating the external flow around a cylinder, not an internal pipe flow. When he says he uses periodic conditions at front and back, I think he means at both ends of the cylinder.

Concerning Lakeat's questions:

1) Many factors can affect your parallel performance. First, it depends on the computer you're using. A poor connection between the computing nodes can really slow down the calculation, causing a big difference between ExecutionTime and ClockTime. Second, if the connection is good, I've found that the OpenFOAM compilation can greatly influence the computational time. Are you using a pre-compiled OpenFOAM? If so, I strongly suggest you compile OpenFOAM locally on the computer you're using for your parallel computations, preferably with the local compiler (such as an Intel compiler). Concerning the decomposition, a "simple" decomposition can be used on simple geometries. If you aren't sure how to decompose the domain, you should use METIS. From my personal experience with unsteady simulations of external flows, I've seen good speedup (linear or better) with as few as 10-15k cells per processor. But that depends on the computer and the case.

2) I've used the GAMG solver for the pressure equation in many parallel cases and I think it performs well, generally faster than PCG. I don't agree with Sandy: how does the fact that it is more complex make it slower? Did you compare it with other solvers in parallel cases?

Hope this helps |
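Simon's cells-per-processor figures are easy to check against Daniel's mesh (a sketch; just arithmetic on the 165*165*32 O-mesh described earlier in the thread):

```python
# Cells per processor for a few decomposition sizes.
n_cells = 165 * 165 * 32   # Daniel's O-mesh: 871,200 cells (he quotes ~860k)
for n_procs in (8, 16, 32, 64):
    print(f"{n_procs:3d} procs -> {n_cells // n_procs} cells each")
# At 10-15k cells/processor (Simon's figure) this mesh would justify
# roughly 60 processors; at 40-80k (Alberto's figure), more like 11-22.
```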
|
August 24, 2009, 22:53 |
|
#34 | ||
Senior Member
Alberto Passalacqua
Join Date: Mar 2009
Location: Ames, Iowa, United States
Posts: 1,912
Rep Power: 36 |
Quote:
About the domain decomposition, in the case of a flow around a cylinder, I agree with Simon when he suggests using METIS, since the decomposition becomes complicated to do by hand. I'm a bit skeptical about 10,000 cells/processor. In my experience a good trade-off is something between 40,000 and 80,000 cells per processor (meaning per core), especially if you are not in a hurry to get the results and you share resources with others ;-) Quote:
Best,
|||
August 24, 2009, 22:54 |
|
#35 | |||||||||||
Senior Member
Daniel WEI (老魏)
Join Date: Mar 2009
Location: Beijing, China
Posts: 689
Blog Entries: 9
Rep Power: 21 |
Quote:
Quote:
You can see from my last post: Quote:
And 1.516% is not that big, right? Quote:
Will different MPI implementations (OpenMPI, MPICH, MVAPICH) make a great difference in speed? Say, comparing decomposing the case across several nodes versus using just one node. And I remember the wiki says: Quote:
Quote:
Quote:
simple one: Quote:
Quote:
Quote:
And for a first try with my cylinder case mesh: 860000/15000 ≈ 57 processors?! Is this what you meant? Quote:
P.S. Please call me Daniel... Thank you all.
||||||||||||
August 24, 2009, 23:14 |
|
#36 | ||
Senior Member
Alberto Passalacqua
Join Date: Mar 2009
Location: Ames, Iowa, United States
Posts: 1,912
Rep Power: 36 |
Hi,
if you do not use a hardware-optimized implementation of the parallel libraries (HP, Intel, ...), you won't notice much difference switching between OpenMPI (which comes from LAM) and MPICH generic libraries. Performance mainly depends on how you compile them and your code. Quote:
Quote:
Best,
|||
August 24, 2009, 23:37 |
|
#37 | |||
Member
Simon Lapointe
Join Date: May 2009
Location: Québec, Qc, Canada
Posts: 33
Rep Power: 17 |
Quote:
Concerning the MPI implementations I'm not so sure, since with OpenFOAM I've only used OpenMPI. The best way to answer your question about the number of nodes is to try different combinations of nodes and processors per node. The best ratio will depend on the connection speed between the nodes, the connection speed between the processors, and the load on the processors of the different nodes you're using. Quote:
Which compiler did you use? Are there different compilers available? You could try using a different one. Quote:
Yes, this is what I meant. I've split meshes of 750k to 1M cells across 48 or 64 processors and the speedup was very good. But this is very dependent on the computer used and on the case, so I can't promise you anything... Also, as Alberto said, unless the usage of your system is low, it is much more reasonable to use a smaller number of processors (maybe 50k cells/processor). This way you'll still get decent speed, and you can keep the higher processor counts for cases that really need them. |
||||
August 25, 2009, 00:21 |
|
#38 | |
Senior Member
Sandy Lee
Join Date: Mar 2009
Posts: 213
Rep Power: 18 |
Quote:
In addition, maybe I will try GAMG for the p equation. In fact, GAMG is solved by ICCG or block-ICCG on the coarse grids (why not BiCG? Are all the matrices symmetric on the coarse grids? Who can explain it?), and maybe by Gauss-Seidel on the fine grid? So it should not be hard to understand why this method may be faster than PCG. WOW~, really? GAMG can only be used to solve symmetric matrices, namely the p equation? I just found that out! If so, what kind of multigrid method can be used to solve the asymmetric matrices in OpenFOAM? @Sorry Daniel, I actually know nothing about parallel computing. |
||
August 25, 2009, 05:13 |
|
#39 |
Senior Member
Fabian Braennstroem
Join Date: Mar 2009
Posts: 407
Rep Power: 19 |
Hi Daniel,
another way to accelerate flow development is to map the fluctuations of a 'boxTurb'-generated velocity field onto your initial RANS velocities. Fabian |
|
August 25, 2009, 08:51 |
|
#40 | |
Senior Member
Alberto Passalacqua
Join Date: Mar 2009
Location: Ames, Iowa, United States
Posts: 1,912
Rep Power: 36 |
Quote:
Best,
||
|
|