|
April 22, 2016, 11:35 |
Caelus v6.04 released
|
#1 |
Senior Member
Chris Sideroff
Join Date: Mar 2009
Location: Ottawa, ON, CAN
Posts: 434
Rep Power: 22 |
The next major version of Caelus, 6.04, was released today. We'll continue to stick to twice-yearly releases, with 6.10 coming in October. Much of the effort in this release went into overhauling the interpolation schemes; the release notes linked below describe the changes in detail.
The other news is that there is now a publicly available source repository, to enable better community collaboration: https://bitbucket.org/appliedccm/caelus-contributors

A release notes PDF with fuller descriptions is available on the download page: http://www.caelus-cml.com/download/

If you're not already familiar with Caelus, you can read more about it on the website. It's free and open source. The file formats are compatible with OpenFOAM, but there have been many improvements, and Caelus is now moving independently in its own direction.

We are happy to answer questions, technical or otherwise, by email at caelus@appliedccm.com, and you can also follow Caelus on Twitter: @caelus_cml. Applied CCM is currently the maintainer and main developer of Caelus. If you're interested in porting or developing your own application for Caelus, we'll be happy to provide guidance. If you've developed a solver or model that you would like to include in Caelus, check out the contributors repository. Any and all external contributors' copyright will be kept in their contributed source files.
|
April 26, 2016, 20:36 |
|
#2 |
Senior Member
Pei-Ying Hsieh
Join Date: Mar 2009
Posts: 334
Rep Power: 18 |
Hi,
Will the Linux binary version work on openSUSE 13.2?

Pei-Ying
|
April 27, 2016, 10:56 |
|
#3 |
Senior Member
Pei-Ying Hsieh
Join Date: Mar 2009
Posts: 334
Rep Power: 18 |
I installed Caelus 6.04 on openSUSE 13.2. The serial solver ran, but I got errors when I tried to run in parallel.
I am wondering how I can get Caelus to use the system OpenMPI. It looks like Caelus is looking for openmpi-1.6.5 under /opt/caelus/caelus-4.10. Any recommendations would be appreciated.

Pei-Ying
|
April 27, 2016, 16:19 |
|
#4 |
Senior Member
Chris Sideroff
Join Date: Mar 2009
Location: Ottawa, ON, CAN
Posts: 434
Rep Power: 22 |
Pei-Ying,
We don't build or test on SUSE, but I have helped someone previously with issues running Caelus on SUSE. To your issue: it looks like you kept the default OpenMPI (the one Caelus provides) when you installed Caelus, rather than configuring it to use the system MPI. Caelus is complaining because it's looking for the provided MPI libraries while you're using the system MPI executables. To fix it, follow these steps.

First, in "lib/python2.6/Caelus/conf.py", change the MPI settings to the following:
Code:
MPI_BIN = '/usr/lib64/mpi/gcc/openmpi/bin'
MPI_LIB = '/usr/lib64/mpi/gcc/openmpi/lib64'
MPI_INC = '/usr/lib64/mpi/gcc/openmpi/include'
Next, set the matching environment variables in your shell:
Code:
export MPI_INC=/usr/lib64/mpi/gcc/openmpi/include
export MPI_LIB=/usr/lib64/mpi/gcc/openmpi/lib64
export MPI_LIB_NAME=mpi
Then create a symlink so that the libmpi.so.1 name the Caelus binaries expect resolves to the system library:
Code:
sudo ln -s /usr/lib64/mpi/gcc/openmpi/lib64/libmpi.so /usr/lib64/mpi/gcc/openmpi/lib64/libmpi.so.1
Finally, set the parallel run template so mpiexec is invoked with the appropriate options:
Code:
parRunTemplate "mpiexec -n %(NPROCS)d %(PAROPTS)s --mca btl sm,self %(APPLICATION)s %(ARGS)s -parallel";
Perhaps we'll look into including a SUSE distribution as a future supported platform.

-Chris
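A quick sanity check after these changes (a sketch, not from the original post; it assumes the Caelus environment is loaded so a solver such as vofLPTSolver is on the PATH):
Code:
# Confirm the system mpiexec is the one found on the PATH
which mpiexec
mpiexec --version

# Confirm a Caelus binary now resolves libmpi via the new symlink
ldd $(which vofLPTSolver) | grep libmpi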
|
April 28, 2016, 08:47 |
|
#5 |
Senior Member
Pei-Ying Hsieh
Join Date: Mar 2009
Posts: 334
Rep Power: 18 |
Thanks a lot, Chris!
Everything works perfectly on openSUSE 13.2 now. Now I just need to figure out how to compile a new solver using SCons.

Pei-Ying
|
April 28, 2016, 09:23 |
|
#6 |
Senior Member
Chris Sideroff
Join Date: Mar 2009
Location: Ottawa, ON, CAN
Posts: 434
Rep Power: 22 |
Pei-Ying,
Glad to hear it's working. I replied to your email about using SCons.

-Chris
|
May 11, 2016, 06:18 |
DNS Turbulence modelling in vofLPTSolver
|
#7 |
New Member
|
Hi guys,
I am running the vofLPTSolver case in Caelus 6.04. It is basically a case simulating a number of Lagrangian bubble particles (Lagrangian particle tracking) in a continuous water phase (VOF) using an LES turbulence model. But I want to simulate the same test case with DNS instead of the LES approach. What modifications do I have to make in the existing '...tutorials/vofLPTSolver' directory, or in any other directories in Caelus?

Thanks in advance.

Regards,
Anirban
|
May 11, 2016, 18:25 |
|
#8 |
Senior Member
Chris Sideroff
Join Date: Mar 2009
Location: Ottawa, ON, CAN
Posts: 434
Rep Power: 22 |
Anirban,
No modifications to the solver are necessary. Running DNS means no explicit turbulence modelling (LES, RAS, etc.). Therefore, to disable the explicit LES model, change LESModel from Smagorinsky to laminar in the constant/LESProperties dictionary, as in the sketch below.

-Chris
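A minimal sketch of the change (only the one entry is shown; everything else in the tutorial's LESProperties stays as-is):
Code:
// constant/LESProperties
LESModel        laminar;    // was: Smagorinsky -- no explicit SGS model, i.e. DNS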
|
May 17, 2016, 11:01 |
|
#9 |
New Member
|
Hi Chris,
Greetings. I am running tutorials/vofLPTSolver in Caelus 6.04, changing values and variables in some files within this directory to simulate Lagrangian bubble particles rising in water with VOF free-surface capturing, using DNS only. But after seeing the value of alpha1 in the animation, I am confused about whether the injected particles are air bubbles or water. I describe two cases below with the details of the changes I made.

Case 1:
- $Caelus/../ACCM_bubbleCol3D/Allrun.py, line 69, omitting '-parallel': run = subprocess.Popen(['caelus.py', '-l', 'vofLPTSolver'], shell=pltfrm)
- $Caelus/../ACCM_bubbleCol3D/system/controlDict, line 22: endTime 10;

Case 2: same changes as in Case 1, plus:
- $Caelus/../ACCM_bubbleCol3D/system/controlDict, line 22: endTime 10; line 28: writeInterval 1;
- $Caelus/../ACCM_bubbleCol3D/bubbleCloudProperties, line 153, changing the threshold value from 0.5 to 0: threshold 0;
- $Caelus/../ACCM_bubbleCol3D/LESProperties, line 15, replacing Smagorinsky with laminar to switch on DNS as per your suggestion: LESModel laminar;

I am attaching animations of the two cases:
Case 1: https://www.youtube.com/watch?v=o1Uh...ature=youtu.be
Case 2: https://www.youtube.com/watch?v=WJ1r...ature=youtu.be
|
September 8, 2016, 03:01 |
vofLPTSolver: solid particle tracking? Works in parallel?
|
#10 |
Senior Member
vidyadhar
Join Date: Jul 2016
Posts: 138
Rep Power: 10 |
Hello Chris,
I want to work with vofLPTSolver to track solid particles in a liquid flow. I have a few basic questions:
1. Can this solver be run in parallel?
2. Does this solver treat the fluid phase as a continuum and the particles as individual Lagrangian entities?
3. Does this solver handle solid particles?

Request you to kindly let me know.

Thanks & Regards,
Vidyadhar
|
September 8, 2016, 15:46 |
|
#11 |
Senior Member
Chris Sideroff
Join Date: Mar 2009
Location: Ottawa, ON, CAN
Posts: 434
Rep Power: 22 |
Vidyadhar,
1. Yes, the solver is fully parallel, just like all the other solvers.
2. Yes, the fluid is a continuum and the particles are treated as discrete Lagrangian objects.
3. The particles' behaviour is defined by their density and size. With the current implementation the particles do not change size. Therefore, yes, you could specify particle properties for a solid material (see the sketch below).

Regards,
Chris
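For illustration, a hypothetical set of entries for solid particles (the constantProperties/rho0 naming is an assumption based on typical cloud-properties dictionaries and is not confirmed in this thread; the sizeDistribution layout follows the tutorial dictionary quoted later in this thread):
Code:
// constant/bubbleCloudProperties (sketch)
constantProperties
{
    rho0            2500;      // particle density [kg/m^3], e.g. glass beads instead of air
}

sizeDistribution
{
    type fixedValue;
    fixedValueDistribution
    {
        value       1e-04;     // fixed 100-micron diameter (illustrative)
    }
}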
|
September 19, 2016, 02:39 |
DPMFoam, MPPICFoam vs vofLPTSolver?
|
#12 |
Senior Member
vidyadhar
Join Date: Jul 2016
Posts: 138
Rep Power: 10 |
Hello Chris,
Thanks for the reply. Actually, I want to simulate Lagrangian particle flows (bubbles as well as solids) in a liquid. Can you clarify whether the DPMFoam or MPPICFoam solvers of OpenFOAM 4.0 can do this job, or whether they handle only solid particle tracking? I want to simulate bubble tracking as well. If I have to use Caelus, can you help me install it on Ubuntu 16.04, or can Caelus be installed only on Ubuntu 14.04?

Thanks
|
September 19, 2016, 07:51 |
Parallel command in Caelus 6.04?
|
#13 |
Senior Member
vidyadhar
Join Date: Jul 2016
Posts: 138
Rep Power: 10 |
Hi,
I just installed Caelus 6.04. How do I execute commands in the terminal to use this software? Also, how do I execute a solver in parallel, and which command should be used?

Thanks.
|
September 19, 2016, 07:59 |
Error while executing vofLPTSolver in parallel
|
#14 |
Senior Member
vidyadhar
Join Date: Jul 2016
Posts: 138
Rep Power: 10 |
Hi Chris,
I have installed Caelus 6.04. I tried running vofLPTSolver with the following command, but I got the error messages shown below. Can you please help me run this solver in parallel? Thank you.

caelus.py vofLPTSolver -parallel

-------------------------------------------------------
Primary job terminated normally, but 1 process returned a non-zero exit code. Per user-direction, the job has been aborted.
-------------------------------------------------------
*** An error occurred in MPI_Init
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL: your MPI job will now abort
[iitmech5:17307] Local abort before MPI_INIT completed successfully; not able to aggregate error messages, and not able to guarantee that all other processes were killed!
--------------------------------------------------------------------------
Sorry! You were supposed to get help about:
    opal_init:startup:internal-failure
But I couldn't open the help file:
    /opt/caelus/caelus-4.10/external/linux/openmpi-1.6.5/share/openmpi/help-opal-runtime.txt: No such file or directory. Sorry!
--------------------------------------------------------------------------
[iitmech5:17307] [[INVALID],INVALID] ORTE_ERROR_LOG: Error in file runtime/orte_init.c at line 79
--------------------------------------------------------------------------
Sorry! You were supposed to get help about:
    mpi_init:startup:internal-failure
But I couldn't open the help file:
    /opt/caelus/caelus-4.10/external/linux/openmpi-1.6.5/share/openmpi/help-mpi-runtime: No such file or directory. Sorry!
--------------------------------------------------------------------------
|
September 19, 2016, 12:08 |
|
#15 |
Senior Member
Chris Sideroff
Join Date: Mar 2009
Location: Ottawa, ON, CAN
Posts: 434
Rep Power: 22 |
Vidyadhar,
Let me try to catch up with your questions. Let's diagnose the runtime issues first. It appears you are using Ubuntu Linux 16.04; please confirm this. Which Caelus package did you install, the binary installer or the source-only installer? If it's the latter, you will need to compile it using the provided BuildCaelus.py script in the top-level directory. If you're confident the binaries are there, can you try running the vofLPTSolver tutorial with the Allrun.py script (a sketch follows below)? Let's start there and I'll answer the earlier questions after.

-Chris
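For reference, a sketch of those two steps (the install prefix is taken from the log messages in this thread; the tutorial path is an assumption, so adjust it to your layout):
Code:
# Source-only package: compile first
cd /opt/caelus/caelus-4.10
python BuildCaelus.py

# Then run the tutorial through its run script
cd tutorials/vofLPTSolver/ACCM_bubbleCol3D   # path assumed
python Allrun.py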
|
September 19, 2016, 14:06 |
|
#16 |
Senior Member
vidyadhar
Join Date: Jul 2016
Posts: 138
Rep Power: 10 |
Hello Chris,
Yes, I am using Ubuntu 16.04. I installed the binary installer (Caelus 6.04 source code and binary installer). I opened the Allrun.py file in the ACCM_bubbleCol3D case of vofLPTSolver and followed the operations therein, executing blockMesh, setSet, createPatch, setFields, decomposePar -force, and vofLPTSolver. Finally, a vofLPTSolver.log file was created in the ACCM_bubbleCol3D directory with the following messages:

Sorry! You were supposed to get help about:
    opal_init:startup:internal-failure
But I couldn't open the help file:
    /opt/caelus/caelus-4.10/external/linux/openmpi-1.6.5/share/openmpi/help-opal-runtime.txt: No such file or directory. Sorry!
--------------------------------------------------------------------------
[iitmech5:09519] [[INVALID],INVALID] ORTE_ERROR_LOG: Error in file runtime/orte_init.c at line 79
--------------------------------------------------------------------------
Sorry! You were supposed to get help about:
    mpi_init:startup:internal-failure
But I couldn't open the help file:
    /opt/caelus/caelus-4.10/external/linux/openmpi-1.6.5/share/openmpi/help-mpi-runtime: No such file or directory. Sorry!
--------------------------------------------------------------------------
*** An error occurred in MPI_Init
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL: your MPI job will now abort
[iitmech5:9519] Local abort before MPI_INIT completed successfully; not able to aggregate error messages, and not able to guarantee that all other processes were killed!
(the same MPI_Init abort message repeats for processes 9520, 9521, 9522 and 9523)
-------------------------------------------------------
Primary job terminated normally, but 1 process returned a non-zero exit code. Per user-direction, the job has been aborted.
-------------------------------------------------------
--------------------------------------------------------------------------
Sorry! You were supposed to get help about:
    opal_init:startup:internal-failure
But I couldn't open the help file:
    /opt/caelus/caelus-4.10/external/linux/openmpi-1.6.5/share/openmpi/help-opal-runtime.txt: No such file or directory. Sorry!
--------------------------------------------------------------------------
[iitmech5:09526] [[INVALID],INVALID] ORTE_ERROR_LOG: Error in file runtime/orte_init.c at line 79
--------------------------------------------------------------------------
Sorry! You were supposed to get help about:
    mpi_init:startup:internal-failure
But I couldn't open the help file:
    /opt/caelus/caelus-4.10/external/linux/openmpi-1.6.5/share/openmpi/help-mpi-runtime: No such file or directory. Sorry!
--------------------------------------------------------------------------
*** An error occurred in MPI_Init
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL: your MPI job will now abort
[iitmech5:9526] Local abort before MPI_INIT completed successfully; not able to aggregate error messages, and not able to guarantee that all other processes were killed!
--------------------------------------------------------------------------
mpiexec detected that one or more processes exited with non-zero status, thus causing the job to be terminated. The first process to do so was:

    Process name: [[43169,1],1]
    Exit code: 1
--------------------------------------------------------------------------

Thank you for the patient reading!
|
September 19, 2016, 20:10 |
|
#17 |
Senior Member
Chris Sideroff
Join Date: Mar 2009
Location: Ottawa, ON, CAN
Posts: 434
Rep Power: 22 |
Are the other steps running OK? What do the other log files say? We need to find out whether one of the earlier steps is failing.
|
|
October 1, 2016, 14:15 |
Errors when boundary conditions are modified in vofLPTSolver
|
#18 |
Senior Member
vidyadhar
Join Date: Jul 2016
Posts: 138
Rep Power: 10 |
Hello Chris,
I have used vofLPTSolver successfully for the given tutorial case, ACCM_bubbleCol3D, where the fluid is at rest and the bubbles are moving. But I want to simulate the case where the fluid is flowing through the column (I have modified the column to be full of water). The changes I made are:

- In blockMeshDict: the bottom is changed from wall to patch, as I would like water to flow in through the bottom patch.
- In U: the bottom is set to fixedValue uniform (0 0.08 0), and the top to zeroGradient.
- In p_rgh: the top is set to fixedValue uniform 0.

(See the sketch of these entries at the end of this post.) I tried various boundary conditions for U and p_rgh, but I could only run until time = 3 s. After that, the Courant number grows enormously, the volume fraction alpha1 also becomes very large, and I get errors like these:

Courant Number mean: 2.387736012e+97 max: 1.998894153e+101
Interface Courant Number mean: 2.488915823e+96 max: 2.311900924e+100
Time = 3.81

Evolving bubbles

Solving 3-D cloud bubbleCloud
--> Cloud: bubbleCloud injector: model1
Added 2 new parcels
Cloud: bubbleCloud
    Current number of parcels = 2907
    Current mass in system = 0.0001668853125
    Linear momentum = (-nan -nan -nan)
    |Linear momentum| = -nan
    Linear kinetic energy = -nan
    Rotational kinetic energy = 0
    model1: number of parcels added = 9119
            mass introduced = 0.0005142614062
    Parcels absorbed into film = 0
    New film detached parcels = 0
    Parcel fate (number, mass)
    - escape = 147, 0
    - stick = 0, 0
MULES: Solving for alpha1
Phase-1 volume fraction = 1.774412665e+87 Min(alpha1) = -1.333103629e+102 Max(alpha1) = 2.357171278e+102
MULES: Solving for alpha1
Phase-1 volume fraction = -4.252960616e+259 Min(alpha1) = -5.124760501e+280 Max(alpha1) = 5.031106494e+280
PIMPLE: iteration 1
DILUPBiCG: Solving for Ux: solution singularity
DILUPBiCG: Solving for Uy: solution singularity
DILUPBiCG: Solving for Uz: solution singularity
GAMG: Solving for p_rgh, Initial residual = nan, Final residual = nan, No Iterations 1000
GAMG: Solving for p_rgh, Initial residual = nan, Final residual = nan, No Iterations 1000
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 4 in communicator MPI_COMM_WORLD with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes. You may or may not see output from other processes, depending on exactly when Open MPI kills them.
--------------------------------------------------------------------------
[2]
[2] --> CAELUS FATAL IO ERROR:
[2] wrong token type - expected Scalar, found on line 0 the word 'nan'
[2]
[2] file: /home/mech5/vofLPTSolver/vofLPTSolvertestedfor5s/ACCM_20processors-COMPLETEWATER_BC_MODIFIED/processor2/system/data::solverPerformance:_rgh at line 0.
[2]
[2] From function operator>>(Istream&, Scalar&)
[2] in file core/primitives/Scalar/doubleScalar/doubleScalar.cpp at line 98.
[2]
CAELUS parallel run exiting

May I know the correct boundary conditions for the flow of fluid along with the bubbles? Also, can you please brief me on p_rgh and how it differs from p?

Thanks & Regards,
vidyadhar
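For concreteness, a sketch of the boundary entries described above (only the modified patches are shown; the dictionary syntax follows the usual OpenFOAM-compatible format and all other entries are assumed to stay as in the tutorial):
Code:
// 0/U (sketch)
bottom
{
    type            fixedValue;
    value           uniform (0 0.08 0);
}
top
{
    type            zeroGradient;
}
Code:
// 0/p_rgh (sketch)
top
{
    type            fixedValue;
    value           uniform 0;
}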
|
October 2, 2016, 09:37 |
|
#19 |
Senior Member
vidyadhar
Join Date: Jul 2016
Posts: 138
Rep Power: 10 |
1. I tried using vofLPTSolver for bubbles and it was working fine. But I have a doubt regarding the Young's modulus and Poisson's ratio entries for bubbles in the bubbleCloudProperties file. Are they meant for bubbles only? Please clarify.

2. Now I would like to use the same solver to simulate solid particles. I have changed the density, but I am facing a problem with the viscosity of phase2 in the transportProperties file under the constant directory. Can you please let me know how to go about this?

Thank you.
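For reference, the block in question typically looks like this (a sketch of a standard two-phase transportProperties; the entry names and values are assumptions, not taken from the tutorial files):
Code:
// constant/transportProperties (sketch)
phase2
{
    transportModel  Newtonian;
    nu              nu [0 2 -1 0 0 0 0] 1.48e-05;   // kinematic viscosity of phase2
    rho             rho [1 -3 0 0 0 0 0] 1.2;       // density of phase2
}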
October 4, 2016, 09:27 |
|
#20 |
Senior Member
vidyadhar
Join Date: Jul 2016
Posts: 138
Rep Power: 10 |
Hello Chris,
I am using vofLPTSolver in Caelus 6.04. In the bubbleCloudProperties file, I would like to understand the meaning of the terms 'parcel' and 'particle'. Are they the same or different? Also, how do I calculate the number of parcels per second? Does this value depend on the run time of the simulation?

injectionModels
{
    model1
    {
        type                patchInjection;
        patchName           inlet;        // Name of patch
        SOI                 0;            // Start Of Injection
        flowRateProfile     constant 1;   // Flow rate profile relative to SOI
        massTotal           0.02025;      // Total mass injected over injection duration
        parcelBasisType     mass;         // How are the number of particles calculated
        duration            7;            // Duration of injection. NOTE: set to 1 for steady state
        U0                  (0 0.08 0);   // Initial parcel velocity
        parcelsPerSecond    3357;         // Number of parcels to introduce per second

        sizeDistribution
        {
            type fixedValue;
            fixedValueDistribution
            {
                value 4e-03;
            }
        }
    }
}

Can you please help me understand this?

Thank you
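A back-of-envelope sketch of how those numbers relate (an interpretation of the mass parcel basis, not an answer from the thread; the dispersed-phase density is an assumed value): with parcelBasisType mass, the injected mass is spread over parcelsPerSecond x duration parcels, so each parcel represents a fixed mass and hence some number of physical particles of the given size:
Code:
# Hypothetical check of the injection settings above
import math

massTotal = 0.02025          # kg, total mass injected over the duration
duration = 7.0               # s, injection duration
parcelsPerSecond = 3357      # parcels introduced per second
d = 4e-03                    # m, fixed particle diameter from sizeDistribution
rho = 1.2                    # kg/m^3, assumed (air-like bubble density)

# With parcelBasisType "mass", the total mass is spread evenly over all parcels
mass_per_parcel = massTotal / (parcelsPerSecond * duration)      # ~8.6e-07 kg

# Each parcel then represents this many physical particles of diameter d
mass_per_particle = rho * math.pi / 6.0 * d**3                   # ~4.0e-08 kg
particles_per_parcel = mass_per_parcel / mass_per_particle       # ~21

print(mass_per_parcel, mass_per_particle, particles_per_parcel)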
|