chtMultiRegionFoam: problem with the tutorial |
|
April 5, 2012, 12:12 |
chtMultiRegionFoam: problem with the tutorial
|
#1 |
Senior Member
Samuele Z
Join Date: Oct 2009
Location: Mozzate - Co - Italy
Posts: 520
Rep Power: 19 |
Dear All,
I am trying to learn to use chtMultiRegionFoam and I am starting with the tutorials. The first tutorial I wanted to run is multiRegionHeater. I enter the case directory and give the command: Code:
./Allrun
This is the output: Code:
lab@lab-laptop:~/Scrivania/multiRegionHeater$ ./Allrun
Running blockMesh on /home/lab/Scrivania/multiRegionHeater
Running topoSet on /home/lab/Scrivania/multiRegionHeater
Running splitMeshRegions on /home/lab/Scrivania/multiRegionHeater
Running chtMultiRegionFoam in parallel on /home/lab/Scrivania/multiRegionHeater using 2 processes

--> FOAM FATAL ERROR:
No times selected

    From function reconstructPar
    in file reconstructPar.C at line 139.

FOAM exiting

(the same "No times selected" error is printed four more times, once per region)

creating files for paraview post-processing
created 'multiRegionHeater{bottomAir}.OpenFOAM'
created 'multiRegionHeater{topAir}.OpenFOAM'
created 'multiRegionHeater{heater}.OpenFOAM'
created 'multiRegionHeater{leftSolid}.OpenFOAM'
created 'multiRegionHeater{rightSolid}.OpenFOAM'
Also, where can I find an explanation of this solver? I guess it is a bit difficult to set everything up properly. Thanks, Samuele
|
April 5, 2012, 12:53 |
|
#2 |
Retired Super Moderator
Bruno Santos
Join Date: Mar 2009
Location: Lisbon, Portugal
Posts: 10,981
Blog Entries: 45
Rep Power: 128 |
Greetings Samuele,
The Allrun script keeps a log of every application that is executed. If you look into the "log.*" files, you should find the reason why things aren't working as expected. As for documentation, I'm not familiar with any online document for the "chtMultiRegion*Foam" solvers, so I suggest that you search for it. Failing that, start studying the files in the tutorial case, as well as the code for the solver itself. Best regards, Bruno
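Bruno's suggestion can be sketched as a quick scan of the case directory (a hedged example; which log.* files exist depends on what Allrun actually ran):

```shell
# Sketch: find which Allrun log files contain a fatal error and show context.
# Assumes you are inside the case directory, where Allrun wrote its log.* files.
grep -l "FATAL ERROR" log.*            # list the logs that contain an error
grep -B 1 -A 3 "FATAL ERROR" log.*     # print each error with a few lines of context
```

Reading the failing log from the bottom up usually points straight at the misconfigured file.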
|
|
April 6, 2012, 03:59 |
|
#3 |
Senior Member
Samuele Z
Join Date: Oct 2009
Location: Mozzate - Co - Italy
Posts: 520
Rep Power: 19 |
I looked at the different log files and I noticed that there are problems in the log.chtMultiRegionFoam and in the log.reconstructPar.
These are the two files: Code:
/*---------------------------------------------------------------------------*\
| =========                 |                                                 |
| \\      /  F ield         | OpenFOAM: The Open Source CFD Toolbox           |
|  \\    /   O peration     | Version:  2.1.0                                 |
|   \\  /    A nd           | Web:      www.OpenFOAM.org                      |
|    \\/     M anipulation  |                                                 |
\*---------------------------------------------------------------------------*/
Build  : 2.1.0-0bc225064152
Exec   : chtMultiRegionFoam -parallel
Date   : Apr 05 2012
Time   : 16:55:55
Host   : "lab-laptop"
PID    : 7962
[0]
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
[0]
[0] --> FOAM FATAL ERROR:
[0] "/home/lab/Scrivania/multiRegionHeater/system/decomposeParDict" specifies 4 processors but job was started with 2 processors.
[0]
FOAM parallel run exiting
[0]
--------------------------------------------------------------------------
mpirun has exited due to process rank 0 with PID 7962 on
node lab-laptop exiting without calling "finalize". This may
have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
Code:
/*---------------------------------------------------------------------------*\
| =========                 |                                                 |
| \\      /  F ield         | OpenFOAM: The Open Source CFD Toolbox           |
|  \\    /   O peration     | Version:  2.1.0                                 |
|   \\  /    A nd           | Web:      www.OpenFOAM.org                      |
|    \\/     M anipulation  |                                                 |
\*---------------------------------------------------------------------------*/
Build  : 2.1.0-0bc225064152
Exec   : reconstructPar -region rightSolid
Date   : Apr 05 2012
Time   : 16:55:56
Host   : "lab-laptop"
PID    : 7968
Case   : /home/lab/Scrivania/multiRegionHeater
nProcs : 1
sigFpe : Enabling floating point exception trapping (FOAM_SIGFPE).
fileModificationChecking : Monitoring run-time modified files using timeStampMaster
allowSystemOperations : Disallowing user-supplied system call operations

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time
Thanks a lot, Samuele
|
April 6, 2012, 15:48 |
|
#4 |
Retired Super Moderator
Bruno Santos
Join Date: Mar 2009
Location: Lisbon, Portugal
Posts: 10,981
Blog Entries: 45
Rep Power: 128 |
Hi Samuele,
You didn't specify if you had changed anything in the simulation case. Anyway, here are the steps to fix things:
Bruno
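For the record, the fatal error in log.chtMultiRegionFoam ("specifies 4 processors but job was started with 2 processors") means the decomposition dictionary and the mpirun call disagree. One hedged way to reconcile them (a sketch, not necessarily the exact steps from the post above; 2 is just an example core count) is to set numberOfSubdomains to the number of processes you actually start, then re-run decomposePar:

```shell
# Sketch: set numberOfSubdomains in decomposeParDict to the core count used.
# Run from inside the case directory; re-run decomposePar afterwards.
sed -i 's/^numberOfSubdomains.*/numberOfSubdomains 2;/' system/decomposeParDict
grep '^numberOfSubdomains' system/decomposeParDict    # confirm the new value
```

Alternatively, leave the dictionary at 4 and start mpirun with 4 processes instead.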
|
|
April 10, 2012, 04:22 |
|
#5 |
Senior Member
Samuele Z
Join Date: Oct 2009
Location: Mozzate - Co - Italy
Posts: 520
Rep Power: 19 |
Hi Bruno and thanks for answering.
The steps you suggested make the tutorial work fine. Thanks a lot, Samuele |
|
June 5, 2012, 09:54 |
|
#6 |
Member
Alain Martin
Join Date: Mar 2009
Posts: 40
Rep Power: 17 |
It does not work for me.
In the processor* directories, I don't have any time directories after the run, except 0/ and constant/:
processor0: 0 constant
processor1: 0 constant
processor2: 0 constant
processor3: 0 constant
All the time directories are in the base directory:
0 10 20 30 Allclean Allrun ... constant makeCellSets.setSet processor0 processor1 processor2 processor3 README.txt system
This is with Ubuntu 10.04. Everything else is OK, and this was working with the older version. Any suggestions? Thanks
This is written in a log file with mpirunDebug:
*** An error occurred in MPI_Init
*** before MPI was initialized
*** MPI_ERRORS_ARE_FATAL (your MPI job will now abort)
Last edited by jam; June 5, 2012 at 13:48.
|
June 6, 2012, 17:11 |
|
#7 |
Retired Super Moderator
Bruno Santos
Join Date: Mar 2009
Location: Lisbon, Portugal
Posts: 10,981
Blog Entries: 45
Rep Power: 128 |
Greetings Alain,
Can you be a bit more specific?
Bruno
|
|
June 7, 2012, 06:44 |
|
#8 |
Senior Member
Samuele Z
Join Date: Oct 2009
Location: Mozzate - Co - Italy
Posts: 520
Rep Power: 19 |
Dear Alain,
I suggest you try running the case on a single processor first. Then you can try to parallelize it! Samuele
|
June 7, 2012, 19:00 |
|
#9 |
Member
Alain Martin
Join Date: Mar 2009
Posts: 40
Rep Power: 17 |
@wyldckat
1. It is 10.04
2. 2.1.0
3. From the deb pkg
4. All parallel tutorials give the same messages

mpirun -np 4 xxxxxx works as expected, but the work is not distributed.
mpirun -np 4 xxxxxx -parallel gives the error messages.
I went back to the previous version, 2.0.0, and everything is OK.
|
June 8, 2012, 17:38 |
|
#10 |
Retired Super Moderator
Bruno Santos
Join Date: Mar 2009
Location: Lisbon, Portugal
Posts: 10,981
Blog Entries: 45
Rep Power: 128 |
Hi Alain,
OK, it would be really useful to see a good log with errors, so it can be easier to diagnose the real error. Please run mpirun in a similar way to this: Code:
mpirun -n 4 interFoam -parallel > log.interFoam 2>&1
Then compress the file: Code:
tar -czf log.interFoam.tar.gz log.interFoam
Another thing you can look at is whether there is any folder and/or file present at "~/.OpenFOAM/", which is where OpenFOAM looks for the user's global configuration files. Best regards, Bruno
|
|
June 9, 2012, 17:10 |
|
#11 |
Member
Alain Martin
Join Date: Mar 2009
Posts: 40
Rep Power: 17 |
I found the only method that works so far with my setup:
http://www.cfd-online.com/Forums/ope...12-04-lts.html All others (2.1.1 deb pkg , 2.1.1 tgz source) are not compiling or running as they should. Only the 2.1.x from git works flawlessly. Thanks for the suggestions anyway. |
|
October 16, 2013, 21:50 |
|
#12 |
Senior Member
Mominul MuKuT
Join Date: Mar 2009
Location: Bangladesh
Posts: 124
Rep Power: 17 |
In my case the tutorial works fine, but after modifying the geometry via topoSetDict and running the Allrun script, I found the following errors in the log files, as shown below:
log.reconstructPar Code:
/*---------------------------------------------------------------------------*\
| =========                 |                                                 |
| \\      /  F ield         | OpenFOAM: The Open Source CFD Toolbox           |
|  \\    /   O peration     | Version:  2.2.1                                 |
|   \\  /    A nd           | Web:      www.OpenFOAM.org                      |
|    \\/     M anipulation  |                                                 |
\*---------------------------------------------------------------------------*/
Build  : 2.2.1-57f3c3617a2d
Exec   : reconstructPar -allRegions
Date   : Oct 16 2013
Time   : 19:53:41
Host   : "mukut-Endeavor-MR3300"
PID    : 5013
Case   : /home/mukut/OpenFOAM/mukut-2.2.1/run/tutorials/heatTransfer/chtMultiRegionFoam/multiRegionHeater
nProcs : 1
sigFpe : Enabling floating point exception trapping (FOAM_SIGFPE).
fileModificationChecking : Monitoring run-time modified files using timeStampMaster
allowSystemOperations : Disallowing user-supplied system call operations

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time

--> FOAM FATAL ERROR:
No times selected

    From function reconstructPar
    in file reconstructPar.C at line 178.

FOAM exiting
Code:
/*---------------------------------------------------------------------------*\
| =========                 |                                                 |
| \\      /  F ield         | OpenFOAM: The Open Source CFD Toolbox           |
|  \\    /   O peration     | Version:  2.2.1                                 |
|   \\  /    A nd           | Web:      www.OpenFOAM.org                      |
|    \\/     M anipulation  |                                                 |
\*---------------------------------------------------------------------------*/
Build  : 2.2.1-57f3c3617a2d
Exec   : chtMultiRegionFoam -parallel
Date   : Oct 16 2013
Time   : 16:15:49
Host   : "mukut-Endeavor-MR3300"
PID    : 4242
Case   : /home/mukut/OpenFOAM/mukut-2.2.1/run/tutorials/heatTransfer/chtMultiRegionFoam/multiRegionHeater
nProcs : 4
Slaves : 3
(
"mukut-Endeavor-MR3300.4243"
"mukut-Endeavor-MR3300.4244"
"mukut-Endeavor-MR3300.4245"
)
Pstream initialized with:
    floatTransfer      : 0
    nProcsSimpleSum    : 0
    commsType          : nonBlocking
    polling iterations : 0
sigFpe : Enabling floating point exception trapping (FOAM_SIGFPE).
fileModificationChecking : Monitoring run-time modified files using timeStampMaster
allowSystemOperations : Disallowing user-supplied system call operations

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time

Create fluid mesh for region bottomAir for time = 0
Create fluid mesh for region topAir for time = 0
Create solid mesh for region heater for time = 0
[0]
[0] --> FOAM FATAL ERROR:
[0] Cannot find file "points" in directory "heater/polyMesh" in times 0 down to constant
[0]
[0]     From function Time::findInstance(const fileName&, const word&, const IOobject::readOption, const word&)
[0]     in file db/Time/findInstance.C at line 203.
[0]
FOAM parallel run exiting
[0]
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
(ranks [1], [2] and [3] report the same Cannot find file "points" error)
--------------------------------------------------------------------------
mpirun has exited due to process rank 0 with PID 4242 on
node mukut-Endeavor-MR3300 exiting without calling "finalize". This may
have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
[mukut-Endeavor-MR3300:04241] 3 more processes have sent help message help-mpi-api.txt / mpi-abort
[mukut-Endeavor-MR3300:04241] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
Best regards, Mukut
|
October 17, 2013, 15:38 |
|
#13 |
Retired Super Moderator
Bruno Santos
Join Date: Mar 2009
Location: Lisbon, Portugal
Posts: 10,981
Blog Entries: 45
Rep Power: 128 |
Hi Mukut,
Not much information to work with. All I can guess is:
Best regards, Bruno
|
|
October 17, 2013, 21:06 |
|
#14 | |
Senior Member
Mominul MuKuT
Join Date: Mar 2009
Location: Bangladesh
Posts: 124
Rep Power: 17 |
Thank you Mr. Bruno,
I found some mistakes in the fvSchemes and fvSolution files of a region that I created after modifying the tutorial geometry, and I have corrected them. The simulation now runs, but it takes a long time. I have modified controlDict as follows to complete the simulation in a shorter time... Code:
/*--------------------------------*- C++ -*----------------------------------*\
| =========                 |                                                 |
| \\      /  F ield         | OpenFOAM: The Open Source CFD Toolbox           |
|  \\    /   O peration     | Version:  2.2.1                                 |
|   \\  /    A nd           | Web:      www.OpenFOAM.org                      |
|    \\/     M anipulation  |                                                 |
\*---------------------------------------------------------------------------*/
FoamFile
{
    version     2.0;
    format      ascii;
    class       dictionary;
    location    "system";
    object      controlDict;
}
// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //

libs
(
    "libcompressibleTurbulenceModel.so"
    "libcompressibleRASModels.so"
);

application     chtMultiRegionFoam;

startFrom       latestTime;
startTime       0.1;
stopAt          endTime;
endTime         0.2;
deltaT          0.1;

writeControl    adjustableRunTime;
writeInterval   0.1;
purgeWrite      0;
writeFormat     binary;
writePrecision  8;
writeCompression off;
timeFormat      general;
timePrecision   6;
runTimeModifiable yes;

maxCo           0.3;

// Maximum diffusion number
maxDi           10.0;

adjustTimeStep  yes;

// ************************************************************************* //
How can I reduce the simulation time? Best regards, Mukut
|
||
October 18, 2013, 04:55 |
|
#15 |
Retired Super Moderator
Bruno Santos
Join Date: Mar 2009
Location: Lisbon, Portugal
Posts: 10,981
Blog Entries: 45
Rep Power: 128 |
Quick answer: Sorry, not enough information to work with here.
Knowing the characteristics of the mesh and the solver used, as well as the contents of the "fv*" files, and how exactly you are running the case, would help.
|
|
October 24, 2013, 04:54 |
|
#16 |
Senior Member
|
I have a question here: your time step is 0.1 s, so how does your solver compute the times with another step? One way of reducing the run time is to reduce the number of cells, especially along an axis with little variation; you can then refine your mesh based on the results of the coarse mesh. You can find this method in the OpenFOAM user manual, in chapter two I think. Good luck,
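The coarsening advice above can be illustrated with a blockMeshDict fragment (the block and numbers below are hypothetical, not the tutorial's): halving the divisions along each direction cuts the total cell count by a factor of eight.

```cpp
// Illustrative blockMeshDict excerpt -- vertex list and numbers are examples only.
// (30 20 20) would give 12000 cells; (15 10 10) gives 1500, an 8x reduction.
blocks
(
    hex (0 1 2 3 4 5 6 7) (15 10 10) simpleGrading (1 1 1)
);
```

Once the coarse run converges, the mesh can be refined again and the coarse result mapped onto it as an initial field.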
|
October 24, 2013, 05:12 |
|
#17 |
Senior Member
Mominul MuKuT
Join Date: Mar 2009
Location: Bangladesh
Posts: 124
Rep Power: 17 |
Thanks for the reply. I have changed to the steady-state solver chtMultiRegionSimpleFoam. Now it works with my modified geometry.
|
|
March 27, 2014, 10:01 |
|
#18 | |
Senior Member
Derek Mitchell
Join Date: Mar 2014
Location: UK, Reading
Posts: 172
Rep Power: 13 |
Quote:
|
||
October 12, 2017, 18:01 |
|
#19 | |
New Member
Miguel David Méndez Bohórquez
Join Date: Sep 2016
Location: Bogotá
Posts: 10
Rep Power: 10 |
Quote:
I hope you can help me. I am running a multi-region case and I have read the instructions in the Allrun script of this tutorial. I decomposed my case and every processor took its respective part of each region, but when I launch the solver the following problem appears: Cannot find file "points" in directory "polyMesh" in times 23.6 down to constant. (23.6 is my starting time.) I have checked that every region in every processor has the respective constant folder with the respective polyMesh/points file. I executed this line: mpirun -np 4 my_solver -parallel 1> runlog. Is there a special statement for running multi-region cases? I hope I have been clear. Best regards, Miguel.
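A hedged diagnostic for this kind of error is to confirm that every processor directory really carries a decomposed mesh for every region. The region names below are the multiRegionHeater tutorial's and are placeholders here; substitute your own:

```shell
# Sketch: report any processor directory lacking a region's decomposed mesh.
# Region names are placeholders; replace them with the regions of your case.
for p in processor*; do
    for r in bottomAir topAir heater leftSolid rightSolid; do
        [ -f "$p/constant/$r/polyMesh/points" ] || echo "missing: $p/constant/$r/polyMesh/points"
    done
done
```

If a file is reported missing, re-running decomposePar with the -allRegions option (after cleaning the processor* directories) is often the fix, though that depends on how the case was decomposed.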
||
March 20, 2019, 01:01 |
Same Error
|
#20 |
New Member
Arpan Sircar
Join Date: Mar 2017
Posts: 8
Rep Power: 9 |
Cannot find file "points" in directory "polyMesh" in times 23.6 down to constant. (23.6 is my starting time).
I get the same error with decomposePar for a multi-region case. Did you find any solution to this problem? Thanks, Arpan
|