August 15, 2013, 02:17
empty decomposePar
#1
Senior Member
Ehsan
Join Date: Oct 2012
Location: Iran
Posts: 2,208
Rep Power: 27
decomposePar produces processor directories that contain only the constant folder, with none of the field files. Why does it behave this way now?
Code:
[0] [2]
[2] --> FOAM FATAL IO ERROR: [2]
[2] --> FOAM FATAL IO ERROR:
[2] cannot find file
[2]
[2] file: /home/ehsan/Desktop/WR_3/processor2/0/p at line 0.
[2]
[2] From function regIOobject::readStream()
[2] in file db/regIOobject/regIOobjectRead.C at line 73.
[2]
FOAM parallel run exiting
[2]
[0] cannot find file
[0]
[0] file: /home/ehsan/Desktop/WR_3/processor0/0/p at line 0.
[0]
[0]
[3] From function [3]
[3] --> FOAM FATAL IO ERROR:
[3] cannot find file
[3]
[3] file: /home/ehsan/Desktop/WR_3/processor3/0/p at line 0.
[3]
[3] From function regIOobject::readStream()
[3] in file db/regIOobject/regIOobjectRead.C at line 73.
[3]
FOAM parallel run exiting
[3]
regIOobject::readStream()
[0] in file db/regIOobject/regIOobjectRead.C at line 73.
[1]
[1]
[1] --> FOAM FATAL IO ERROR:
[1] cannot find file
[1]
[1] file: /home/ehsan/Desktop/WR_3/processor1/0/p at line 0.
[1]
[1] From function regIOobject::readStream()
[1] in file db/regIOobject/regIOobjectRead.C at line 73.
[1]
FOAM parallel run exiting
[1]
[0]
FOAM parallel run exiting
[0]
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 2 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun has exited due to process rank 3 with PID 27486 on
node Ehsan-com exiting without calling "finalize". This may
have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
[Ehsan-com:27480] 3 more processes have sent help message help-mpi-api.txt / mpi-abort
[Ehsan-com:27480] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
Killing PID 27479
PyFoam WARNING on line 232 of file /usr/local/lib/python2.7/dist-packages/PyFoam/Execution/FoamThread.py : Process 27479 was already dead
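(A quick sanity check — not from the original post, just a plain-shell sketch using the paths from the log above — shows what decomposePar actually wrote:)
Code:
# In a healthy decomposition each processorN directory holds constant/
# plus the decomposed time directory (here 0/) with field files such as p.
# The report above suggests only constant/ is present.
ls /home/ehsan/Desktop/WR_3/processor0
ls /home/ehsan/Desktop/WR_3/processor0/0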
August 15, 2013, 03:01
#2
Senior Member
Ehsan
Join Date: Oct 2012
Location: Iran
Posts: 2,208
Rep Power: 27
It even occurs in a serial run, yet the p file is in the time folder along with the other fields, just as before.
Code:
/*---------------------------------------------------------------------------*\
| =========                 |                                                 |
| \\      /  F ield         | OpenFOAM: The Open Source CFD Toolbox           |
|  \\    /   O peration     | Version:  2.2.0                                 |
|   \\  /    A nd           | Web:      www.OpenFOAM.org                      |
|    \\/     M anipulation  |                                                 |
\*---------------------------------------------------------------------------*/
Build  : 2.2.0-b363e8d14789
Exec   : rhoCentralFoamGasCont
Date   : Aug 15 2013
Time   : 10:25:54
Host   : "Ehsan-com"
PID    : 2529
Case   : /home/ehsan/Desktop/WR_3
nProcs : 1
sigFpe : Enabling floating point exception trapping (FOAM_SIGFPE).
fileModificationChecking : Monitoring run-time modified files using timeStampMaster
allowSystemOperations : Disallowing user-supplied system call operations

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time

Create mesh for time = 0.0103469

Reading thermophysical properties

Selecting thermodynamics package
{
    type            hePsiThermo;
    mixture         pureMixture;
    transport       sutherland;
    thermo          janaf;
    equationOfState perfectGas;
    specie          specie;
    energy          sensibleEnthalpy;
}

--> FOAM FATAL IO ERROR:
cannot find file

file: /home/ehsan/Desktop/WR_3/0.0103469/p at line 0.

From function regIOobject::readStream()
in file db/regIOobject/regIOobjectRead.C at line 73.

FOAM exiting
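(Another hypothetical check, not in the original post: the solver asks for /home/ehsan/Desktop/WR_3/0.0103469/p, so it is worth comparing that name with what actually exists on disk — a trailing-digit mismatch would raise exactly this "cannot find file" error:)
Code:
# Show the real time-directory name; if it is, say, 0.01034693 on disk
# while the solver requests 0.0103469, the lookup fails even though
# the field files are all there.
ls -d /home/ehsan/Desktop/WR_3/0.0103*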
August 15, 2013, 03:55
#3
Senior Member
Ehsan
Join Date: Oct 2012
Location: Iran
Posts: 2,208
Rep Power: 27
I deleted that time folder, went to a later one, and used it:
Code:
decomposePar -time '0.01034593' -force
Code:
[0] [1] [1]
[1] --> FOAM FATAL IO ERROR:
[1] cannot find file
[1]
[1] file: /home/ehsan/Desktop/WR_3/processor1/0.0103459/p at line 0.
[1]
[1] From function regIOobject::readStream()
[1] in file db/regIOobject/regIOobjectRead.C at line 73.
[1]
FOAM parallel run exiting
[1]
[0]
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 1 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
[0] --> FOAM FATAL IO ERROR:
[0] cannot find file
[0]
[0] file: /home/ehsan/Desktop/WR_3/processor0/0.0103459/p at line 0.
[0]
[0] From function regIOobject::readStream()
[0] in file db/regIOobject/regIOobjectRead.C at line 73.
[0]
FOAM parallel run exiting
[0]
--------------------------------------------------------------------------
mpirun has exited due to process rank 1 with PID 5374 on
node Ehsan-com exiting without calling "finalize". This may
have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
[Ehsan-com:05372] 1 more process has sent help message help-mpi-api.txt / mpi-abort
[Ehsan-com:05372] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
Killing PID 5369
PyFoam WARNING on line 232 of file /usr/local/lib/python2.7/dist-packages/PyFoam/Execution/FoamThread.py : Process 5369 was already dead
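(A hedged aside for later readers: the command above passes -time '0.01034593', which has 7 significant digits, while the error looks for 0.0103459 — the same value rounded to 6 significant digits, OpenFOAM's default timePrecision. Plain printf reproduces the rounding:)
Code:
# %.6g mimics general-format time names at the default timePrecision 6
printf '%.6g\n' 0.01034593   # prints 0.0103459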
August 21, 2013, 08:27
#4
Senior Member
Ehsan
Join Date: Oct 2012
Location: Iran
Posts: 2,208
Rep Power: 27
It happened again after a while in the new case I set up:
Code:
/*---------------------------------------------------------------------------*\
| =========                 |                                                 |
| \\      /  F ield         | OpenFOAM: The Open Source CFD Toolbox           |
|  \\    /   O peration     | Version:  2.2.0                                 |
|   \\  /    A nd           | Web:      www.OpenFOAM.org                      |
|    \\/     M anipulation  |                                                 |
\*---------------------------------------------------------------------------*/
Build  : 2.2.0-b363e8d14789
Exec   : decomposePar
Date   : Aug 21 2013
Time   : 15:50:11
Host   : "Ehsan-com"
PID    : 28022
Case   : /home/ehsan/Desktop/WR_4
nProcs : 1
sigFpe : Enabling floating point exception trapping (FOAM_SIGFPE).
fileModificationChecking : Monitoring run-time modified files using timeStampMaster
allowSystemOperations : Disallowing user-supplied system call operations

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time

Decomposing mesh region0

Create mesh

Calculating distribution of cells
Selecting decompositionMethod simple

Finished decomposition in 0.01 s

Calculating original mesh data

Distributing cells to processors

Distributing faces to processors

Distributing points to processors

Constructing processor meshes

Processor 0
    Number of cells = 9450
    Number of faces shared with processor 1 = 54
    Number of processor patches = 1
    Number of processor faces = 54
    Number of boundary faces = 19304

Processor 1
    Number of cells = 9450
    Number of faces shared with processor 0 = 54
    Number of faces shared with processor 2 = 54
    Number of processor patches = 2
    Number of processor faces = 108
    Number of boundary faces = 19250

Processor 2
    Number of cells = 9450
    Number of faces shared with processor 1 = 54
    Number of faces shared with processor 3 = 54
    Number of processor patches = 2
    Number of processor faces = 108
    Number of boundary faces = 19250

Processor 3
    Number of cells = 9450
    Number of faces shared with processor 2 = 54
    Number of processor patches = 1
    Number of processor faces = 54
    Number of boundary faces = 19304

Number of processor faces = 162
Max number of cells = 9450 (0% above average 9450)
Max number of processor patches = 2 (33.333333333333% above average 1.5)
Max number of faces between processors = 108 (33.333333333333% above average 81)

Time = 0.0111974

Processor 0: field transfer
Processor 1: field transfer
Processor 2: field transfer
Processor 3: field transfer

End.
Code:
[1] From function regIOobject::readStream()
[1] in file db/regIOobject/regIOobjectRead.C at line 73.
[1]
FOAM parallel run exiting
[1]
[3] [3]
[3] --> FOAM FATAL IO ERROR:
[3] cannot find file
[3]
[3] file: /home/ehsan/Desktop/WR_4/processor3/0/p at line 0.
[3]
[3] From function regIOobject::readStream()
[3] in file db/regIOobject/regIOobjectRead.C at line 73.
--------------------------------------------------------------------------
mpirun has exited due to process rank 0 with PID 28183 on
node Ehsan-com exiting without calling "finalize". This may
have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
[Ehsan-com:28181] 2 more processes have sent help message help-mpi-api.txt / mpi-abort
[Ehsan-com:28181] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
Killing PID 28177
PyFoam WARNING on line 232 of file /usr/local/lib/python2.7/dist-packages/PyFoam/Execution/FoamThread.py : Process 28177 was already dead
Getting LinuxMem: [Errno 2] No such file or directory: '/proc/28177/status'
August 21, 2013, 09:16
#5
Senior Member
Ehsan
Join Date: Oct 2012
Location: Iran
Posts: 2,208
Rep Power: 27
It is resolved.
I don't know why the decomposePar command wanted to decompose the 0.0111974 time folder when the correct name of the last time folder was 0.01119737. I used:
Code:
decomposePar -latestTime -force
but when I then wanted to run, it said that the field files aren't in folder 0.0111974. Then I changed:
Code:
timePrecision   7;
I'm not sure about the reason for all of this, because I hadn't changed anything in controlDict and everything was exactly as it was before the error! Is it because of the number of digits in the time folder names? I had time folders like 0.004665 (4 significant digits) and 0.00658536 (6 significant digits), and the last time folder had 7 significant digits (0.01119737). How should I set the time precision to be sure this won't happen in the future? Is a large number like 10 better?
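(A sketch of the digits behaviour for anyone hitting this later, assuming GNU printf: with timeFormat general, OpenFOAM writes time directory names with timePrecision significant digits, so under the default of 6 the last time comes out as 0.0111974, while timePrecision 7 preserves 0.01119737:)
Code:
printf '%.6g\n' 0.01119737   # -> 0.0111974   (timePrecision 6, the default)
printf '%.7g\n' 0.01119737   # -> 0.01119737  (timePrecision 7)
# A folder written with one precision and a name computed with another
# can therefore disagree by a trailing digit, which is exactly the
# "cannot find file" symptom seen above.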