May 5, 2021, 07:30
[v1812] redistributePar -decompose -parallel deletes /processor0 folder
|
#1 |
Member
Piotr Prusinski
Join Date: Oct 2009
Location: Warsaw, Poland
Posts: 67
Rep Power: 17
Hi,
For specific reasons I have to use OpenFOAM v1812. My case is quite big and I will run it on 1000 CPU cores. Decomposing it in serial with decomposePar would take ages, so redistributePar seems like a workaround. So I figured out a simple procedure: Code:
mkdir processor0
cp -r 0/ processor0/.
cp -r constant/ processor0/.
mpirun -machinefile $PBS_NODEFILE -np $NCPUS redistributePar -decompose -parallel
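For reference, redistributePar -decompose picks up the target decomposition from system/decomposeParDict, the same dictionary decomposePar reads. A minimal sketch of that file (the method line is an assumption; use whatever decomposition method your build provides): Code:
// system/decomposeParDict -- minimal sketch
FoamFile
{
    version     2.0;
    format      ascii;
    class       dictionary;
    object      decomposeParDict;
}

// must match the rank count (-np) passed to mpirun
numberOfSubdomains  1000;

// assumption: ptscotch (parallel scotch); scotch, hierarchical,
// etc. also work, depending on what your installation supports
method              ptscotch;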
I made a similar test with OpenFOAM v8.0 (without the -decompose flag, which is not supported there), but that ends with: Code:
terminate called after throwing an instance of 'std::bad_array_new_length'
  what():  std::bad_array_new_length
[wn41022:86218] *** Process received signal ***
[wn41022:86218] Signal: Aborted (6)
[wn41022:86218] Signal code: (-6)
[wn41022:86218] [ 0] /lib64/libc.so.6[0x3b6f032510]
[wn41022:86218] [ 1] /lib64/libc.so.6(gsignal+0x35)[0x3b6f032495]
[wn41022:86218] [ 2] /lib64/libc.so.6(abort+0x175)[0x3b6f033c75]
[wn41022:86218] [ 3] /mnt/opt/tools/slc6/gcc/7.2.0/lib64/libstdc++.so.6(_ZN9__gnu_cxx27__verbose_terminate_handlerEv+0x125)[0x7f0d6e000e35]
[wn41022:86218] [ 4] /mnt/opt/tools/slc6/gcc/7.2.0/lib64/libstdc++.so.6(+0x8ec16)[0x7f0d6dffec16]
[wn41022:86218] [ 5] /mnt/opt/tools/slc6/gcc/7.2.0/lib64/libstdc++.so.6(+0x8ec61)[0x7f0d6dffec61]
[wn41022:86218] [ 6] /mnt/opt/tools/slc6/gcc/7.2.0/lib64/libstdc++.so.6(+0x8eea3)[0x7f0d6dffeea3]
[wn41022:86218] [ 7] /mnt/opt/tools/slc6/gcc/7.2.0/lib64/libstdc++.so.6(+0x8db92)[0x7f0d6dffdb92]
[wn41022:86218] [ 8] /mnt/opt/apps/slc6/openfoam/8.0-x86_64-gcc720-openmpi311/OpenFOAM-8/platforms/linux64GccDPInt32Opt/lib/libmeshTools.so(_ZNK4Foam14PrimitivePatchINS_4ListINS_4faceEEERKNS_5FieldINS_6VectorIdEEEEE12calcMeshDataEv+0x53c)[0x7f0d6f79736c]
[wn41022:86218] [ 9] /mnt/opt/apps/slc6/openfoam/8.0-x86_64-gcc720-openmpi311/OpenFOAM-8/platforms/linux64GccDPInt32Opt/lib/libmeshTools.so(_ZNK4Foam14PrimitivePatchINS_4ListINS_4faceEEERKNS_5FieldINS_6VectorIdEEEEE16calcMeshPointMapEv+0x190)[0x7f0d6f799320]
[wn41022:86218] [10] /mnt/opt/apps/slc6/openfoam/8.0-x86_64-gcc720-openmpi311/OpenFOAM-8/platforms/linux64GccDPInt32Opt/lib/libdynamicMesh.so(_ZN4Foam14polyTopoChange17compactAndReorderERKNS_8polyMeshEbbbRiRNS_5FieldINS_6VectorIdEEEERNS_4ListIiEESC_RNSA_INS_9objectMapEEESF_SF_SF_SF_SF_SF_SF_RNSA_INS_3MapIiEEEESC_SC_SJ_+0x43d)[0x7f0d6efe4f9d]
[wn41022:86218] [11] /mnt/opt/apps/slc6/openfoam/8.0-x86_64-gcc720-openmpi311/OpenFOAM-8/platforms/linux64GccDPInt32Opt/lib/libdynamicMesh.so(_ZN4Foam14polyTopoChange10changeMeshERNS_8polyMeshEbbbb+0x285)[0x7f0d6efe7465]
[wn41022:86218] [12] /mnt/opt/apps/slc6/openfoam/8.0-x86_64-gcc720-openmpi311/OpenFOAM-8/platforms/linux64GccDPInt32Opt/lib/libdynamicMesh.so(_ZN4Foam16fvMeshDistribute7repatchERKNS_4ListIiEERNS1_IS2_EE+0x4dd)[0x7f0d6f1851ad]
[wn41022:86218] [13] /mnt/opt/apps/slc6/openfoam/8.0-x86_64-gcc720-openmpi311/OpenFOAM-8/platforms/linux64GccDPInt32Opt/lib/libdynamicMesh.so(_ZN4Foam16fvMeshDistribute17deleteProcPatchesEi+0x125)[0x7f0d6f186065]
[wn41022:86218] [14] /mnt/opt/apps/slc6/openfoam/8.0-x86_64-gcc720-openmpi311/OpenFOAM-8/platforms/linux64GccDPInt32Opt/lib/libdynamicMesh.so(_ZN4Foam16fvMeshDistribute10distributeERKNS_4ListIiEE+0xda8)[0x7f0d6f18a168]
[wn41022:86218] [15] redistributePar[0x453a5d]
[wn41022:86218] [16] /lib64/libc.so.6(__libc_start_main+0xfd)[0x3b6f01ed1d]
[wn41022:86218] [17] redistributePar[0x454e81]
[wn41022:86218] *** End of error message ***
--------------------------------------------------------------------------
mpirun noticed that process rank 0 with PID 0 on node wn41022 exited on signal 6 (Aborted).
--------------------------------------------------------------------------
Any ideas?

Last edited by piprus; May 6, 2021 at 15:06.
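For anyone who wants to try this without booking 1000 cores, the same steps shrink to something like the following (a sketch; the rank count of 4 is arbitrary, and numberOfSubdomains in system/decomposeParDict has to be changed to match): Code:
# scaled-down run on 4 ranks instead of 1000; set
# "numberOfSubdomains 4;" in system/decomposeParDict first
mkdir processor0
cp -r 0/ processor0/.
cp -r constant/ processor0/.
mpirun -np 4 redistributePar -decompose -parallel

# sanity-check the resulting decomposed mesh
mpirun -np 4 checkMesh -parallel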
|
Tags |
processor0, redistributepar, v1812 |
|
|