Syncing surfaceVectorField data across processors
November 2, 2016, 10:05
Syncing surfaceVectorField data across processors
#1
New Member
Join Date: Apr 2011
Posts: 13
Rep Power: 15
Hello All,
I am trying to sync surfaceVectorField data across processors after calculating updated values at faces on processor patches in a mesh. I have it working from what appears to be the master processor to the others, but the 'slave' processors are not communicating back, and I am not sure why; I have scoured the forums for possibilities. I am currently using OF 2.1.1. Fransje's reply in the first link below seems to be what I am looking for, as they state that this kind of operation should work:

http://www.cfd-online.com/Forums/ope...ion-array.html
http://openfoamcfd.sourceforge.net/d...ce.html#l00088

I am not sure if it is because I am implementing it wrong or not.
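From that thread, my understanding is that reduce() itself works on plain per-processor values. For example, something like this is the usual pattern (a minimal sketch only, assuming a standard parallel solver where mesh is the fvMesh; it is not taken from my actual code):

Code:
// Global sum of a per-processor quantity: this kind of reduce() behaves as expected
label nLocalInternalFaces = mesh.nInternalFaces();   // value on this processor only
reduce(nLocalInternalFaces, sumOp<label>());         // summed across all processors
Info<< "summed internal face count = " << nLocalInternalFaces << endl;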
Here is the implementation.

Code:
surfaceVectorField mySVField.boundaryField( otherField.size(), 0.0);

//update face values in mySVField

reduce( svField.boundaryField(), sumOp<surfaceVectorField>() );
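In fuller, compilable form, what I am attempting looks roughly like the following. This is only a sketch of my intent; the field construction, otherField and the patch loop are placeholders rather than my exact code:

Code:
// needs: #include "processorPolyPatch.H" (Pstream etc. come in via fvCFD.H)

// Placeholder construction of the surface field (zero everywhere to start)
surfaceVectorField mySVField
(
    IOobject("mySVField", mesh.time().timeName(), mesh),
    mesh,
    dimensionedVector("zero", dimless, vector::zero)
);

// Update the face values on the processor patches
forAll(mySVField.boundaryField(), patchI)
{
    if (isA<processorPolyPatch>(mesh.boundaryMesh()[patchI]))
    {
        // ... calculate and assign face values for this patch here ...
    }
}

// The call that is meant to sync the patch values, and that fails in parallel
reduce(mySVField.boundaryField(), sumOp<surfaceVectorField>());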
Here is the error that is produced:

Code:
[1] --> FOAM FATAL IO ERROR:
[1] incorrect first token, expected <int> or '(', found on line 0 the word 'dimensions'
[1]
[1] file: IOstream at line 0.
[2]
[1]
[1]     From function operator>>(Istream&, List<T>&)
[1]     in file /usr/local/OpenFOAM-2.1.1/OpenFOAM-2.1.1/src/OpenFOAM/lnInclude/ListIO.C at line 149.
[1]
FOAM parallel run exiting
[2]
[1]
[2] --> FOAM FATAL IO ERROR:
[2] incorrect first token, expected <int> or '(', found on line 0 the word 'dimensions'
[2]
[2] file: IOstream at line 0.
[2]
[2]     From function operator>>(Istream&, List<T>&)
[2]     in file /usr/local/OpenFOAM-2.1.1/OpenFOAM-2.1.1/src/OpenFOAM/lnInclude/ListIO.C at line 149.
[2]
FOAM parallel run exiting
[2]
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 1 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
==20306==
==20307==
==20307== Events    : Ir
==20307== Collected : 153922434
==20307==
==20307== I refs:     153,922,434
==20306== Events    : Ir
==20306== Collected : 157805077
==20306==
==20306== I refs:     157,805,077
--------------------------------------------------------------------------
mpirun has exited due to process rank 1 with PID 20306 on
node node26 exiting improperly. There are two reasons this could occur:

1. this process did not call "init" before exiting, but others in
the job did. This can cause a job to hang indefinitely while it waits
for all processes to call "init". By rule, if one process calls "init",
then ALL processes must call "init" prior to termination.

2. this process called "init", but exited without calling "finalize".
By rule, all processes that call "init" MUST call "finalize" prior to
exiting or it will be considered an "abnormal termination"

This may have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
[node26:20304] 1 more process has sent help message help-mpi-api.txt / mpi-abort
[node26:20304] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages

Note: when I run without the reduce() call, i.e. when I was running normally, the data was not syncing and produced the result below (the run completed, just not correctly).

Code:
procBoundary0to2
{
    type            processor;
    value           uniform (0 0 0);
}

and the matching patch on the other side of the same processor boundary:

Code:
procBoundary2to0
{
    type            processor;
    value           nonuniform List<vector>
    4
    (
        (0 0 0)
        (-0 0 0)
        (0.2236068 0.9472136 0)
        (0 0 0)
    );
}
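What I think I actually need is an explicit exchange of the face values across each processor patch. Below is a minimal sketch of that idea (my assumption, not a confirmed solution): it assumes mesh is the fvMesh and mySVField is the surfaceVectorField above, and the combine step (a simple sum here) would depend on what the field actually represents:

Code:
// needs: #include "processorPolyPatch.H"

// Send my patch face values to each processor neighbour
forAll(mySVField.boundaryField(), patchI)
{
    const polyPatch& pp = mesh.boundaryMesh()[patchI];

    if (isA<processorPolyPatch>(pp))
    {
        const processorPolyPatch& procPatch =
            refCast<const processorPolyPatch>(pp);

        // Copy into a plain vectorField so it streams as a simple list
        vectorField myValues(mySVField.boundaryField()[patchI]);

        OPstream toNbr(Pstream::blocking, procPatch.neighbProcNo());
        toNbr << myValues;
    }
}

// Receive the neighbour's values and combine them with my own
// (faces on the two sides of a processor patch are stored in matching order)
forAll(mySVField.boundaryField(), patchI)
{
    const polyPatch& pp = mesh.boundaryMesh()[patchI];

    if (isA<processorPolyPatch>(pp))
    {
        const processorPolyPatch& procPatch =
            refCast<const processorPolyPatch>(pp);

        IPstream fromNbr(Pstream::blocking, procPatch.neighbProcNo());
        vectorField nbrValues(fromNbr);

        mySVField.boundaryField()[patchI] += nbrValues;
    }
}

With this pattern both sides end up holding the sum of the two face values; a different combine operation (for example overwriting with the neighbour's value) may be what is actually wanted, depending on the field.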
Tags: openfoam 2.1.1, parallel, pstream