|
July 4, 2019, 14:19 |
particleCollector error
|
#1 |
Member
Joaquín Neira
Join Date: Oct 2017
Posts: 38
Rep Power: 9 |
Hello,
I'm trying to use the particleCollector cloud function, but I get the following error: Code:
ln: ./lnInclude
/home/lmod/software/MPI/intel/2019.2.187-GCC-8.2.0-2.31.1/impi/2019.2.187/OpenFOAM/5.0-20180524/OpenFOAM-5.0-20180524/src/OpenFOAM/lnInclude/GeometricField.C(523): remark #15009: _ZN4Foam14GeometricFieldINS_6VectorIdEENS_12fvPatchFieldENS_7volMeshEEC1ERKNS_8IOobjectERKNS_3tmpIS5_EE has been targeted for automatic cpu dispatch
/home/lmod/software/MPI/intel/2019.2.187-GCC-8.2.0-2.31.1/impi/2019.2.187/OpenFOAM/5.0-20180524/OpenFOAM-5.0-20180524/src/OpenFOAM/lnInclude/GeometricField.C(142): remark #15009: _ZN4Foam14GeometricFieldINS_6VectorIdEENS_12fvPatchFieldENS_7volMeshEE20readOldTimeIfPresentEv has been targeted for automatic cpu dispatch
/home/lmod/software/MPI/intel/2019.2.187-GCC-8.2.0-2.31.1/impi/2019.2.187/OpenFOAM/5.0-20180524/OpenFOAM-5.0-20180524/src/OpenFOAM/lnInclude/GeometricBoundaryField.C(38): remark #15009: _ZN4Foam14GeometricFieldINS_6VectorIdEENS_12fvPatchFieldENS_7volMeshEE8Boundary9readFieldERKNS_16DimensionedFieldIS2_S4_EERKNS_10dictionaryE has been targeted for automatic cpu dispatch
/home/lmod/software/MPI/intel/2019.2.187-GCC-8.2.0-2.31.1/impi/2019.2.187/OpenFOAM/5.0-20180524/OpenFOAM-5.0-20180524/src/OpenFOAM/lnInclude/GeometricBoundaryField.C(349): remark #15009: _ZN4Foam14GeometricFieldINS_6VectorIdEENS_12fvPatchFieldENS_7volMeshEE8BoundaryC1ERKNS_16DimensionedFieldIS2_S4_EERKS6_ has been targeted for automatic cpu dispatch
[2]
[2]
[2] --> FOAM FATAL IO ERROR:
[2] wrong token type - expected Scalar, found on line 0 the punctuation token '-'
[2]
[2] file: /home/jneira/Memoria/casos/corridas/S12-PruebaMalla/S12-PruebaMalla-2/processor2/0/uniform/lagrangian/saltationCloud1/saltationCloud1OutputProperties.cloudFunctionObject.particleCollector1.massTotal at line 0.
[2]
[2] From function Foam::Istream &Foam::operator>>(Foam::Istream &, double &)
[2] in file lnInclude/Scalar.C at line 93.
[2] FOAM parallel run exiting
Code:
particleCollector1
{
    type            particleCollector;
    mode            polygon;
    polygons
    (
        (
            (0.0545 0 -0.0545066)
            (0.0545 0.0024 -0.0545066)
            (0.0545 0.0024 0.0545066)
            (0.0545 0 0.0545066)
        )
        (
            (0.01635 0 -0.0545066)
            (0.01635 0.0024 -0.0545066)
            (0.01635 0.0024 0.0545066)
            (0.01635 0 0.0545066)
        )
    );
    normal          (1 0 0);
    negateParcelsOppositeNormal no;
    removeCollected no;
    surfaceFormat   vtk;
    resetOnWrite    no;
    log             yes;
}
|
|
July 8, 2019, 15:10 |
|
#2 |
Member
ano
Join Date: Jan 2017
Location: Delft
Posts: 58
Rep Power: 10 |
If I understand the function correctly, it is telling you that it wants access to the massFlow collected in a previous run. Unfortunately, you are at time step 0 and don't have anything collected yet.
Is there already a file processor2/0/uniform/lagrangian/saltationCloud1/saltationCloud1OutputProperties? Is it also in the main 0 directory?
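If the file is there, its collector entry should look roughly like the sketch below (based on the usual layout of cloud OutputProperties files; the exact entries may differ in your OpenFOAM version). Your error message suggests that the stored massTotal list contains a stray '-' where a scalar is expected. Code:
// Sketch only: a healthy cloudFunctionObject entry for the collector,
// with one scalar per polygon
cloudFunctionObject
{
    particleCollector1
    {
        massTotal       ( 0 0 );
    }
}
|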
|
July 8, 2019, 16:07 |
|
#3 |
Member
Joaquín Neira
Join Date: Oct 2017
Posts: 38
Rep Power: 9 |
There isn't any processor*/0/uniform directory, and I don't think there is any reason for one to exist. Maybe it's an error related to the parallel run?
|
|
July 9, 2019, 06:38 |
|
#4 |
Member
ano
Join Date: Jan 2017
Location: Delft
Posts: 58
Rep Power: 10 |
You could easily check that by running it in serial.
If you have a look at the file src/lagrangian/intermediate/submodels/CloudFunctionObjects/ParticleCollector/ParticleCollector.C, the error seems to appear during writing: it asks for the scalarField massTotal using the function "getModelProperty", which calls "readIfPresent" (a rough sketch of that lookup chain is at the end of this post). You say that for you neither "massTotal" nor the file itself is present in the time directory, so I would not expect it to try to read it. Code:
template<class CloudType>
void Foam::ParticleCollector<CloudType>::write()
{
    ...
    Field<scalar> faceMassTotal(mass_.size(), Zero);
    this->getModelProperty("massTotal", faceMassTotal);
    ...
If you want to figure out what goes wrong, it would be good to:
1. Try a serial run. If you get an error, post all the error messages; they should tell you exactly in which function it happens. (In case you direct the output to a log file using ">", use "&>" to also redirect the errors to the log file.)
2. Try resetting the field on write (from the source code I have the impression that your error appears before the reset, but you can give it a try): resetOnWrite yes;
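For reference, this is roughly how that lookup chain works in CloudSubModelBase (a simplified sketch from memory of OpenFOAM 5.x, not verbatim source, so check your own tree): Code:
// Simplified sketch of CloudSubModelBase::getModelProperty (not verbatim):
// look up the cloud's outputProperties dictionary, descend into the
// "cloudFunctionObject" sub-dictionary, then into the entry for this model
// (here particleCollector1), and only then call readIfPresent on "massTotal".
template<class CloudType>
template<class Type>
void Foam::CloudSubModelBase<CloudType>::getModelProperty
(
    const word& entryName,
    Type& value
) const
{
    const dictionary& properties = this->owner().outputProperties();

    if (properties.found(baseName_))
    {
        const dictionary& baseDict = properties.subDict(baseName_);

        if (baseDict.found(modelName_))
        {
            // readIfPresent is where a malformed stored massTotal entry
            // would trigger the "expected Scalar" error
            baseDict.subDict(modelName_).readIfPresent(entryName, value);
        }
    }
}
So the read should only happen if the outputProperties dictionary already contains a cloudFunctionObject entry for the collector; if it does, a malformed massTotal value there would explain the "expected Scalar" error. |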
|
July 9, 2019, 12:30 |
|
#5 |
Member
Joaquín Neira
Join Date: Oct 2017
Posts: 38
Rep Power: 9 |
I ran it in serial (the simulation is too heavy for serial to be a real option) and got no errors, so the error only happens when I run in parallel. I have no idea how to fix this error; can I just comment out the lines?
Edit: Running in parallel with the resetOnWrite flag set to true solves the problem, but the result is not the one I need.
Last edited by cojua8; July 9, 2019 at 21:09. |
|
August 27, 2022, 04:24 |
|
#6 |
Member
|
Hello Joaquín Neira,
With resetOnWrite no; set, can you now run the case in parallel? Have you solved the problem? |
|
Tags |
collector, lagrangian, particle |
|