
Boundary condition problem (Freestream)

May 25, 2021, 12:15   #1
Boundary condition problem (Freestream)
KAYANO (Dmitry), New Member
Join Date: Sep 2018
Posts: 12
Hello everyone!

I have run into a problem with the boundary conditions. Running the solver in parallel produces an error of this type:
Code:
// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time

Create mesh for time = 0


SIMPLE: no convergence criteria found. Calculations will run for 1500 steps.

Reading field p

[5] 
[5] 
[5] --> FOAM FATAL IO ERROR: (openfoam-2012)
[5] Cannot find patchField entry for procBoundary5to1
[5] 
[5] file: /mnt/c/Users/User/Desktop/Joby_plane_5/processor5/0/p.boundaryField at line 26 to 32.
[5] 
[5]     From void Foam::GeometricField<Type, PatchField, GeoMesh>::Boundary::readField(const Foam::DimensionedField<TypeR, GeoMesh>&, const Foam::dictionary&) [with Type = double; PatchField = Foam::fvPatchField; GeoMesh = Foam::volMesh]
[5]     in file /home/pawan/OpenFOAM/OpenFOAM/OpenFOAM-v2012/src/OpenFOAM/lnInclude/GeometricBoundaryField.C at line 172.
[5] 
FOAM parallel run exiting
[5] 
[1] 
[1] 
[1] --> FOAM FATAL IO ERROR: (openfoam-2012)
[1] Cannot find patchField entry for procBoundary1to0
[1] 
[1] file: /mnt/c/Users/User/Desktop/Joby_plane_5/processor1/0/p.boundaryField at line 26 to 32.
[1] 
[1]     From void Foam::GeometricField<Type, PatchField, GeoMesh>::Boundary::readField(const Foam::DimensionedField<TypeR, GeoMesh>&, const Foam::dictionary&) [with Type = double; PatchField = Foam::fvPatchField; GeoMesh = Foam::volMesh]
[1]     in file /home/pawan/OpenFOAM/OpenFOAM/OpenFOAM-v2012/src/OpenFOAM/lnInclude/GeometricBoundaryField.C at line 172.
[1] 
FOAM parallel run exiting
[1] 
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 1 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
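For reference, the missing "procBoundary5to1" entry is the inter-processor patch entry that decomposePar normally writes into each decomposed field file (here processor5/0/p). A sketch of what such an entry usually looks like (the value shown is a placeholder, not taken from this case):
Code:
boundaryField
{
    // ... physical patches (freestream, walls, etc.) ...

    procBoundary5to1            // coupling patch between processors 5 and 1
    {
        type    processor;      // inter-processor coupled condition
        value   uniform 0;      // placeholder initial value for p
    }
}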
I am attaching the complete log and source files. Thank you!
Attached Files
File Type: zip Log.zip (23.0 KB)
File Type: zip 0.orig.zip (2.6 KB)
File Type: zip system.zip (11.8 KB)

May 25, 2021, 16:09   #2
HPE, Senior Member
Herpes Free Engineer
Join Date: Sep 2019
Location: The Home Under The Ground with the Lost Boys
Posts: 931
Hi,

What does the log file say for the "decomposePar" command?
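If no log was kept, one can be regenerated by redirecting the command's output (a minimal sketch; add -decomposeParDict <file> only if the case uses a non-default dictionary name):
Code:
# remove any stale decomposition, then decompose again and keep the log
rm -rf processor*
decomposePar > log.decomposePar 2>&1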

May 25, 2021, 16:43   #3
KAYANO (Dmitry), New Member
Join Date: Sep 2018
Posts: 12
Quote:
Originally Posted by HPE
What does the log file say for the "decomposePar" command?
Hi! This is what is in the log:
Code:
/*---------------------------------------------------------------------------*\
| =========                 |                                                 |
| \\      /  F ield         | OpenFOAM: The Open Source CFD Toolbox           |
|  \\    /   O peration     | Version:  v2012                                 |
|   \\  /    A nd           | Website:  www.openfoam.com                      |
|    \\/     M anipulation  |                                                 |
\*---------------------------------------------------------------------------*/
Build  : _7bdb509494-20201222 OPENFOAM=2012
Arch   : "LSB;label=32;scalar=64"
Exec   : decomposePar -decomposeParDict system/decomposeParDict.8
Date   : May 25 2021
Time   : 17:19:00
Host   : DESKTOP-PSED6K8
PID    : 831
I/O    : uncollated
Case   : /mnt/c/Users/User/Desktop/Joby_plane_5
nProcs : 1
trapFpe: Floating point exception trapping enabled (FOAM_SIGFPE).
fileModificationChecking : Monitoring run-time modified files using timeStampMaster (fileModificationSkew 5, maxFileModificationPolls 20)
allowSystemOperations : Allowing user-supplied system call operations

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time



Decomposing mesh region0

Create mesh

Calculating distribution of cells
Selecting decompositionMethod hierarchical [8]

Finished decomposition in 0.03 s

Calculating original mesh data

Distributing cells to processors

Distributing faces to processors

Distributing points to processors

Constructing processor meshes

Processor 0
    Number of cells = 8000
    Number of faces shared with processor 1 = 800
    Number of faces shared with processor 4 = 400
    Number of processor patches = 2
    Number of processor faces = 1200
    Number of boundary faces = 1600

Processor 1
    Number of cells = 8000
    Number of faces shared with processor 0 = 800
    Number of faces shared with processor 2 = 800
    Number of faces shared with processor 5 = 400
    Number of processor patches = 3
    Number of processor faces = 2000
    Number of boundary faces = 800

Processor 2
    Number of cells = 8000
    Number of faces shared with processor 1 = 800
    Number of faces shared with processor 3 = 800
    Number of faces shared with processor 6 = 400
    Number of processor patches = 3
    Number of processor faces = 2000
    Number of boundary faces = 800

Processor 3
    Number of cells = 8000
    Number of faces shared with processor 2 = 800
    Number of faces shared with processor 7 = 400
    Number of processor patches = 2
    Number of processor faces = 1200
    Number of boundary faces = 1600

Processor 4
    Number of cells = 8000
    Number of faces shared with processor 0 = 400
    Number of faces shared with processor 5 = 800
    Number of processor patches = 2
    Number of processor faces = 1200
    Number of boundary faces = 1600

Processor 5
    Number of cells = 8000
    Number of faces shared with processor 1 = 400
    Number of faces shared with processor 4 = 800
    Number of faces shared with processor 6 = 800
    Number of processor patches = 3
    Number of processor faces = 2000
    Number of boundary faces = 800

Processor 6
    Number of cells = 8000
    Number of faces shared with processor 2 = 400
    Number of faces shared with processor 5 = 800
    Number of faces shared with processor 7 = 800
    Number of processor patches = 3
    Number of processor faces = 2000
    Number of boundary faces = 800

Processor 7
    Number of cells = 8000
    Number of faces shared with processor 3 = 400
    Number of faces shared with processor 6 = 800
    Number of processor patches = 2
    Number of processor faces = 1200
    Number of boundary faces = 1600

Number of processor faces = 6400
Max number of cells = 8000 (0% above average 8000)
Max number of processor patches = 3 (20% above average 2.5)
Max number of faces between processors = 2000 (25% above average 1600)

Time = 0

Processor 0: field transfer
Processor 1: field transfer
Processor 2: field transfer
Processor 3: field transfer
Processor 4: field transfer
Processor 5: field transfer
Processor 6: field transfer
Processor 7: field transfer

End
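For completeness, a system/decomposeParDict.8 consistent with this log would look roughly like the following sketch (the n vector is an assumption inferred from the 4 x 2 processor connectivity above, not copied from the case):
Code:
numberOfSubdomains  8;

method              hierarchical;

hierarchicalCoeffs
{
    n       (4 2 1);    // assumed split: 4 x 2 x 1 = 8 subdomains
    order   xyz;        // split in x first, then y, then z
}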

May 26, 2021, 05:41   #4
HPE, Senior Member
Herpes Free Engineer
Join Date: Sep 2019
Location: The Home Under The Ground with the Lost Boys
Posts: 931
Hi,

The "decomposePar" log does not show any issue, but the error message indicates a parallelisation problem. Can you attach the case, if there is any chance? Weird...
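In the meantime, one workaround often used for this exact message is a wildcard entry covering the processor patches in every 0 field, so the entries are found even if the decomposed field files were regenerated or edited after decomposePar ran. A sketch, not taken from your case; for a vector field like U the value would be e.g. uniform (0 0 0):
Code:
boundaryField
{
    // ... physical patches ...

    "procBoundary.*"            // regex catch-all for all processor patches
    {
        type    processor;
        value   uniform 0;      // placeholder; adjust to the field's type
    }
}
Alternatively, deleting the processor* directories and re-running decomposePar from a clean 0 directory usually restores the missing entries.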

May 26, 2021, 06:06   #5
KAYANO (Dmitry), New Member
Join Date: Sep 2018
Posts: 12
Quote:
Originally Posted by HPE
The "decomposePar" log does not show any issue, but the error message indicates a parallelisation problem. Can you attach the case, if there is any chance?
Hi, here is the download link: https://disk.yandex.ru/d/-Q0L39_edFSoZA

June 1, 2021, 06:57   #6
KAYANO (Dmitry), New Member
Join Date: Sep 2018
Posts: 12
Hello everyone! Can someone please help with this problem?
