[solidMechanics] Support thread for "Solid Mechanics Solvers added to OpenFOAM Extend"

March 31, 2021, 01:38, #601
Hojatollah Gholami (Hgholami)
Dear all,
I have been looking at foam-extend-4.1 and the overset feature of solids4foam. The overset tutorial runs fine in serial, but can the code also be run in parallel? If so, how?
Thanks

March 31, 2021, 01:45, #602
alberto (dewey)
Yes, solids4foam can be used in parallel.
You have to use
decomposePar -region fluid
decomposePar -region solid

Then:
mpirun -np <nCores> solids4Foam -parallel
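For reference, a complete run sequence for a two-region FSI case might look like the minimal sketch below (assuming the usual solids4foam case layout with fluid and solid regions, decomposeParDicts already set up for both, and 4 cores; adjust names and counts to your case):
Code:
#!/bin/bash
# Minimal parallel-run sketch for a solids4foam FSI case (assumed layout:
# separate fluid and solid regions with their own decomposeParDicts).

# Decompose both mesh regions
decomposePar -region fluid > log.decomposePar.fluid 2>&1
decomposePar -region solid > log.decomposePar.solid 2>&1

# Run the coupled solver in parallel on 4 cores
mpirun -np 4 solids4Foam -parallel > log.solids4Foam 2>&1

# Reconstruct both regions for post-processing
reconstructPar -region fluid > log.reconstructPar.fluid 2>&1
reconstructPar -region solid > log.reconstructPar.solid 2>&1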

March 31, 2021, 02:19, #603
Hojatollah Gholami (Hgholami)
I have a problem running the overset tutorial in parallel. For example, I use
decomposePar -region fluid
decomposePar -region solid
mpirun -np 4 solids4Foam -parallel
But the run crashes right after "Selecting dynamicFvMesh newSubsetMotionSolverFvMesh":
Quote:
Selecting fluidModel pimpleOversetFluid
Selecting dynamicFvMesh newSubsetMotionSolverFvMesh
[Ubuntu] *** Process received signal ***
[Ubuntu] Signal: Segmentation fault (11)
Have you encountered this problem?
Thanks

Quote:
Originally Posted by dewey View Post
Yes, solids4foam can be used in parallel.
You have to use
decomposePar -region fluid
decomposePar -region solid

Then:
mpirun -np <nCores> solids4Foam -parallel
It seems the problem comes from newSubsetMotionSolverFvMesh, which does not work in parallel. Can we use another dynamicFvMesh motion solver in this case?

Last edited by Hgholami; March 31, 2021 at 23:51. Reason: more detail

June 8, 2021, 06:02, #604
Compiled solids4foam on FE41: is solids4foam supposed to be slow?
Sultan Islam (EternalSeekerX)
Hello everyone,

So I successfully compiled both FE41 and solids4foam on my Fedora 34 system, after manually compiling gcc-7.5 as well. However, when I tried the Hron-Turek laminar case I noticed it takes a long time after 2 seconds, so the full 6 seconds would take a while. Is solids4foam not good on FE41? Unfortunately I have more issues installing FE40 than FE41. Everything works in FE41, even with its exit issue when GAMG is used to solve for p; all solutions still work. FE40 can't compile because it fails on parmetis. I really hope solids4foam works as well on FE41 as on FE40 xD

June 8, 2021, 06:40, #605
Sita Drost (sita)
Hi there,

Are you getting any warnings/errors when running this case?

I haven't used solids4foam and foam-extend for a while, but there used to be problems with foam-extend-4.1 and pisoFoam/pimpleFoam (one of which is used in the Hron-Turek Laminar case as far as I remember). See also this post (Cases diverging in foam-extend-4.1) and the links in it.

If this is the problem you're running into, you may want to give installing fe40 another try. I vaguely remember having trouble compiling fe40 on CentOS and Ubuntu systems (the parmetis error you mention definitely sounds familiar...), but for the life of me I can't remember how I resolved this, really sorry. Did you use these instructions?

Good luck,
Sita

June 8, 2021, 14:19, #606
Sultan Islam (EternalSeekerX)
Quote:
Originally Posted by sita View Post
Hi there,

Are you getting any warnings/errors when running this case?

I haven't used solids4foam and foam-extend for a while, but there used to be problems with foam-extend-4.1 and pisoFoam/pimpleFoam (one of which is used in the Hron-Turek Laminar case as far as I remember). See also this post (Cases diverging in foam-extend-4.1) and the links in it.

If this is the problem you're running into, you may want to give installing fe40 another try. I vaguely remember having trouble compiling fe40 on CentOS and Ubuntu systems (the parmetis error you mention definitely sounds familiar...), but for the life of me I can't remember how I resolved this, really sorry. Did you use these instructions?

Good luck,
Sita
I have seen a ticket open on SourceForge regarding that, but no, my run of the tutorial had no warnings or abrupt divergence. It was just taking too long, so I stopped it around 2.06 seconds. I did have an MPI exit error, but that's because of the known GAMG issue when solving for p; all the solutions were written. So no other issues at all besides the slow time to solve. FE41 works in all other regards for me. I'm on Fedora 34 though, and my error with FE40 is that libParmetisdecomp won't even compile (an error regarding data types for some reason).
I did get to use FE40 before on CentOS 7, but whenever I run any case with hierarchical or patchedConstraints decomposition, I get a FOAM warning about not being able to load parmetisdecomp.so even though it exists xD.

June 9, 2021, 02:23, #607
Sita Drost (sita)
It could also simply be due to the fact that fluid-solid coupling is turned on at t = 2 s

June 9, 2021, 02:33, #608
Sultan Islam (EternalSeekerX)
Quote:
Originally Posted by sita View Post
It could also simply be due to the fact that fluid-solid coupling is turned on at t = 2 s
Yes, most likely. I was able to compile FE40 by brute-forcing it to use the third-party OpenMPI from FE41, since the system MPI is too new for FE40. And for the same time step, FE40 was indeed faster. However, in terms of the data points, they are all similar between FE40 and FE41 (about 0.000X difference)? I was under the impression FE41 should blow up or give weird values. The only other issue I have with FE41 is the ticket you mentioned with pisoFoam. It makes me wonder whether it is just the tutorial that is at fault and not the solver itself; maybe that is why the devs haven't bothered fixing it? Maybe someone else can chime in?

For more info, I compiled both FE41 and FE40 from the master branch. Both were pulled around last week, i.e. the week of May 30th to June 5th. Maybe solids4foam has been modified to work with FE41 now (despite it being slower)?

June 9, 2021, 13:22, #609
Philip Cardiff (bigphil)
Quote:
Originally Posted by EternalSeekerX View Post
Hello everyone,

So I successfully compiled both FE41 and solids4foam on my Fedora 34 system, after manually compiling gcc-7.5 as well. However, when I tried the Hron-Turek laminar case I noticed it takes a long time after 2 seconds, so the full 6 seconds would take a while. Is solids4foam not good on FE41? Unfortunately I have more issues installing FE40 than FE41. Everything works in FE41, even with its exit issue when GAMG is used to solve for p; all solutions still work. FE40 can't compile because it fails on parmetis. I really hope solids4foam works as well on FE41 as on FE40 xD
A few comments/questions:
  • How long did the Hron-Turek case take for you? It is quite a slow case anyway. You can try the beamInCrossFlow cases if you are looking for quick ones.
  • FE40 (and OpenFOAM in general) works fine even when some third-party packages like parMetis are missing; parMetis is not required, e.g. do blockMesh and icoFoam still work? If so, then solids4foam will compile. In fact I rarely ever install parMetis since I don't use it (I guess it is only used for big cases where snappyHexMesh needs to run in parallel).
  • You can find some discussion on the forum about differences between FE41 and FE40 for solids4foam.
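As a quick way to test the point about parMetis, one could check a freshly built foam-extend install along these lines (a minimal sketch, assuming the foam-extend environment is sourced and the standard icoFoam cavity tutorial is present under $FOAM_TUTORIALS):
Code:
#!/bin/bash
# Sanity check of a foam-extend build that has no parMetis: if blockMesh and
# icoFoam run on the cavity tutorial, the core install is fine and solids4foam
# should compile against it.

mkdir -p "$FOAM_RUN" && cd "$FOAM_RUN"
cp -r "$FOAM_TUTORIALS/incompressible/icoFoam/cavity" .
cd cavity

blockMesh > log.blockMesh 2>&1 && echo "blockMesh OK"
icoFoam   > log.icoFoam   2>&1 && echo "icoFoam OK"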

June 9, 2021, 15:43, #610
Answers to your questions
Sultan Islam (EternalSeekerX)
Quote:
Originally Posted by bigphil View Post
A few comments/questions:
  • How long did the Hron-Turek case take for you? It is quite a slow case anyway. You can try the beamInCrossFlow cases if you are looking for quick ones.
  • FE40 (and OpenFOAM in general) works fine even when some third-party packages like parMetis are missing; parMetis is not required, e.g. do blockMesh and icoFoam still work? If so, then solids4foam will compile. In fact I rarely ever install parMetis since I don't use it (I guess it is only used for big cases where snappyHexMesh needs to run in parallel).
  • You can find some discussion on the forum about differences between FE41 and FE40 for solids4foam.
Hey Phil,

So I ran the HronTurekFsi3 laminar case until t = 2.12 s as a test: FE40 completed it in 830.3 seconds while FE41 completed it in 1388.6 seconds. The results look similar to me when I open them in ParaView. I checked the solutions in a few time folders and the values only differ by about 0.000X; I don't know which one is more accurate, however. I brute-force compiled FE40 using the third-party OpenMPI and had no build issues (it complains about not being able to load the decomposition .so files, but that only seems to affect a few tutorials), and solids4foam compiled correctly against it too. So I believe both FE41 and FE40 work fully and both can compile solids4foam. However, I don't know which one works better with solids4foam. Sure, solids4foam is slow for the one case I tested, but I wonder which one is more accurate? FE40 is pretty old, so the FE team generally suggested I stick with FE41, but it seems the issue is still open? Maybe the issue is on a per-tutorial basis? I also tried the overset tutorial in beamInCrossFlow: it ran well and finished in 396.13 seconds (I assume it ran in serial), and it uses the PIMPLE solver and GAMG, and despite a misbehaved exit the numbers look good and nothing overflowed. One thing I do notice in FE41 is that I get cat warnings about not being able to find files in system/; I assume that is not an issue.

I know you write and test for both FE40 and FE41 support, but in your experience, which foam provides the best experience for solids4foam?

Last edited by EternalSeekerX; June 9, 2021 at 16:01. Reason: ran a new tut

June 9, 2021, 17:31, #611
Philip Cardiff (bigphil)
Hi Sultan,

Thanks for noting your experience.

Currently FE40 is still my default. The main differences with FE41 are the fluid model implementations and the inclusion of overset. Unfortunately I can't say which one is better, as I have not done exhaustive tests; you may like to read others' comments.

I wonder why the FE41 case is so much slower: is it the fluid, the solid, or something else (even the fluid mesh motion)?

Yep, the overset case is set up to run in serial; it is meant to also work in parallel, but as noted in another thread it is currently broken in parallel.

As regards the FE41 warnings, yep, these can be ignored, although it would be nice if we dealt with them in a better way.

June 9, 2021, 18:01, #612
Appreciate the confirmation
Sultan Islam (EternalSeekerX)
Quote:
Originally Posted by bigphil View Post
Hi Sultan,

Thanks for noting your experience.

Currently FE40 is still my default. The main differences with FE41 are the fluid model implementations and the inclusion of overset. Unfortunately I can't say which one is better, as I have not done exhaustive tests; you may like to read others' comments.

I wonder why the FE41 case is so much slower: is it the fluid, the solid, or something else (even the fluid mesh motion)?

Yep, the overset case is set up to run in serial; it is meant to also work in parallel, but as noted in another thread it is currently broken in parallel.

As regards the FE41 warnings, yep, these can be ignored, although it would be nice if we dealt with them in a better way.
Glad to see the warnings can be ignored. And yes, I did run into that issue: for me, I couldn't run decomposePar -region solid as it says the 0 directory is empty. And yes, the parallel run breaks; it seems to be missing some extra information and cannot reference some object. Otherwise the serial run was clean, and FE41 seems good so far. I'm still learning foam in my own time, as I'm interested in the simulation side, so I may stick with FE41 for the overset features and FE40 for everything else. I'm just glad I got them to compile on Fedora 34, since the ESI and Foundation OpenFOAM versions run significantly faster there than on Ubuntu or CentOS with my hardware. Plus I have a Docker container for running STAR-CCM+ and Ansys Fluent. It's a decent workstation OS.

For anyone else who wants to try: I suggest compiling gcc-7.5 first and then compiling gcc-5.5 with gcc-7.5. Then, before compiling FE40 or FE41, just make sure to export PATH and LD_LIBRARY_PATH for the right gcc depending on which FE you build. I haven't tried compiling my own OpenMPI outside of the third-party provided ones.
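As a rough illustration of that environment set-up (a sketch only; the compiler install prefix and foam-extend path below are just placeholders):
Code:
#!/bin/bash
# Sketch: use a locally built gcc when compiling foam-extend.
GCC_HOME=/opt/gcc-7.5          # assumed install prefix; point at gcc-5.5 instead for FE40

export PATH="$GCC_HOME/bin:$PATH"
export LD_LIBRARY_PATH="$GCC_HOME/lib64:$LD_LIBRARY_PATH"

# Then source the foam-extend environment and build as usual
cd ~/foam/foam-extend-4.1       # assumed location of the sources
source etc/bashrc
./Allwmake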

August 12, 2021, 00:50, #613
Strange Log File
Mike Tree (treem22)
I'm seeing some strange log.solids4Foam behavior. I've successfully installed OpenFOAM v1912 and compiled solids4Foam alongside it. I can run the HronTurekFsi3/ras tutorial and see the following in its log.solids4Foam file:
Code:
Time = 0.001

Setting traction on solid interfaces
Interpolating face values using AMI
Create AMI zone-to-zone interpolator
AMI: Creating addressing and weights between 84 source faces and 216 target faces
AMI: using globalPolyPatch
AMI: Patch source sum(weights) min:1 max:1 average:1
AMI: Patch target sum(weights) min:1 max:1 average:1
interface-to-interface face error: 0.00333563
Total force on fluid interface 0: (-0.0680452 -2.09831e-09 7.05957e-24)
Total force on solid interface 0: (0.0680452 2.09831e-09 -7.05957e-24)

Evolving solid solver
setCellDisplacements: reading cellDisplacements
Corr 0, relative residual = 0
FDICPCG: Solving for D, Initial residual = 0, Final residual = (0 0 0), No outer iterations = 0
 Max relative residual = 0, Relative residual = 0, enforceLinear = false
Interpolating point values using AMI
zoneA point orientation (< 0), max: -0.707107, min: -1, nIncorrectPoints: 0/170
Interpolating point values using AMI
FSI relative residual1 norm for interface 0: 0
FSI residual2 norm for interface 0: 0

Time = 0.001, iteration: 1
Modes before clean-up (plate): 0, modes after clean-up (plate): 0
Current fsi under-relaxation factor (plate): 0.05
Maximal accumulated displacement of interface 0: 0
GAMG:  Solving for cellMotionUx, Initial residual = 0, Final residual = 0, No Iterations 1
GAMG:  Solving for cellMotionUy, Initial residual = 0, Final residual = 0, No Iterations 1
Evolving fluid model: pimpleFluid
volume continuity errors : sum local = 0, global = 0
Courant Number mean: 0.0858922 max: 1.3495 velocity magnitude: 3.17842
PIMPLE: iteration 1
DILUPBiCG:  Solving for Ux, Initial residual = 1, Final residual = 5.13036e-08, No Iterations 7
DILUPBiCG:  Solving for Uy, Initial residual = 1, Final residual = 8.42755e-09, No Iterations 8
GAMG:  Solving for p, Initial residual = 1, Final residual = 8.84293e-07, No Iterations 60
time step continuity errors : sum local = 7.5779e-10, global = -9.93387e-11, cumulative = -9.93387e-11
DILUPBiCG:  Solving for omega, Initial residual = 0.00746969, Final residual = 1.1339e-07, No Iterations 4
DILUPBiCG:  Solving for k, Initial residual = 1, Final residual = 1.9442e-07, No Iterations 6
Everything looks great here. I can see the solid solver working (though it doesn't do much at the first iteration). I can see the fluid cellMotion solver working. I can see the velocity being solved, and the pressure.

But when I try to run my own case (flow through the carotid artery bifurcation), I see the following:
Code:
Time = 5.88235e-05

Setting traction on solid interfaces
Interpolating face values using AMI
Create AMI zone-to-zone interpolator
AMI: Creating addressing and weights between 433520 source faces and 454672 target faces
AMI: using globalPolyPatch
AMI: Patch source sum(weights) min:0.506174 max:1.01881 average:0.999339
AMI: Patch target sum(weights) min:0.147987 max:1.02352 average:0.999352
interface-to-interface face error: 0.000176405
Total force on fluid interface 0: (-0.0546705 0.000429176 -0.120704)
Total force on solid interface 0: (0.0552018 -0.000857259 0.119434)

Evolving solid solver
Solving the momentum equation for D
[1] setCellDisplacements: proc 1 has 0 cells with setDisplacements
[2] setCellDisplacements: proc 2 has 0 cells with setDisplacements
[3] setCellDisplacements: proc 3 has 0 cells with setDisplacements
[4] setCellDisplacements: proc 4 has 0 cells with setDisplacements
[5] setCellDisplacements: proc 5 has 0 cells with setDisplacements
[6] setCellDisplacements: proc 6 has 0 cells with setDisplacements
[7] setCellDisplacements: proc 7 has 0 cells with setDisplacements
[8] setCellDisplacements: proc 8 has 0 cells with setDisplacements
[9] setCellDisplacements: proc 9 has 0 cells with setDisplacements
[10] setCellDisplacements: proc 10 has 0 cells with setDisplacements
[11] setCellDisplacements: proc 11 has 0 cells with setDisplacements
[12] setCellDisplacements: proc 12 has 0 cells with setDisplacements
[13] setCellDisplacements: proc 13 has 0 cells with setDisplacements
[14] setCellDisplacements: proc 14 has 0 cells with setDisplacements
[15] setCellDisplacements: proc 15 has 0 cells with setDisplacements
setCellDisplacements: reading cellDisplacements
[0] setCellDisplacements: proc 0 has 0 cells with setDisplacements
    Corr, res, relRes, matRes, iters
    100, 0.000333817, 0.000156063, 0, 7
    200, 1.58933e-05, 8.89816e-06, 0, 7
    Both residuals have converged
    297, 9.88991e-07, 5.57875e-07, 0, 7

Interpolating point values using AMI
zoneA point orientation (< 0), max: -0.894023, min: -1, nIncorrectPoints: 0/31103
Interpolating point values using AMI
FSI relative residual1 norm for interface 0: 1
FSI residual2 norm for interface 0: 1

Time = 5.88235e-05, iteration: 1
Current fsi under-relaxation factor (fixed): 0.001
Maximal accumulated displacement of interface 0: 0.00198066
Evolving fluid model: pimpleFluid
volume continuity errors : sum local = 6.92067e-15, global = 2.45978e-18
Courant Number mean: 0.0371906 max: 0.999295 velocity magnitude: 0.737557
PIMPLE: iteration 1
GAMG:  Solving for p, Initial residual = 0.891589, Final residual = 8.70122e-07, No Iterations 42
GAMG:  Solving for p, Initial residual = 0.00690861, Final residual = 8.39627e-07, No Iterations 12
time step continuity errors : sum local = 1.95105e-08, global = -7.59086e-11, cumulative = -7.59086e-11
GAMG:  Solving for p, Initial residual = 0.0489028, Final residual = 9.67288e-07, No Iterations 31
GAMG:  Solving for p, Initial residual = 0.00354314, Final residual = 7.74235e-07, No Iterations 10
time step continuity errors : sum local = 1.71419e-08, global = -6.98734e-11, cumulative = -1.45782e-10
GAMG:  Solving for p, Initial residual = 0.0515685, Final residual = 7.94962e-07, No Iterations 32
GAMG:  Solving for p, Initial residual = 0.00373174, Final residual = 7.94989e-07, No Iterations 10
time step continuity errors : sum local = 1.75528e-08, global = -7.27876e-11, cumulative = -2.1857e-10
Here, I see the solid solver working. I'm using a different solid model, so it stands to reason that the output is a bit different here. I also see the pressure output. BUT I see NO fluid CellMotion output and NO velocity solver output. Why not?!?!

Here are my relevant dictionaries:
Code:
solvers
{
    cellMotionU
    {
        solver                 GAMG;
        tolerance              1e-6;
        relTol                 0;
        minIter                3;
        maxIter                1000;
        smoother               GaussSeidel;
        nPreSweeps             0;
        nPostSweeps            2;
        nFinestSweeps          2;
        scaleCorrection        true;
        directSolveCoarsest    false;
        cacheAgglomeration     true;
        nCellsInCoarsestLevel  20;
        agglomerator           faceAreaPair;
        mergeLevels            1;
    }
    "(U|UFinal)"
    {
        solver           PBiCG;
        preconditioner   DILU;
        tolerance        1e-08;
        relTol           0;
	minIter          3;
    }
    "(p|pFinal|pcorr|pcorrFinal)"
    {
        solver                 GAMG;
        tolerance              1e-06;
        relTol                 0;
        minIter                3;
        maxIter                1000;
        smoother               GaussSeidel;
        nPreSweeps             0;
        nPostSweeps            2;
        cacheAgglomeration     true;
        nCellsInCoarsestLevel  700;
        agglomerator           faceAreaPair;
        mergeLevels            1;
    }
}

PIMPLE
{
    momentumPredictor         yes;

    nOuterCorrectors          20;
    nCorrectors               3; 
    nNonOrthogonalCorrectors  1; 

//    residualControl
//    {
//        U
//        {
//                relTol     0;
//                tolerance  1e-06;
//        }
//        p
//        {
//                relTol     0;
//                tolerance  1e-03;
//        }
//     }
}

relaxationFactors
{
    fields
    {
        p      0.5;
    }
    equations
    {
        U      0.7;
    }
}
Code:
ddtSchemes
// 0: Euler, 1: Crank-Nicolson; a value of 0.9 is a good compromise between accuracy and robustness
{
    default  CrankNicolson  0.9;
}

gradSchemes
// increase scalar with worse meshes (higher non-orthogonality)
{
    default                             cellMDLimited Gauss linear 0.3;
    //grad(U)                             leastSquares; //cellMDLimited Gauss linear 0.5;
    //grad(magSqr(U))                     leastSquares; //cellMDLimited Gauss linear 0.5;
    //grad(p)                             leastSquares; //cellMDLimited Gauss linear 0.5;
    //grad(nuEff)                         leastSquares; //cellMDLimited Gauss linear 0.5;
}

divSchemes
{
    default                        none;
    div(phi,U)                     Gauss linearUpwind grad(U); //Gauss limitedLinear 1.0; Gauss upwind;
    div((nuEff*dev2(T(grad(U)))))  Gauss linear;
}

laplacianSchemes
// increase scalar with better meshes (lower non-orthogonality)
{
    default                Gauss linear limited 0.7;
    //laplacian(nuEff,U)     Gauss limitedLinear phi 1.0 corrected;
    //laplacian(rAU,p)       Gauss limitedLinear phi 1.0 corrected;
}

interpolationSchemes
{
    default                                                    linear;
}

snGradSchemes
// increase scalar with better meshes (lower non-orthogonality)
{
    default  limited 0.7;
}

August 13, 2021, 13:07, #614
Mike Tree (treem22)
Quote:
Originally Posted by treem22 View Post
I'm seeing some strange log.solids4Foam behavior.
I was able to re-create the issue using the HronTurekFsi3/ras tutorial.

If I modify the tutorial so that it uses the linearGeometry solidModel (instead of the unsNonLinearGeometryTotalLagrangian solidModel), I can make the log.solids4foam file stop showing the velocity and mesh motion residuals. Of course, when I change the solidModel to linearGeometry I also have to change the mechanicalProperties type from neoHookeanElastic to linearElastic.

Through testing I have observed a few things:
  • The outer iterations still converge before reaching their maximum number of iterations.
  • The velocity is still changing from timestep to timestep.
  • Changing the momentumPredictor setting did nothing. I cannot tell whether the momentumPredictor step happens in every outer iteration; the outer iterations may converge without it simply because the first velocity solve is already low enough for the residualControl.

August 16, 2021, 09:14, #615
Philip Cardiff (bigphil)
Hi Mike,

Can you attach an example log?

Thanks,
Philip

August 16, 2021, 11:04, #616
Mike Tree (treem22)
Quote:
Originally Posted by bigphil View Post
Hi Mike,

Can you attach an example log?

Thanks,
Philip
Of course! I should have done this from the beginning. My apologies. I can't seem to attach the whole log, so here are its first few iterations:

Code:
/*---------------------------------------------------------------------------*\
| =========                 |                                                 |
| \\      /  F ield         | OpenFOAM: The Open Source CFD Toolbox           |
|  \\    /   O peration     | Version:  v1912                                 |
|   \\  /    A nd           | Website:  www.openfoam.com                      |
|    \\/     M anipulation  |                                                 |
\*---------------------------------------------------------------------------*/
Build  : v1912 OPENFOAM=1912
Arch   : "LSB;label=32;scalar=64"
Exec   : solids4Foam
Date   : Aug 13 2021
Time   : 10:17:03
Host   : corvidpost5.corvidtec.com
PID    : 58292
I/O    : uncollated
Case   : /beegfs/users/mtree/OpenFOAM/mtree-v1912/run/solids4foam/tutorials/fluidSolidInteraction/HronTurekFsi3/ras
nProcs : 1
trapFpe: Floating point exception trapping enabled (FOAM_SIGFPE).
fileModificationChecking : Monitoring run-time modified files using timeStampMaster (fileModificationSkew 10)
allowSystemOperations : Allowing user-supplied system call operations

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time

--> FOAM Warning : 
    From function static Foam::IOstreamOption::compressionType Foam::IOstreamOption::compressionEnum(const Foam::word&)
    in file db/IOstreams/IOstreams/IOstreamOption.C at line 88
    Unknown compression specifier 'uncompressed', assuming no compression
/*---------------------------------------------------------------------------*\
|    For further information on the solids4Foam toolbox implementations,      |
|    please see the following publications:                                   |
|                                                                             |
|    P. Cardiff, A Karac, P. De Jaeger, H. Jasak, J. Nagy, A. Ivankovic,      |
|    Z. Tukovic: An open-source finite volume toolbox for solid mechanics and |
|    fluid-solid interaction simulations. arXiv:1808.10736v2, 2018, available |
|    at https://arxiv.org/abs/1808.10736.                                     |
|                                                                             |
|    Z. Tukovic, A. Karac, P. Cardiff, H. Jasak, A. Ivankovic: OpenFOAM       |
|    finite volume solver for fluid-solid interaction.  Transactions of       |
|    Famena, 42 (3), pp. 1-31, 2018, 10.21278/TOF.42301.                      |
\*---------------------------------------------------------------------------*/

Selecting physicsModel fluidSolidInteraction

Selecting fluidSolidInterface method IQNILS

Selecting fluidModel pimpleFluid
Selecting dynamicFvMesh dynamicMotionSolverFvMesh
Selecting motion solver: velocityLaplacian
Selecting motion diffusion: quadratic
Selecting motion diffusion: inverseDistance
Selecting patchDistMethod meshWave
g field not found in constant directory: initialising to zero
Selecting incompressible transport model Newtonian
Selecting turbulence model type RAS
Selecting RAS turbulence model kOmegaSST

PIMPLE: max iterations = 5
    field U	: relTol 0, tolerance 1e-06

Constructing face velocity Uf

Selecting solidModel linearGeometry
Selecting dynamicFvMesh staticFvMesh
Creating solidTraction boundary condition
    limiter coefficient: 1
Creating fixedDisplacement boundary condition
Creating solidTraction boundary condition
    limiter coefficient: 1
Creating fixedDisplacement boundary condition
    under-relaxation method: fixed
Creating the mechanicalModel
Selecting mechanical law linearElastic
additionalMeshCorrection: false
Selecting interfaceToInterfaceMapping RBF
Creating pointDisp function object
    region = solid
    distance from specified point is 0.0075
--> FOAM Warning : 
    From function void Foam::timeControl::read(const Foam::dictionary&)
    in file db/functionObjects/timeControl/timeControl.C at line 127
    Reading "/beegfs/users/mtree/OpenFOAM/mtree-v1912/run/solids4foam/tutorials/fluidSolidInteraction/HronTurekFsi3/ras/system/controlDict.functions.forces"
    Using deprecated 'outputControl'
    Please use 'writeControl' with 'writeInterval'
    This outputControl is deemed to be 42 months old.

--> FOAM IOWarning :
    Found [v1612] 'functionObjectLibs' entry instead of 'libs' in dictionary "/beegfs/users/mtree/OpenFOAM/mtree-v1912/run/solids4foam/tutorials/fluidSolidInteraction/HronTurekFsi3/ras/system/controlDict.functions.forces" 

    This keyword is deemed to be 36 months old.

forces forces:
    p: p
    U: U
    rho: rhoInf
    Freestream density (rhoInf) set to 100
    Not including porosity effects

Time = 0.001

Setting traction on solid interfaces
Interpolating face values using RBF
Create RBF interpolator from plate to plate
    face interpolation error: 6.96921e-12
Total force on fluid interface 0: (-0.0680452 -2.09831e-09 7.05957e-24)
Total force on solid interface 0: (0.0680233 2.10931e-09 -7.14464e-24)

Evolving solid solver
Solving the momentum equation for DD
setCellDisplacements: reading cellDisplacements
    Corr, res, relRes, matRes, iters
    Both residuals have converged
    2, 0, 0, 0, 0

Interpolating point values using RBF
Create RBF interpolator from plate to plate
    point interpolation error: 2.18637e-11
Interpolating point values using RBF
FSI relative residual1 norm for interface 0: 0
FSI residual2 norm for interface 0: 0

Time = 0.001, iteration: 1
Modes before clean-up (plate): 0, modes after clean-up (plate): 0
Current fsi under-relaxation factor (plate): 0.05
Maximal accumulated displacement of interface 0: 0
Evolving fluid model: pimpleFluid
volume continuity errors : sum local = 0, global = 0
Courant Number mean: 0.0858922 max: 1.3495 velocity magnitude: 3.17842
PIMPLE: iteration 1
GAMG:  Solving for p, Initial residual = 1, Final residual = 8.84293e-07, No Iterations 60
time step continuity errors : sum local = 7.5779e-10, global = -9.93387e-11, cumulative = -9.93387e-11
DILUPBiCG:  Solving for omega, Initial residual = 0.00746969, Final residual = 1.1339e-07, No Iterations 4
DILUPBiCG:  Solving for k, Initial residual = 1, Final residual = 1.9442e-07, No Iterations 6
PIMPLE: iteration 2
GAMG:  Solving for p, Initial residual = 0.69127, Final residual = 8.58652e-07, No Iterations 36
time step continuity errors : sum local = 3.53597e-09, global = 1.48623e-10, cumulative = 4.9284e-11
DILUPBiCG:  Solving for omega, Initial residual = 0.00317216, Final residual = 7.04253e-07, No Iterations 2
DILUPBiCG:  Solving for k, Initial residual = 0.0256635, Final residual = 5.57283e-07, No Iterations 3
PIMPLE: iteration 3
GAMG:  Solving for p, Initial residual = 0.553591, Final residual = 8.68691e-07, No Iterations 33
time step continuity errors : sum local = 2.18761e-09, global = -1.27332e-10, cumulative = -7.80484e-11
DILUPBiCG:  Solving for omega, Initial residual = 0.00145954, Final residual = 5.553e-07, No Iterations 2
DILUPBiCG:  Solving for k, Initial residual = 0.00607154, Final residual = 1.21916e-07, No Iterations 3
PIMPLE: iteration 4
GAMG:  Solving for p, Initial residual = 0.354838, Final residual = 6.29328e-07, No Iterations 31
time step continuity errors : sum local = 1.40039e-09, global = 5.00682e-11, cumulative = -2.79801e-11
DILUPBiCG:  Solving for omega, Initial residual = 0.000489358, Final residual = 7.22081e-08, No Iterations 2
DILUPBiCG:  Solving for k, Initial residual = 0.00292104, Final residual = 8.31175e-08, No Iterations 3
PIMPLE: iteration 5
GAMG:  Solving for p, Initial residual = 0.21348, Final residual = 8.30932e-07, No Iterations 28
time step continuity errors : sum local = 1.67403e-09, global = -8.97173e-11, cumulative = -1.17697e-10
DILUPBiCG:  Solving for omega, Initial residual = 0.000218517, Final residual = 2.22571e-07, No Iterations 2
DILUPBiCG:  Solving for k, Initial residual = 0.000900964, Final residual = 3.65141e-07, No Iterations 2
PIMPLE: not converged within 5 iterations
Setting traction on solid interfaces
Interpolating face values using RBF
Total force on fluid interface 0: (-13.1233 4.9976 6.35845e-24)
Total force on solid interface 0: (13.2806 -4.99607 -6.30936e-24)

Evolving solid solver
Solving the momentum equation for DD
    Corr, res, relRes, matRes, iters
    Both residuals have converged
    2, 0, 0, 0, 0

Interpolating point values using RBF
Interpolating point values using RBF
FSI relative residual1 norm for interface 0: 0
FSI residual2 norm for interface 0: 0
ExecutionTime = 0.71 s  ClockTime = 1 s

forces forces write:
    Sum of forces
        Total    : (1.31233 -0.49976 -6.35845e-25)
        Pressure : (1.30557 -0.49976 0)
        Viscous  : (0.00675602 -4.63457e-07 -6.35845e-25)
    Sum of moments
        Total    : (0.0037482 0.00984245 -0.0933525)
        Pressure : (0.0037482 0.00979178 -0.0926768)
        Viscous  : (3.47593e-09 5.06702e-05 -0.000675656)


Time = 0.002

Setting traction on solid interfaces
Interpolating face values using RBF
Total force on fluid interface 0: (-13.1233 4.9976 6.35845e-24)
Total force on solid interface 0: (13.2806 -4.99607 -6.30936e-24)

Evolving solid solver
Solving the momentum equation for DD
    Corr, res, relRes, matRes, iters
    Both residuals have converged
    2, 0, 0, 0, 0

Interpolating point values using RBF
Interpolating point values using RBF
FSI relative residual1 norm for interface 0: 0
FSI residual2 norm for interface 0: 0

Time = 0.002, iteration: 1
Modes before clean-up (plate): 0, modes after clean-up (plate): 0
Current fsi under-relaxation factor (plate): 0.05
Maximal accumulated displacement of interface 0: 0
Evolving fluid model: pimpleFluid
volume continuity errors : sum local = 0, global = 0
Courant Number mean: 0.0858303 max: 1.1739 velocity magnitude: 6.2833
PIMPLE: iteration 1
GAMG:  Solving for p, Initial residual = 0.541293, Final residual = 7.79996e-07, No Iterations 55
time step continuity errors : sum local = 1.13626e-09, global = 1.16336e-10, cumulative = -1.36125e-12
DILUPBiCG:  Solving for omega, Initial residual = 0.00288384, Final residual = 3.66312e-07, No Iterations 2
DILUPBiCG:  Solving for k, Initial residual = 0.145397, Final residual = 2.5092e-07, No Iterations 3
PIMPLE: iteration 2
GAMG:  Solving for p, Initial residual = 0.548259, Final residual = 8.18762e-07, No Iterations 36
time step continuity errors : sum local = 1.00222e-09, global = -4.06895e-11, cumulative = -4.20508e-11
DILUPBiCG:  Solving for omega, Initial residual = 0.00051627, Final residual = 6.80607e-08, No Iterations 2
DILUPBiCG:  Solving for k, Initial residual = 0.00253334, Final residual = 2.04936e-07, No Iterations 2
PIMPLE: iteration 3
GAMG:  Solving for p, Initial residual = 0.301428, Final residual = 9.80541e-07, No Iterations 37
time step continuity errors : sum local = 8.87878e-10, global = 1.70477e-10, cumulative = 1.28426e-10
DILUPBiCG:  Solving for omega, Initial residual = 0.000147071, Final residual = 2.64034e-08, No Iterations 2
DILUPBiCG:  Solving for k, Initial residual = 0.000755549, Final residual = 6.44334e-09, No Iterations 3
PIMPLE: iteration 4
GAMG:  Solving for p, Initial residual = 0.133918, Final residual = 7.73202e-07, No Iterations 30
time step continuity errors : sum local = 5.95032e-10, global = 6.64192e-11, cumulative = 1.94845e-10
DILUPBiCG:  Solving for omega, Initial residual = 7.9708e-05, Final residual = 2.34647e-08, No Iterations 2
DILUPBiCG:  Solving for k, Initial residual = 0.000156515, Final residual = 6.49238e-07, No Iterations 1
PIMPLE: iteration 5
GAMG:  Solving for p, Initial residual = 0.0557479, Final residual = 9.2336e-07, No Iterations 21
time step continuity errors : sum local = 6.8105e-10, global = -1.32359e-10, cumulative = 6.24855e-11
DILUPBiCG:  Solving for omega, Initial residual = 4.42068e-05, Final residual = 1.57016e-08, No Iterations 2
DILUPBiCG:  Solving for k, Initial residual = 7.48789e-05, Final residual = 1.42411e-08, No Iterations 2
PIMPLE: not converged within 5 iterations
Setting traction on solid interfaces
Interpolating face values using RBF
Total force on fluid interface 0: (3.35572 -2.47967 2.40853e-24)
Total force on solid interface 0: (-3.39701 2.47888 -2.49446e-24)

Evolving solid solver
Solving the momentum equation for DD
    Corr, res, relRes, matRes, iters
    Both residuals have converged
    2, 0, 0, 0, 0

August 16, 2021, 19:42, #617
RobinFsi3dTube in foam-extend-4.0
Mike Tree (treem22)
In response to the log file issue that showed up using solids4Foam and OFv1912, I decided to compile foam-extend-4.0 and try my luck with solids4Foam there as well. In particular, I'm interested in the Robin boundary condition. My application is FSI of the common carotid bifurcation, so I have the classic added-mass problem that arises when the fluid and solid densities are relatively close in value. I successfully compiled foam-extend-4.0 and solids4Foam, and I successfully ran the RobinFsi3dTube tutorial. Then I swapped out the tutorial mesh for my own mesh and modified the boundary conditions to match my boundary names and to account for the fact that I have a few more boundaries in play (1 inlet, 2 outlets instead of 1 inlet, 1 outlet).

Everything was going fine until the first iteration tried to actually solve the pressure equation. Then along came a segmentation fault.

So I recompiled solids4Foam using debug compiler flags (-g and -O0, GNU 5.4 compiler), and I ran the GNU debugger to find more information.
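For anyone wanting to reproduce this kind of backtrace, a rough outline of a debug build and debugger session is sketched below; it uses the stock WM_COMPILE_OPTION=Debug switch rather than the hand-added -g -O0 flags used here, and the paths are placeholders.
Code:
#!/bin/bash
# Sketch: rebuild solids4foam with debug symbols, then run the solver under gdb.

export WM_COMPILE_OPTION=Debug            # request a debug build
source ~/foam/foam-extend-4.0/etc/bashrc  # re-source so the build flags pick it up
cd ~/solids4foam && ./Allwmake            # recompile the toolkit (assumed source path)

# Run the failing case under the debugger
cd /path/to/failing/case
gdb "$(which solids4Foam)"
# (gdb) run
# (gdb) where     # print the backtrace once the segfault is hit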

Here's what I found:
Code:
GNU gdb (GDB) Red Hat Enterprise Linux 7.6.1-115.el7
Copyright (C) 2013 Free Software Foundation, Inc.
License GPLv3+: GNU GPL version 3 or later <http://gnu.org/licenses/gpl.html>
This is free software: you are free to change and redistribute it.
There is NO WARRANTY, to the extent permitted by law.  Type "show copying"
and "show warranty" for details.
This GDB was configured as "x86_64-redhat-linux-gnu".
For bug reporting instructions, please see:
<http://www.gnu.org/software/gdb/bugs/>...
Reading symbols from /beegfs/users/mtree/foam/mtree-4.0/applications/bin/linux64GccDPOpt/solids4Foam...done.
(gdb) run
Starting program: /beegfs/users/mtree/foam/mtree-4.0/applications/bin/linux64GccDPOpt/solids4Foam 
warning: File "/home/shelf1/Software/gcc/5.4.0/lib64/libstdc++.so.6.0.21-gdb.py" auto-loading has been declined by your `auto-load safe-path' set to "$debugdir:$datadir/auto-load:/usr/bin/mono-gdb.py".
To enable execution of this file add
	add-auto-load-safe-path /home/shelf1/Software/gcc/5.4.0/lib64/libstdc++.so.6.0.21-gdb.py
line to your configuration file "/beegfs/users/mtree/.gdbinit".
To completely disable this security protection add
	set auto-load safe-path /
line to your configuration file "/beegfs/users/mtree/.gdbinit".
For more information about this security protection see the
"Auto-loading safe path" section in the GDB manual.  E.g., run from the shell:
	info "(gdb)Auto-loading safe path"
[Thread debugging using libthread_db enabled]
Using host libthread_db library "/lib64/libthread_db.so.1".
/*---------------------------------------------------------------------------*\
| =========                 |                                                 |
| \\      /  F ield         | foam-extend: Open Source CFD                    |
|  \\    /   O peration     | Version:     4.0                                |
|   \\  /    A nd           | Web:         http://www.foam-extend.org         |
|    \\/     M anipulation  | For copyright notice see file Copyright         |
\*---------------------------------------------------------------------------*/
Build    : 4.0-268bb07d15d8
Exec     : /beegfs/users/mtree/foam/mtree-4.0/applications/bin/linux64GccDPOpt/solids4Foam
Date     : Aug 16 2021
Time     : 18:12:21
Host     : corvidpost5.corvidtec.com
PID      : 49378
CtrlDict : "/beegfs/users/mtree/foam/mtree-4.0/run/uncp/system/controlDict"
Case     : /beegfs/users/mtree/foam/mtree-4.0/run/uncp
nProcs   : 1
SigFpe   : Enabling floating point exception trapping (FOAM_SIGFPE).

// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
Create time

/*---------------------------------------------------------------------------*\
|    For further information on the solids4Foam toolbox implementations,      |
|    please see the following publications:                                   |
|                                                                             |
|    P. Cardiff, A Karac, P. De Jaeger, H. Jasak, J. Nagy, A. Ivankovic,      |
|    Z. Tukovic: An open-source finite volume toolbox for solid mechanics and |
|    fluid-solid interaction simulations. arXiv:1808.10736v2, 2018, available |
|    at https://arxiv.org/abs/1808.10736.                                     |
|                                                                             |
|    Z. Tukovic, A. Karac, P. Cardiff, H. Jasak, A. Ivankovic: OpenFOAM       |
|    finite volume solver for fluid-solid interaction.  Transactions of       |
|    Famena, 42 (3), pp. 1-31, 2018, 10.21278/TOF.42301.                      |
\*---------------------------------------------------------------------------*/

Selecting physicsModel fluidSolidInteraction

Selecting fluidSolidInterface method Aitken

Selecting fluidModel unsIcoFluid
Selecting dynamicFvMesh dynamicMotionSolverFvMesh
Selecting motion solver: velocityLaplacian
Selecting motion diffusion: quadratic
Selecting motion diffusion: inverseDistance
g field not found in constant directory: initialising to zero
Selecting solidModel nonLinearGeometryUpdatedLagrangian
Selecting dynamicFvMesh staticFvMesh
Creating fixedDisplacement boundary condition
Creating solidTraction boundary condition
    limiter coefficient: 1
Creating solidTraction boundary condition
    limiter coefficient: 1
Creating fixedDisplacement boundary condition
Creating fixedDisplacement boundary condition
    under-relaxation method: fixed
Creating the mechanicalModel
Selecting mechanical law StVenantKirchhoffElastic
additionalMeshCorrection: false
Selecting interfaceToInterfaceMapping GGI
Time = 1e-06

Setting traction on solid interfaces
Interpolating face values using GGI
Create GGI zone-to-zone interpolator
interface-to-interface face error: 0.0195445
calcMasterPointAddressing() const
Extended GGI, master point distance, max: 2.61897e-05, avg: -1.0627e-06, min: -3.06322e-05
interface-to-interface point error: 3.06322e-05
Number of uncovered master faces: 0
Number of uncovered slave faces: 0

Total force on fluid interface 0: (0 0 0)
Total force on solid interface 0: (0 0 0)

Evolving solid solver
Solving the updated Lagrangian form of the momentum equation for DD
setCellDisplacements: reading cellDisplacements
    Corr, res, relRes, matRes, iters
    Both residuals have converged
    2, 0, 0, 0, 0

Interpolating point values using GGI
Interpolating point values using GGI
FSI relative residual1 norm for interface 0: 0
FSI residual2 norm for interface 0: 0

Time = 1e-06, iteration: 1
Current fsi under-relaxation factor (fixed): 0.1
Setting acceleration at fluid side of the interface
Interpolating face values using GGI
Maximal accumulated displacement of interface 0: 0
GAMG:  Solving for cellMotionUx, Initial residual = 0, Final residual = 0, No Iterations 1
GAMG:  Solving for cellMotionUy, Initial residual = 0, Final residual = 0, No Iterations 1
GAMG:  Solving for cellMotionUz, Initial residual = 0, Final residual = 0, No Iterations 1
GAMG:  Solving for cellMotionUx, Initial residual = 0, Final residual = 0, No Iterations 1
GAMG:  Solving for cellMotionUy, Initial residual = 0, Final residual = 0, No Iterations 1
GAMG:  Solving for cellMotionUz, Initial residual = 0, Final residual = 0, No Iterations 1
Evolving fluid model with consistent strategy: unsIcoFluid
Courant Number mean: 0 max: 0.00327424 velocity magnitude: 0.290041

PISO: Operating solver in PISO mode

DILUPBiCG:  Solving for Ux, Initial residual = 1, Final residual = 1.09066e-15, No Iterations 2
DILUPBiCG:  Solving for Uy, Initial residual = 1, Final residual = 2.68135e-15, No Iterations 2
DILUPBiCG:  Solving for Uz, Initial residual = 1, Final residual = 7.04321e-16, No Iterations 2

Program received signal SIGSEGV, Segmentation fault.
0x00002aaaaad06730 in Foam::List<double>::List(Foam::List<double> const&) ()
   from /home/shelf1/motorsports/software/OpenSourceCFD/source/OpenFOAM/foam-extend-4.0/lib/linux64GccDPOpt/libincompressibleTurbulenceModel.so
Missing separate debuginfos, use: debuginfo-install glibc-2.17-292.el7.x86_64 numactl-libs-2.0.12-3.el7_7.1.x86_64 sssd-client-1.16.4-21.el7_7.3.x86_64 zlib-1.2.7-18.el7.x86_64
(gdb) where
#0  0x00002aaaaad06730 in Foam::List<double>::List(Foam::List<double> const&) ()
   from /home/shelf1/motorsports/software/OpenSourceCFD/source/OpenFOAM/foam-extend-4.0/lib/linux64GccDPOpt/libincompressibleTurbulenceModel.so
#1  0x00002aaaac4af6f7 in Foam::Field<double>::Field (this=0x7fffffff5c80, f=...)
    at /home/shelf1/motorsports/software/OpenSourceCFD/source/OpenFOAM/foam-extend-4.0/src/foam/lnInclude/Field.C:149
#2  0x00002aaaac7fc327 in Foam::elasticWallPressureFvPatchScalarField::updateCoeffs (this=0x6a3fd80)
    at fluidModels/fvPatchFields/elasticWallPressure/elasticWallPressureFvPatchScalarField.C:159
#3  0x00002aaaab034c7e in Foam::GeometricField<double, Foam::fvPatchField, Foam::volMesh>::GeometricBoundaryField::updateCoeffs() ()
   from /home/shelf1/motorsports/software/OpenSourceCFD/source/OpenFOAM/foam-extend-4.0/lib/linux64GccDPOpt/libincompressibleRASModels.so
#4  0x00002aaaab039196 in Foam::fvMatrix<double>::fvMatrix(Foam::GeometricField<double, Foam::fvPatchField, Foam::volMesh> const&, Foam::dimensionSet const&) ()
   from /home/shelf1/motorsports/software/OpenSourceCFD/source/OpenFOAM/foam-extend-4.0/lib/linux64GccDPOpt/libincompressibleRASModels.so
#5  0x00002aaaae277db3 in Foam::fv::gaussLaplacianScheme<double, double>::fvmLaplacianUncorrected(Foam::GeometricField<double, Foam::fvsPatchField, Foam::surfaceMesh> const&, Foam::GeometricField<double, Foam::fvPatchField, Foam::volMesh> const&) ()
   from /home/shelf1/motorsports/software/OpenSourceCFD/source/OpenFOAM/foam-extend-4.0/lib/linux64GccDPOpt/libfiniteVolume.so
#6  0x00002aaaae2714e5 in Foam::fv::gaussLaplacianScheme<double, double>::fvmLaplacian(Foam::GeometricField<double, Foam::fvsPatchField, Foam::surfaceMesh> const&, Foam::GeometricField<double, Foam::fvPatchField, Foam::volMesh> const&) ()
   from /home/shelf1/motorsports/software/OpenSourceCFD/source/OpenFOAM/foam-extend-4.0/lib/linux64GccDPOpt/libfiniteVolume.so
#7  0x00002aaaac5354c8 in Foam::fvm::laplacian<double, double> (gamma=..., vf=..., name=...)
    at /home/shelf1/motorsports/software/OpenSourceCFD/source/OpenFOAM/foam-extend-4.0/src/finiteVolume/lnInclude/fvmLaplacian.C:282
#8  0x00002aaaac62de53 in Foam::fluidModels::unsIcoFluid::evolveConsistent (this=0x85caa0) at fluidModels/unsIcoFluid/unsIcoFluid.C:489
#9  0x00002aaaac630875 in Foam::fluidModels::unsIcoFluid::evolve (this=0x85caa0) at fluidModels/unsIcoFluid/unsIcoFluid.C:718
#10 0x00002aaaac687ca0 in Foam::fluidSolidInterfaces::AitkenCouplingInterface::evolve (this=0x84c070)
    at fluidSolidInterfaces/AitkenCouplingInterface/AitkenCouplingInterface.C:100
#11 0x0000000000403497 in main (argc=1, argv=0x7fffffff9418) at solids4Foam.C:62
As far as I can determine there is some issue surrounding the rhoSolid scalar field from the fsi solid boundary field. The case is run laminar, so I'm assuming the turbulence routines called are just going to return very basic info. I'm a bit lost after this. I'm not too C++ savvy, and haven't spent much time looking at OpenFOAM or solids4Foam source code. Anyone have any tips on what I'm doing wrong?

I can't seem to attach my whole case directory, so here's a link to it: https://drive.google.com/file/d/1rZ2...ew?usp=sharing

August 18, 2021, 19:16, #618
Fix for RobinFsi3dTube Issue
Mike Tree (treem22)
I spent some more time learning C++ and the solids4Foam code, and I found the cause of the seg fault. I even have a fix for it.

The RobinFsi3dTube tutorial has 5 fluid boundary fields, and the fluid interface boundary field (named "wall") is the last one; its index is 4. There are 6 solid boundary fields, and the solid interface boundary field (named "inner-wall") is the 5th one; its index when listed is ALSO 4!

This is a very fortunate coincidence, because the updateCoeffs() member function of the elasticWallPressureFvPatchScalarField class (src/solids4FoamModels/fluidModels/fvPatchFields/elasticWallPressure/elasticWallPressureFvPatchScalarField.C) creates a patchID label on line 149. That patchID is taken from the index of the boundary field that uses the elasticWallPressure boundary condition type, i.e. it is defined on the fluid interface, but the same index is then used to access the solid interface density and stiffness to compute the p-wave propagation speed and virtual thickness. If the boundary field index of the fluid interface boundary is not the same as that of the solid interface boundary, this can pull the wrong material properties. In my case, because my mesh has more fluid boundaries than solid boundaries, it resulted in a request for an index beyond the size of the list, which caused the seg fault.

The solution is to redefine the patchID variable to actually be the solid interface boundary field index. I switched to this and it worked:
Code:
label patchID = fsi.solidPatchIndices()[0];
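After editing the source like that, the library has to be recompiled before re-running the case; a minimal sketch (the source path is an assumption):
Code:
# Rebuild the solids4FoamModels shared library after the source change
cd ~/solids4foam/src/solids4FoamModels
wmake libso

# or simply rebuild everything from the top of the repository:
# cd ~/solids4foam && ./Allwmake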

August 19, 2021, 17:36, #619
Philip Cardiff (bigphil)
Quote:
Originally Posted by treem22 View Post
I spent some more time learning C++ and the solids4Foam code, and I found the cause of the seg fault. I even have a fix for it.

The RobinFsi3dTube tutorial has 5 fluid boundary fields, and the fluid interface boundary field (named "wall") is the last one; its index is 4. There are 6 solid boundary fields, and the solid interface boundary field (named "inner-wall") is the 5th one; its index when listed is ALSO 4!

This is a very fortunate coincidence, because the updateCoeffs() member function of the elasticWallPressureFvPatchScalarField class (src/solids4FoamModels/fluidModels/fvPatchFields/elasticWallPressure/elasticWallPressureFvPatchScalarField.C) creates a patchID label on line 149. That patchID is taken from the index of the boundary field that uses the elasticWallPressure boundary condition type, i.e. it is defined on the fluid interface, but the same index is then used to access the solid interface density and stiffness to compute the p-wave propagation speed and virtual thickness. If the boundary field index of the fluid interface boundary is not the same as that of the solid interface boundary, this can pull the wrong material properties. In my case, because my mesh has more fluid boundaries than solid boundaries, it resulted in a request for an index beyond the size of the list, which caused the seg fault.

The solution is to redefine the patchID variable to actually be the solid interface boundary field index. I switched to this and it worked:
Code:
label patchID = fsi.solidPatchIndices()[0];
See here: https://bitbucket.org/philip_cardiff...chscalarfieldc

August 21, 2021, 02:04, #620
Installation of solids4foam on OpenFOAM v2012
Ashutosh (night-hawk)
Hi everyone,
I was trying to install solids4foam in OpenFOAM v2012. When I execute the Allwmake script, it asks to replace certain files. I just want to ask: if I replace those, will it break the original OpenFOAM functionality?
