
terminate called after throwing an instance of 'int'



April 7, 2014, 08:44   #1
New Member
Becky (b614910)
Join Date: Mar 2014
Posts: 6
Hi,

I have trouble running some test cases:
navierstokes/flatplate
rans/flatplate

They terminated with an error like:
Code:
Traceback (most recent call last):
  File "/home/beck394/su2/mvapich/SU2/bin/parallel_computation.py", line 111, in <module>
    main()
  File "/home/beck394/su2/mvapich/SU2/bin/parallel_computation.py", line 56, in main
    options.divide_grid  )
  File "/home/beck394/su2/mvapich/SU2/bin/parallel_computation.py", line 88, in parallel_computation
    info = SU2.run.CFD(config)
  File "/home/beck394/su2/mvapich/SU2/bin/SU2/run/interface.py", line 93, in CFD
    run_command( the_Command )
  File "/home/beck394/su2/mvapich/SU2/bin/SU2/run/interface.py", line 279, in run_command
    raise Exception , message
Exception: Path = /home/beck394/su2/TestCases/0_tutorial/navierstokes/flatplate/,
Command = mpirun -np 3 -machinefile hostfile /home/beck394/su2/mvapich/SU2/bin/SU2_CFD config_CFD.cfg
SU2 process returned error '134'
CSysSolve::modGramSchmidt: w[i+1] = NaN
CSysSolve::modGramSchmidt: w[i+1] = NaN
CSysSolve::modGramSchmidt: w[i+1] = NaN
terminate called after throwing an instance of 'int'
terminate called after throwing an instance of 'int'
terminate called after throwing an instance of 'int'
[ccteam07:mpi_rank_0][error_sighandler] Caught error: Aborted (signal 6)
[ccteam07:mpi_rank_1][error_sighandler] Caught error: Aborted (signal 6)
[ccteam07:mpi_rank_2][error_sighandler] Caught error: Aborted (signal 6)

Also the test cases:
free_surface/channel
spectral_method

They terminated with an error like:
Code:
Traceback (most recent call last):
  File "/home/beck394/su2/mvapich/SU2/bin/parallel_computation.py", line 111, in <module>
    main()
  File "/home/beck394/su2/mvapich/SU2/bin/parallel_computation.py", line 56, in main
    options.divide_grid  )
  File "/home/beck394/su2/mvapich/SU2/bin/parallel_computation.py", line 88, in parallel_computation
    info = SU2.run.CFD(config)
  File "/home/beck394/su2/mvapich/SU2/bin/SU2/run/interface.py", line 93, in CFD
    run_command( the_Command )
  File "/home/beck394/su2/mvapich/SU2/bin/SU2/run/interface.py", line 279, in run_command
    raise Exception , message
Exception: Path = /home/beck394/su2/TestCases/0_tutorial/free_surface/channel/,
Command = mpirun -np 3 -machinefile hostfile /home/beck394/su2/mvapich/SU2/bin/SU2_CFD config_CFD.cfg
SU2 process returned error '6'
CSysSolve::modGramSchmidt: dotProd(w[i+1],w[i+1]) < 0.0
CSysSolve::modGramSchmidt: dotProd(w[i+1],w[i+1]) < 0.0
CSysSolve::modGramSchmidt: dotProd(w[i+1],w[i+1]) < 0.0
terminate called after throwing an instance of 'int'
terminate called after throwing an instance of 'int'
terminate called after throwing an instance of 'int'
[ccteam08:mpi_rank_0][error_sighandler] Caught error: Aborted (signal 6)
[ccteam08:mpi_rank_1][error_sighandler] Caught error: Aborted (signal 6)
[ccteam08:mpi_rank_2][error_sighandler] Caught error: Aborted (signal 6)

I ran all the test cases with parallel_computation.py.
Some of them only crash with a specific number of processes; some of them cannot be run at all.

Did I run the test cases with the wrong Python script?
Or is there another reason that causes this kind of error?
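For reference, this is roughly how I launch each case (flag spellings from memory, so they may not be exact; the script's --help lists the real ones):
Code:
# Sketch of how I run a test case with the parallel wrapper script.
# The -f/-n flag names are an assumption; check parallel_computation.py --help.
parallel_computation.py -f <case_config>.cfg -n 3
# According to the traceback above, this ends up calling something like:
# mpirun -np 3 -machinefile hostfile $SU2_RUN/SU2_CFD config_CFD.cfg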


Thanks for your help!


Becky

April 7, 2014, 17:49   #2
Senior Member
Pay D. (pdp.aero)
Join Date: Aug 2011
Posts: 166
Which version are you running?
Can you run in serial without any problem?
Did you export your environment variables after the installation?
I mean:
export SU2_RUN="/usr/local/bin"
export SU2_HOME="/home/your/trunk/path"

April 7, 2014, 18:02   #3
New Member
Andrew Wendorff (awendorff)
Join Date: Apr 2014
Posts: 28
This seems to be a problem that was occurring in version 2.0. Check which version you are running and, if possible, update to the most current version from GitHub.
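If it helps, getting the latest source is just a clone of the SU2 repository (a sketch; I am assuming the main su2code repository on GitHub):
Code:
# Assumption: the main SU2 repository on GitHub.
git clone https://github.com/su2code/SU2.git
cd SU2
git describe --tags   # shows which release/tag you are on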

Andrew

April 8, 2014, 11:06   #4
New Member
Becky (b614910)
Join Date: Mar 2014
Posts: 6
Thanks for your reply, Payam & Andrew.

I installed version 3.0.0 "eagle", and I have exported the environment variables SU2_RUN & SU2_HOME as well.
But the error is still there.

Of the test cases I mentioned before, only free_surface/channel fails in both serial (SU2_CFD) and parallel.
The other cases run in serial without the error.

In parallel, however, the error occurs with specific numbers of processes.
For example, the test case navierstokes/flatplate finished without any problem with np=15, but crashed with np=14.


I am not sure whether the error could be related to the mesh file I used?


Becky

April 8, 2014, 20:51   #5
Senior Member
Pay D. (pdp.aero)
Join Date: Aug 2011
Posts: 166
Quote:
Originally Posted by b614910
You are welcome.

The channel test case should work properly in serial, like the other test cases. Did you check your config file against the tutorial (su2_manual.pdf)? In some cases the settings in a test case's config file don't completely match that specific test case, so it may not converge at first; you need to compare the numerical methods set in your config file with the tutorial to make sure they are right for your simulation. Below is the setup from my own run of the channel test case, which you may find useful. I have also attached my config file.

Flow condition:

 Inlet Stagnation Temperature = 288.6 K
 Inlet Stagnation Pressure = 102010.0 N/m2
 Inlet Flow Direction, unit vector (x,y,z) = (1.0, 0.0, 0.0)
 Outlet Static Pressure = 101300.0 N/m2
 Resulting Mach number = 0.1

Problem definition:

Compressible Euler's equations.
Mach number: 0.1.
Angle of attack (AoA): 0 deg, and angle of sideslip (AoS): 0 deg.
Surface(s) where the force coefficients are to be evaluated: upper_wall, lower_wall.
The reference length/area (force coefficient) is 1.
The reference length (moment computation) is 1.
Reference origin (moment computation) is (0.25, 0, 0)

Spatial discretization:

Jameson-Schmidt-Turkel scheme for the flow inviscid terms.
JST viscous coefficients (1st, 2nd & 4th): 0.15, 0.5, 0.02.
The method includes a grid stretching correction (p = 0.3)
Piecewise constant integration of the flow source terms.
Gradient Computation using weighted Least-Squares method.

Time discretization:

Local time stepping (steady state simulation)
Euler implicit method for the flow equations.
A LU - symmetric Gauss-Seidel iteration is used for solving the linear system.
W Multigrid Cycle, with 3 multigrid levels.
Reduction of the CFL coefficient in the coarse levels: 0.9.
Max. number of children in the agglomeration stage: 250.
Max. length of an agglom. elem. (compared with the domain): 0.1.
Damping factor for the residual restriction: 0.9.
Damping factor for the correction prolongation: 0.9.
CFL ramp definition. factor: 1.2, every 100 iterations, with a limit of 1.
Multigrid Level: 0 1 2 3
Courant-Friedrichs-Lewy number: 5 4.5 4.05 3.65
MG PreSmooth coefficients: 1 2 3 3
MG PostSmooth coefficients: 0 0 0 0
MG CorrecSmooth coefficients: 0 0 0 0

Convergence criteria:

Maximum number of iterations: 999999.
Reduce the density residual 6 orders of magnitude.
The minimum bound for the density residual is 10e-12
Start convergence criteria at iteration 10.

Numerical Methods:

Numerical method for spatial gradients (GREEN_GAUSS, WEIGHTED_LEAST_SQUARES)
NUM_METHOD_GRAD= WEIGHTED_LEAST_SQUARES
Courant-Friedrichs-Lewy condition of the finest grid
CFL_NUMBER= 5.0
CFL ramp (factor, number of iterations, CFL limit)
CFL_RAMP= ( 1.2, 100, 1.0 )
Runge-Kutta alpha coefficients
RK_ALPHA_COEFF= ( 0.66667, 0.66667, 1.000000 )
Runge-Kutta beta coefficients
RK_BETA_COEFF= ( 1.00000, 0.00000, 0.00000 )
Number of total iterations
EXT_ITER= 999999
Linear solver for the implicit formulation (LU_SGS, SYM_GAUSS_SEIDEL, BCGSTAB)
LINEAR_SOLVER= LU_SGS
Min error of the linear solver for the implicit formulation
LINEAR_SOLVER_ERROR= 1E-6
Max number of iterations of the linear solver for the implicit formulation
LINEAR_SOLVER_ITER= 5

Multi-grid:

Multi-Grid Levels (0 = no multi-grid)
MGLEVEL= 3
 Multi-Grid Cycle (0 = V cycle, 1 = W Cycle)
MGCYCLE= 1
CFL reduction factor on the coarse levels
MG_CFL_REDUCTION= 0.9
Maximum number of children in the agglomeration stage
MAX_CHILDREN= 250
Maximum length of an agglomerated element (relative to the domain)
MAX_DIMENSION= 0.1
Multigrid pre-smoothing level
MG_PRE_SMOOTH= ( 1, 2, 3, 3 )
Multigrid post-smoothing level
MG_POST_SMOOTH= ( 0, 0, 0, 0)
Jacobi implicit smoothing of the correction
MG_CORRECTION_SMOOTH= ( 0, 0, 0, 0 )
Damping factor for the residual restriction
MG_DAMP_RESTRICTION= 0.9
Damping factor for the correction prolongation
MG_DAMP_PROLONGATION= 0.9
Full Multigrid (NO, YES)
FULLMG= NO
Start up iterations using the fine grid
START_UP_ITER= 0

Numerical Methods:

Convective numerical method (JST, LAX-FRIEDRICH, ROE-1ST_ORDER ,ROE-2ND_ORDER, AUSM-1ST_ORDER, AUSM-2ND_ORDER ,HLLC-1ST_ORDER, HLLC-2ND_ORDER)
CONV_NUM_METHOD_FLOW= JST
Slope limiter (NONE, VENKATAKRISHNAN, BARTH)
SLOPE_LIMITER_FLOW= NONE
Coefficient for the limiter (smooth regions)
LIMITER_COEFF= 0.3
1st, 2nd and 4th order artificial dissipation coefficients
AD_COEFF_FLOW= ( 0.15, 0.5, 0.02 )
Viscous numerical method (AVG_GRAD, AVG_GRAD_CORRECTED, GALERKIN)
VISC_NUM_METHOD_FLOW= AVG_GRAD_CORRECTED
Source term numerical method (PIECEWISE_CONSTANT)
SOUR_NUM_METHOD_FLOW= PIECEWISE_CONSTANT
Time discretization (RUNGE-KUTTA_EXPLICIT, EULER_IMPLICIT, EULER_EXPLICIT)
TIME_DISCRE_FLOW= EULER_IMPLICIT

What I take from your traceback is that you may have some problem with the MPI compiler; that is just my guess, but it would also explain why you are able to run the code in serial for all cases.

Following this further, which MPI compiler did you use when you configured the code?
--with-mpi=/usr/local/bin/(mpicxx?)

Did you compile the code with CGNS? And which version of METIS did you use when configuring the code?
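By that I mean a configure step roughly like this (a sketch only; the option spelling and paths are assumptions on my part, so check ./configure --help in your SU2 source tree first):
Code:
# Sketch: configure and build SU2 against an MPI compiler wrapper.
# Option names and paths are assumptions; verify with ./configure --help.
./configure --prefix=$HOME/su2/mvapich/SU2 --with-mpi=/usr/local/bin/mpicxx
make -j 4
make install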
Attached Files: inv_channel.txt (9.0 KB)

Last edited by pdp.aero; April 9, 2014 at 05:24.

April 9, 2014, 10:46   #6
New Member
Becky (b614910)
Join Date: Mar 2014
Posts: 6
Payam,

Actually, the test cases I used were downloaded from the website, and I didn't modify them :\
But I will still check my config file to see whether it uses the right methods; thanks for pointing this out!

I compiled SU2 with MVAPICH2 1.9.
I have tried METIS 5.0.3 and METIS 5.1.0, but the error didn't disappear.
And I didn't compile SU2 with CGNS; is it necessary?
(I learned from the tutorial that CGNS is for creating meshes, and since I don't think I will use it, I skipped installing it.)


Becky

April 9, 2014, 12:44   #7
Senior Member
Pay D. (pdp.aero)
Join Date: Aug 2011
Posts: 166
Quote:
Originally Posted by b614910
Personally, I go with METIS 4.0.3 for the domain decomposition, without any problem.
No, if you don't need CGNS for converting your grid format to the native SU2 format, you don't need it when configuring the code.
Here are the packages you may need for configuring the code in parallel; for each one I list the version I used.
- OpenMPI or MPICH (3.0.4) plus tcl/tk (8.6.0)
- Metis (4.0.3)
- Python (2.6.6)
- numpy (1.6) plus ATLAS, BLAS and LAPACK
- Scipy (0.11.0) plus nose (1.3.0)
If the code runs in serial without any problem but you are having trouble in parallel, try to configure it again, and make sure that all the packages needed for a parallel build are installed correctly.
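A quick way to sanity-check those prerequisites before reconfiguring (a sketch; it assumes the tools are already on your PATH):
Code:
# Check the parallel-build prerequisites (adjust for your system or cluster modules).
mpicxx --version       # or whichever C++ wrapper your MPI installation provides
mpirun --version
python --version
python -c "import numpy, scipy; print numpy.__version__, scipy.__version__"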

Last edited by pdp.aero; April 16, 2014 at 04:49.

April 9, 2014, 21:12   #8
New Member
Emily (Emily1412)
Join Date: Mar 2014
Posts: 15
Quote:
Originally Posted by pdp.aero
The same errors occur when I run SU2 with MPI. Do you mean the errors have something to do with the versions of the packages? I used mpiicpc and Python 2.7, and the other package versions are the same as yours, but I still get these errors. Sometimes I change the config file and the errors disappear; most of the time, however, the errors are still there even with a correct config file. It seems to be related to the number of processes (i.e. -p): with different np, it crashes after a different number of iterations. On top of that, the errors still occur when I run on a cluster (2 nodes).
Hoping for your reply. Thanks in advance!

April 10, 2014, 04:27   #9
Senior Member
Pay D. (pdp.aero)
Join Date: Aug 2011
Posts: 166
Quote:
Originally Posted by Emily1412
Hello Emily

Here is what I believe: if you manage to configure the code correctly in parallel, then its behaviour should not change from test case to test case, unless a specific issue has been flagged by the developers. Therefore, if the code works correctly in parallel for one test case, I believe it will work for the other test cases too.

If you are running into such a problem with the tutorial test cases right after your installation, you need to check the configuration settings you used for the parallel build, and make sure that the packages on your system are installed correctly before building the code.

Personally, I couldn't get it to work with Python 2.7, so I used 2.6.6 as the tutorial suggests.

Generally, if a solver works in parallel for a test case and converges, it will work for different numbers of processors too. You are probably having some problem with METIS for the domain decomposition, or with the MPI compiler. My suggestion is to rebuild the code and make sure that all the required packages are installed correctly. Don't forget to clean the build before rebuilding (python ./build_su2.py -o redhat -c).
Finally, set your environment variables permanently in your .bashrc (sudo gedit ~/.bashrc).
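For example (a sketch; the paths are placeholders, use your own install locations):
Code:
# Sketch: make the SU2 variables permanent, then verify in a fresh shell.
# Paths are placeholders; substitute your actual installation directories.
echo 'export SU2_RUN="/usr/local/bin"' >> ~/.bashrc
echo 'export SU2_HOME="/home/your/trunk/path"' >> ~/.bashrc
source ~/.bashrc
echo "$SU2_RUN  $SU2_HOME"
ls "$SU2_RUN"/SU2_CFD    # the solver binary should be found here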

April 10, 2014, 05:47   #10
New Member
Emily (Emily1412)
Join Date: Mar 2014
Posts: 15
Thank you so much for your advice! Could you please also help me with the errors that occur when I run Tutorial 2 on 2 nodes? Thanks in advance!
The details are as follows:
http://www.cfd-online.com/Forums/su2...n-2-nodes.html

July 27, 2014, 23:16   #11
Senior Member
Pay D. (pdp.aero)
Join Date: Aug 2011
Posts: 166
Hi,

Sorry for the delayed reply; I know it's quite late. However, if you still have the problem, or are simply interested in running SU2 in parallel, have a look here:

http://www.cfd-online.com/Forums/blo...-part-1-3.html

PDP
