|
parallel simulations - error message: "OPT_ITERATIONS: invalid option name" |
|
June 23, 2015, 01:27 |
parallel simulations - error message: "OPT_ITERATIONS: invalid option name"
|
#1 |
New Member
Muhamad Fakhrusy
Join Date: May 2015
Posts: 8
Rep Power: 11 |
Hi,
I am trying to run parallel simulations using parallel_computation.py, and I ran into this error:

Code:
c:\SU2parallel>"C:\python27\python" c:\su2parallel\su2-master\su2_py\parallel_computation.py -f inv_WING.cfg -n 12
OPT_ITERATIONS: invalid option name
OPT_ACCURACY: invalid option name
BOUND_DV: invalid option name
(these three lines are repeated 12 times in total, once per rank)

job aborted:
[ranks] message
[0-11] process exited without calling finalize

---- error analysis -----
[0-11] on DELL-PC
C:\SU2parallel\su2-master\su2_py\SU2_CFD ended prematurely and may have crashed. exit code 1
---- error analysis -----

SU2 process was terminated by signal '3'

I don't have those three options in my configuration file inv_WING.cfg. I tried adding the three options to the config file and still got the error. Please help me figure out what I did wrong with this simulation, thanks. I also tried making another cfg file but still ran into the same error, and I checked that those three options were already deleted. I have attached my cfg file in a zip (wing2.cfg): wing2.zip. Glad if you can help me.

=======================

Edit: I opened config.py and found this:

Code:
#hack - twl
if not data_dict.has_key('DV_VALUE_NEW'):
    data_dict['DV_VALUE_NEW'] = [0]
if not data_dict.has_key('DV_VALUE_OLD'):
    data_dict['DV_VALUE_OLD'] = [0]
if not data_dict.has_key('OPT_ITERATIONS'):
    data_dict['OPT_ITERATIONS'] = 100
if not data_dict.has_key('OPT_ACCURACY'):
    data_dict['OPT_ACCURACY'] = 1e-10
if not data_dict.has_key('BOUND_DV'):
    data_dict['BOUND_DV'] = 1e10

return data_dict

I don't really understand Python, so I don't know what that means, but I tried deleting those lines once and got another error:

Code:
c:\SU2parallel\SU2-master\SU2_PY>"c:\python27\python" parallel_computation.py -f C:\su2parallel\WING2.cfg -n 12
cstr=WING_1.su2
There is no geometry file (GetnZone))!
(this pair of lines is repeated several more times, once per rank)

job aborted:
[ranks] message
[0] terminated
[1] application aborted
aborting MPI_COMM_WORLD (comm=0x44000000), error 1, comm rank 1
(the same "application aborted" message follows for ranks 2, 4-8, 10 and 11; ranks 3 and 9 terminated)

---- error analysis -----
[1-2,4-8,10-11] on DELL-PC
C:\SU2parallel\su2-master\su2_py\SU2_CFD aborted the job. abort code 1
---- error analysis -----

Traceback (most recent call last):
  File "parallel_computation.py", line 110, in <module>
    main()
  File "parallel_computation.py", line 61, in main
    options.compute )
  File "parallel_computation.py", line 88, in parallel_computation
    info = SU2.run.CFD(config)
  File "c:\SU2parallel\SU2-master\SU2_PY\SU2\run\interface.py", line 88, in CFD
    run_command( the_Command )
  File "c:\SU2parallel\SU2-master\SU2_PY\SU2\run\interface.py", line 276, in run_command
    raise exception , message
SU2.EvaluationFailure: Path = c:\SU2parallel\SU2-master\SU2_PY\,
Command = mpiexec -n 12 C:\SU2parallel\su2-master\su2_py\SU2_CFD config_CFD.cfg
SU2 process returned error '1'

Last edited by v8areu; June 23, 2015 at 04:11.
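The "invalid option name" lines mean the SU2_CFD binary is being handed a config file containing keys it does not recognize. Until the scripts and the binary are brought back in sync (see the reply below), one stop-gap is to filter those keys out of a config file before passing it on. A minimal sketch only, assuming the file in question is config_CFD.cfg as in the command above and that the keys to drop are exactly the three reported; the helper name is made up for illustration:

Code:
# Sketch only: copy a config file while dropping options this SU2_CFD build
# rejects ("OPT_ITERATIONS: invalid option name", etc.). A workaround, not a fix.
REJECTED = ('OPT_ITERATIONS', 'OPT_ACCURACY', 'BOUND_DV')

def strip_rejected_options(cfg_in, cfg_out, rejected=REJECTED):
    with open(cfg_in) as src, open(cfg_out, 'w') as dst:
        for line in src:
            key = line.split('=', 1)[0].strip()
            if key not in rejected:
                dst.write(line)

# e.g. strip_rejected_options('config_CFD.cfg', 'config_CFD_stripped.cfg')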
|
July 16, 2015, 16:08 |
|
#2 |
Senior Member
Heather Kline
Join Date: Jun 2013
Posts: 309
Rep Power: 14 |
Thanks for your question.
The first thing to do is to try using one processor and the files from the Quickstart or the TestCases.

It looks like you are using a Python script that does not match your SU2_* executables. I say this because it is looking for _#.su2 meshes, which were automatically generated in a previous version. Given the repeated lines of output, it also looks as though the code was not compiled for parallel computation - installation for parallel is covered at https://github.com/su2code/SU2/wiki/Parallel-Build.

To check that the versions match, make sure that "which parallel_computation.py" points to the same bin/ directory as "which SU2_CFD"; this will be the directory where you installed from source, or whichever directory you chose for the downloaded executables. If you downloaded the pre-compiled version, make sure you also got the updated Python scripts.

Last edited by hlk; July 19, 2015 at 22:53.
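Since a plain Windows command prompt has no "which" command, the same check can be scripted. A minimal sketch, assuming Python 3 (for shutil.which; under the Python 2.7 used in this thread, distutils.spawn.find_executable is the closest equivalent) and that SU2_RUN is set as in the SU2 install instructions; the layout is illustrative, not a required one:

Code:
# Sketch only: verify that the SU2 Python scripts (SU2_RUN) and the SU2_CFD
# binary found on PATH come from the same installation directory.
import os
import shutil

su2_run = os.environ.get('SU2_RUN', '')                        # where the Python scripts should live
solver = shutil.which('SU2_CFD') or shutil.which('SU2_CFD.exe')
script = os.path.join(su2_run, 'parallel_computation.py')

print('SU2_RUN                :', su2_run or '(not set)')
print('SU2_CFD found at       :', solver or '(not on PATH)')
print('parallel_computation.py:', 'present in SU2_RUN' if os.path.isfile(script) else 'missing from SU2_RUN')
if solver and su2_run:
    same = os.path.normcase(os.path.dirname(solver)) == os.path.normcase(os.path.abspath(su2_run))
    print('same install directory :', same)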
July 20, 2015, 19:26 |
|
#3 | |
New Member
Muhamad Fakhrusy
Join Date: May 2015
Posts: 8
Rep Power: 11 |
Quote:
To get it working before, I copied the mesh file and renamed the copy to _1.su2, but that is not a convenient workaround because every case then takes twice the disk space it should, so I need to do something about that ...
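If duplicating the mesh is the only thing that keeps an older script happy, a hard link avoids the doubled disk space. A minimal sketch, assuming Python 3 on an NTFS drive (os.link is not available on Windows under Python 2.7) and using the WING_1.su2 naming from the posts above; the source file name is illustrative:

Code:
# Sketch only: expose the existing mesh under the _1.su2 name the old script
# expects, without copying the data. Falls back to a real copy if hard links
# are unavailable (e.g. FAT32, network drives, or Python 2.7 on Windows).
import os
import shutil

src = 'WING.su2'      # the real mesh (assumed name)
dst = 'WING_1.su2'    # the name the older parallel script looks for

if not os.path.exists(dst):
    try:
        os.link(src, dst)          # hard link: no extra disk space used
    except (OSError, AttributeError):
        shutil.copyfile(src, dst)  # last resort: plain copy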
July 22, 2015, 01:47 |
|
#4 | |
New Member
Muhamad Fakhrusy
Join Date: May 2015
Posts: 8
Rep Power: 11 |
Quote:
I have run into another problem while running in parallel, after I was done with the environment variables. I'm using Windows 8.1 and just installed the precompiled SU2-4.0.0, and I'm using the Python scripts included in the SU2-4.0.0 master directory that I downloaded from GitHub. I copied the SU2-4.0.0 master files into my installation folder, which explains the C:\SU2-4.0.0\SU2-4.0.0 path: the first directory holds the executables (SU2_CFD.exe etc.), the second one holds the scripts I've been using for parallel computation. This is what I typed on my command line and what it said:

Code:
W:\KP\tutorial\su2\TestCases\rans\flatplate>python "C:\SU2-4.0.0\SU2-4.0.0\SU2_PY\parallel_computation.py" -f turb_SA_flatplate.cfg -n 2
Traceback (most recent call last):
  File "C:\SU2-4.0.0\SU2-4.0.0\SU2_PY\parallel_computation.py", line 110, in <module>
    main()
  File "C:\SU2-4.0.0\SU2-4.0.0\SU2_PY\parallel_computation.py", line 61, in main
    options.compute )
  File "C:\SU2-4.0.0\SU2-4.0.0\SU2_PY\parallel_computation.py", line 88, in parallel_computation
    info = SU2.run.CFD(config)
  File "C:\SU2-4.0.0\SU2-4.0.0\SU2_PY\SU2\run\interface.py", line 88, in CFD
    run_command( the_Command )
  File "C:\SU2-4.0.0\SU2-4.0.0\SU2_PY\SU2\run\interface.py", line 276, in run_command
    raise exception , message
SU2.EvaluationFailure: Path = W:\KP\tutorial\su2\TestCases\rans\flatplate\,
Command = mpirun -n 2 C:\SU2-4.0.0\SU2_CFD config_CFD.cfg
SU2 process returned error '1'
'mpirun' is not recognized as an internal or external command,
operable program or batch file.

Any thoughts? I did search for mpirun throughout my C: drive but couldn't find anything outside my Cygwin directory, and I've already added Cygwin/bin to the PATH environment variable. Any answer will be appreciated.
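A quick way to see which MPI launchers a Python script can actually find is to query PATH directly. A small diagnostic sketch, assuming Python 3 for shutil.which (the launcher names below are just the usual candidates, not an SU2 requirement):

Code:
# Sketch only: list where (if anywhere) each common MPI launcher resolves on PATH.
# Note that a Cygwin mpirun that shows up here may still not be runnable from cmd.exe.
import shutil

for launcher in ('mpiexec', 'mpiexec.exe', 'mpirun', 'srun'):
    print('%-12s -> %s' % (launcher, shutil.which(launcher) or 'not found'))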
July 22, 2015, 08:24 |
|
#5 | |
Senior Member
Heather Kline
Join Date: Jun 2013
Posts: 309
Rep Power: 14 |
Quote:
July 23, 2015, 03:57 |
|
#6 |
New Member
Muhamad Fakhrusy
Join Date: May 2015
Posts: 8
Rep Power: 11 |
I finally got the scripts to run after changing a few things in the interface.py script:

Code:
SU2_RUN = os.environ['SU2_RUN']
sys.path.append( SU2_RUN )

# SU2 suite run command template
base_Command = os.path.join(SU2_RUN,'%s')

# check for slurm
slurm_job = os.environ.has_key('SLURM_JOBID')

# set mpi command
if slurm_job:
    mpi_Command = 'srun -n %i %s'
#elif not which('mpirun') is None:      <<<<<
#    mpi_Command = 'mpirun -n %i %s'    <<<<<
elif not which('mpiexec') is None:
    mpi_Command = 'mpiexec -n %i %s'
else:
    mpi_Command = ''

I don't know why, but those two marked lines seem to have been causing the problem; I just added # to turn them into comments. My MPI installation has mpiexec.exe, so I guess that's it.
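For reference, the same effect can be had without removing the mpirun branch: test for mpiexec first, so a Windows install that only ships mpiexec.exe picks it up automatically. A self-contained sketch of that idea, written against Python 3's shutil.which (the SU2 scripts in this thread are Python 2 and define their own which() helper); the function and variable names are illustrative, not SU2's actual API:

Code:
# Sketch only: build an MPI launch command, preferring mpiexec over mpirun,
# and fall back to a plain serial run when no launcher is on PATH.
import os
import shutil

def build_run_command(n_partitions, solver_path):
    if 'SLURM_JOBID' in os.environ:                   # running under SLURM
        return 'srun -n %i %s' % (n_partitions, solver_path)
    for launcher in ('mpiexec', 'mpirun'):            # mpiexec first for MS-MPI on Windows
        if shutil.which(launcher):
            return '%s -n %i %s' % (launcher, n_partitions, solver_path)
    return solver_path                                # no MPI launcher found: serial run

# e.g. build_run_command(2, r'C:\SU2-4.0.0\SU2_CFD.exe')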