
Parallel computation on Windows 8


February 21, 2020, 14:20
  #1
New Member
 
serg
Join Date: Dec 2015
Posts: 29
Hello,
I googled this but could not find an answer. My problem is that when I use the "parallel_computation.py" script, every core runs the whole mesh instead of its own partition, and each core performs every iteration (4 cores = 4 identical solutions). If I run mpiexec -n #cores SU2_CFD.exe [config file] directly instead, it works. That by itself would be fine, but I could not work around the issue for the optimization tutorials, and running the optimization cases on a single core is pretty annoying.
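For reference, this is roughly what I am comparing (the config file name and core count below are just placeholders):

Code:
REM works as expected: the mesh is partitioned across the 4 ranks
mpiexec -n 4 SU2_CFD.exe my_config.cfg

REM runs, but every core solves the whole mesh and writes its own identical solution
python parallel_computation.py -f my_config.cfg -n 4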

I followed the installation steps for a Windows machine, which are very straightforward; I did not compile the code with Cygwin. I have the latest version of SU2; my Python version is 3.8.1, NumPy is 1.18.1, and SciPy is 1.4.1.

I have MS-MPI version 10.1.12498.18. SU2_RUN and the paths for MPI and Python are set accordingly.
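For completeness, this is how I checked from the command prompt what gets picked up:

Code:
echo %SU2_RUN%
where python
where mpiexec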

Any help is appreciated.
Regards.
S.

February 22, 2020, 13:22
  #2
pcg
Senior Member
 
Pedro Gomes
Join Date: Dec 2017
Posts: 466
Do you have more than one MPI version on your system?
The optimization script might be calling mpirun instead (I am not sure what the default is).
If you run mpirun -n # SU2_CFD, do you get the same problem?
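On Windows you can check which launchers are actually found on the PATH, for example:

Code:
where mpiexec
where mpirun

As far as I know MS-MPI only provides mpiexec, so if mpirun resolves to something it probably belongs to a second MPI installation.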

February 22, 2020, 14:58
  #3
New Member
 
serg
Join Date: Dec 2015
Posts: 29
Quote:
Originally Posted by pcg
Do you have more than one MPI version on your system?
The optimization script might be calling mpirun instead (I am not sure what the default is).
If you run mpirun -n # SU2_CFD, do you get the same problem?
It seems mpirun is not recognized on my system, whereas mpiexec is. Do you have a suggestion?
Thank you.

February 26, 2020, 07:18
  #4
pcg
Senior Member
 
Pedro Gomes
Join Date: Dec 2017
Posts: 466
You can define a custom MPI command via the environment variable SU2_MPI_COMMAND; it is used in $SU2_RUN/SU2/run/interface.py. If all else fails, you can always hack that script.
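For example, from the command prompt before launching the Python scripts (I believe the command string takes %i for the number of processes and %s for the SU2 command, but check interface.py to confirm the placeholders; the config name and core count below are placeholders too):

Code:
set SU2_MPI_COMMAND=mpiexec -n %i %s
python shape_optimization.py -f my_config.cfg -n 4

That way the scripts should launch SU2_CFD through MS-MPI's mpiexec instead of looking for mpirun.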

February 27, 2020, 12:19
  #5
New Member
 
serg
Join Date: Dec 2015
Posts: 29
Quote:
Originally Posted by pcg
You can define a custom MPI command via the environment variable SU2_MPI_COMMAND; it is used in $SU2_RUN/SU2/run/interface.py. If all else fails, you can always hack that script.
The problem is solved, thank you for your help. I now have another issue, which is exactly as described here:
https://github.com/su2code/SU2/issues/559

I am just going through the tutorials without changing anything in the config files, yet I cannot pinpoint the problem. Do you have any ideas?
