[snappyHexMesh] CPU high usage after finished running SHM in parallel |
November 6, 2017, 13:15
CPU high usage after finished running SHM in parallel
#1
New Member
Calum Douglas
Join Date: Apr 2013
Location: Coventry, UK
Posts: 26
Rep Power: 13
Recently upgraded to OF5x and Ubuntu 17.1.
Made quite a few modifications to my OF 2.4 cases to get everything working, but I sometimes get very strange behaviour from snappy: I am left with an EXTREMELY slow PC after snappyHexMesh finishes meshing, even if the mesh itself is totally fine. It does not happen every time I use it, but I'd say about 75% of the time, which means I have to reset my PC after each mesh. It does not happen with the motorbike tutorial case in OF5, which suggests I have missed some final conversion change when I ported my OF 2.4 cases over.

I do not think I have made any errors converting the files themselves; I suspect the problem is in how I use them, since I write my own scripts to run this stuff, and something may have changed from OF 2.4 to OF 5 that is making my script do something unexpected. My script is below: does anyone notice anything obviously amiss for 2.4 > 5.0 usage? Code:
#!/bin/sh
cd ${0%/*} || exit 1    # run from this directory DONT USE IF RUNNING FROM CONSOLE LINE BY LINE - CALUM

# Source tutorial run functions
. $WM_PROJECT_DIR/bin/tools/RunFunctions

rmdir -v 0              # delete the 0 folder
mkdir 0

surfaceFeatureExtract
blockMesh
transformPoints -scale '(1000 1000 1000)'    # Makes the blockmesh scaled up 1000 times for meshing

decomposePar -copyZero
mpirun -np 8 snappyHexMesh -overwrite -parallel >> snappyHexMesh.log
mpirun -n 8 renumberMesh -overwrite -parallel
reconstructParMesh -constant

# Clearing out redundant data
foamListTimes -rm
rm -rf processor*
rm -f log.*
rm -f postProcessing/residuals/0/residuals.dat

transformPoints -scale '(0.001 0.001 0.001)' # Scales back mesh correctly for solver

# ----------------------------------------------------------------- end-of-file
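For comparison, here is a minimal sketch of how the same meshing steps are typically driven through the RunFunctions helpers in an OF5-style Allrun script. This is only an illustration, not a confirmed fix: it assumes the standard runApplication/runParallel helpers from $WM_PROJECT_DIR/bin/tools/RunFunctions and that the processor count is read from system/decomposeParDict rather than hard-coded into mpirun calls. Code:

#!/bin/sh
cd ${0%/*} || exit 1                          # run from this directory
. $WM_PROJECT_DIR/bin/tools/RunFunctions      # standard helper functions

runApplication surfaceFeatureExtract
runApplication blockMesh
runApplication decomposePar -copyZero

# runParallel wraps the MPI launch and writes a log.<application> file;
# the number of processes is taken from system/decomposeParDict
runParallel snappyHexMesh -overwrite
runParallel renumberMesh -overwrite

runApplication reconstructParMesh -constant

One side effect of the helpers is that they typically refuse to re-run an application if its log file already exists, which makes accidental partial re-runs of the script easier to spot.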
__________________
Calum Douglas
Director, Scorpion Dynamics Ltd
Email: calum.douglas@scorpion-dynamics.com
Web: www.scorpion-dynamics.com
November 10, 2017, 11:08
update
#2
New Member
Calum Douglas
Join Date: Apr 2013
Location: Coventry, UK
Posts: 26
Rep Power: 13
This just happened today after running simpleFoam in single-threaded mode.
The PC was almost totally unresponsive after the simulation ended. It seems this sort of thing has happened to others too: https://www.linuxquestions.org/quest...ad-4175611010/

I have tried Ubuntu 16.04 (after a FULL wipe) and Ubuntu 17.1, and I have tried OpenFOAM 5 and OpenFOAM-dev. I cannot possibly be the only one on this forum who has noticed this problem, and I'm deeply unconvinced it's any sort of hardware fault, as I have Windows 7 on the same PC and have run a lot of intensive software on it with no changes whatsoever in Windows-world.

This latest occurrence was also after running snappyHexMesh in non-parallel mode. I'm going to see whether it also happens with cases that don't use snappy at all.
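A practical way to narrow this down is to look at memory and swap immediately after a run finishes: an unresponsive desktop with idle CPUs often points at swap pressure rather than a runaway process. These are all standard Linux commands; a quick check might look like this: Code:

# free memory, cached memory and swap usage at a glance
free -h

# which swap devices are in use and how full they are
swapon --show

# watch paging activity (the si/so columns) for a few seconds
vmstat 1 5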
__________________
Calum Douglas
Director, Scorpion Dynamics Ltd
Email: calum.douglas@scorpion-dynamics.com
Web: www.scorpion-dynamics.com
November 10, 2017, 16:02
solution
#3
New Member
Calum Douglas
Join Date: Apr 2013
Location: Coventry, UK
Posts: 26
Rep Power: 13
Not 100% confirmed YET, but I suspect the problem was the x1000 scaling of the domain, which a few years ago was a bit of a trick to get snappy to produce slightly better surface snapping and to ensure there were no reverse-orientated STL normals.
I have removed that operation from the scripts, and so far everything seems to be OK.
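For clarity, the change amounts to dropping (or commenting out) the two scaling calls from the script posted in the first message, so snappy works on the geometry at its true size; nothing else needs to change for this particular experiment. Code:

# Both scaling steps removed/commented out, so snappyHexMesh
# runs on the mesh at its true size:
#
# transformPoints -scale '(1000 1000 1000)'     # was: scale up for meshing
# ... blockMesh / snappyHexMesh steps unchanged ...
# transformPoints -scale '(0.001 0.001 0.001)'  # was: scale back for the solver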
__________________
Calum Douglas
Director, Scorpion Dynamics Ltd
Email: calum.douglas@scorpion-dynamics.com
Web: www.scorpion-dynamics.com
November 12, 2017, 03:17
#4
Senior Member
Francesco Del Citto
Join Date: Mar 2009
Location: Zürich Area, Switzerland
Posts: 237
Rep Power: 18
Hi,
Actually, we had a similar problem. Running a simulation twice, the second time ran at half the speed of the first. Not always, but it happened. I do not understand how this is possible: once a process has finished it does not use any CPU resource, as you can see from top, yet the second run was slower. I'll try to do some more tests and see if the problem appears again. I doubt the mesh scaling is the problem, though; it should be something else, but I have no idea what. Just a question: were you running in parallel?

Francesco

Last edited by fra76; November 12, 2017 at 03:25. Reason: Typo
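One quick sanity check that fits here: confirm nothing is actually left running after the job ends. If the command below shows no leftover solver or MPI processes yet the machine is still sluggish, memory or swap pressure is a more likely culprit than CPU load. The process names in the grep are just examples; adjust them to whatever you ran. Code:

# list any leftover OpenFOAM / MPI processes (should print nothing after a clean finish)
ps -ef | grep -E 'mpirun|snappyHexMesh|simpleFoam' | grep -v grep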
November 12, 2017, 18:59
#5
New Member
Calum Douglas
Join Date: Apr 2013
Location: Coventry, UK
Posts: 26
Rep Power: 13
The problem is BACK!

I also thought it might be the decomposition method, but scotch produces the same behaviour. Yes, it happens whilst running in parallel. Could this be an MPI bug? I can't remember: does the runParallel command use MPI?

Another possibility, which I'm trying to isolate, is that it happens when snappy reaches the cell limits defined in the snappy dict; that WOULD explain why it "appears" to be slightly randomised, but I need to do some trials to confirm... Nope, not the cell-count limit.

This is essentially making OpenFOAM unusable...
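On the runParallel question: it is just a shell function defined in the RunFunctions file, so the simplest way to see whether it calls mpirun on your particular installation is to look at its definition, for example: Code:

# show how runParallel is defined (and whether it wraps mpirun)
grep -n -A 20 'runParallel' $WM_PROJECT_DIR/bin/tools/RunFunctions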
__________________
Calum Douglas
Director, Scorpion Dynamics Ltd
Email: calum.douglas@scorpion-dynamics.com
Web: www.scorpion-dynamics.com

Last edited by snowygrouch; November 12, 2017 at 20:16.
November 21, 2017, 18:40
update
#6
New Member
Calum Douglas
Join Date: Apr 2013
Location: Coventry, UK
Posts: 26
Rep Power: 13
I believe I have solved this: I have set "swappiness" from 60 to 10 in Ubuntu 16.04 LTS.
This nearly turns off any use of swap, which is apparently set unusually high by default in Ubuntu. So far the results are encouraging.
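For reference, the value can be checked and changed with standard sysctl commands; 10 matches what was used here, but the right value depends on how much RAM the machine has. Code:

# current value (Ubuntu's default is 60)
cat /proc/sys/vm/swappiness

# change it for the running session
sudo sysctl vm.swappiness=10

# make the change persistent across reboots
echo 'vm.swappiness=10' | sudo tee -a /etc/sysctl.conf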
__________________
Calum Douglas
Director, Scorpion Dynamics Ltd
Email: calum.douglas@scorpion-dynamics.com
Web: www.scorpion-dynamics.com