CFD Online (www.cfd-online.com)
Home > Forums > Software User Forums > OpenFOAM > OpenFOAM Bugs

ParaView fills my RAM


Old   December 7, 2012, 04:57
Default ParaView fills my RAM
  #1
Senior Member
 
Pablo Higuera
Join Date: Jan 2011
Location: Auckland
Posts: 627
Rep Power: 19
Phicau is on a distinguished road
Hi all,

It is well known that ParaView fills up your RAM little by little while you export a video (even snapshot by snapshot), but the issue I am reporting here is new to me.

I am running ParaView on a cluster with the display forwarded over X, so almost no RAM should be used on my local computer. However, while I am exporting a video, my local RAM is slowly consumed, and the Xorg process is the one taking it. Eventually my computer (8 GB) starts to swap and becomes very slow.
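For anyone who wants to confirm which process is leaking, here is a minimal sketch that polls the resident memory (RSS) of a named process via /proc on Linux. The process name `Xorg` and the sample count are assumptions; adjust them for your system:

```python
#!/usr/bin/env python
"""Poll the resident memory (RSS) of a named process to spot a slow leak.

Linux-only: reads /proc, no external dependencies.
Usage: python track_rss.py Xorg
"""
import os
import sys
import time


def pids_by_name(name):
    """Return the PIDs whose /proc/<pid>/comm matches `name` exactly."""
    pids = []
    for entry in os.listdir('/proc'):
        if not entry.isdigit():
            continue
        try:
            with open('/proc/%s/comm' % entry) as f:
                if f.read().strip() == name:
                    pids.append(int(entry))
        except OSError:
            continue  # process exited between listdir() and open()
    return pids


def rss_kib(pid):
    """Return VmRSS in KiB for `pid`, or None if it cannot be read."""
    try:
        with open('/proc/%d/status' % pid) as f:
            for line in f:
                if line.startswith('VmRSS:'):
                    return int(line.split()[1])  # value is reported in kB
    except OSError:
        pass
    return None


if __name__ == '__main__':
    name = sys.argv[1] if len(sys.argv) > 1 else 'Xorg'
    for _ in range(3):  # a few samples; raise for longer monitoring
        for pid in pids_by_name(name):
            print('%s[%d] RSS = %s KiB' % (name, pid, rss_kib(pid)))
        time.sleep(1)
```

If the printed RSS climbs steadily while snapshots are being exported, the leak is on the local X server side rather than in ParaView on the cluster.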

It does not matter whether the ParaView window is visible on screen, hidden behind other windows, or minimized: I end up with barely 30 snapshots exported and my computer needing a restart. ParaView itself is only using 16 GB on the cluster, which has 64 GB available.

A little technical info:

Cluster: running RHEL 6, OpenFOAM 2.1.1, ParaView 3.10.0. Both OF and PV are 64-bit, installed via the CentFOAM project.

Local computer: running Ubuntu 10.04, 64-bit, fully updated.

Has anyone encountered the same problem? Any tentative solutions?

Thanks!
Phicau is offline   Reply With Quote

Old   December 7, 2012, 07:32
Default
  #2
Member
 
Join Date: Nov 2012
Posts: 58
Rep Power: 14
startingWithCFD is on a distinguished road
The same thing happens to me.

Cluster
=====
SUSE Linux Enterprise Server 11 SP1 (x86_64)
OpenFOAM v2.1.0
ParaView v3.12.0
Installed from source.

Local computer
===========
OpenSUSE 11.4, 32-bit
Fully updated (I think).


EDIT: I usually end up with more snapshots before it dies; it clearly depends on the grid size.
But since I am only interested in plotting a specific patch of my mesh, I load just that patch into memory and work around the limitation that way.
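The patch-only workaround can be sketched with ParaView's pvpython. This is a hedged sketch, not taken from the thread: the case file name, the patch name (`movingWall`), and the field list are placeholders, and the reader property names (`MeshRegions`, `CellArrays`) may differ between ParaView versions, so check them in the Properties panel first.

```python
# Sketch for ParaView's pvpython: read only one patch of an OpenFOAM case
# so that the internal mesh is never loaded into memory.
from paraview.simple import OpenFOAMReader, Show, Render, SaveScreenshot

# "case.foam" is a placeholder dummy file in the case directory.
reader = OpenFOAMReader(FileName='case.foam')

# Deselect internalMesh and keep only the patch of interest, so only
# that patch's geometry and fields are read.
reader.MeshRegions = ['patch/movingWall']   # hypothetical patch name
reader.CellArrays = ['p', 'U']              # load only the needed fields

Show(reader)
Render()
SaveScreenshot('patch_only.png')
```

Run it with `pvpython` (or paste it into ParaView's Python shell); the memory footprint then scales with the patch, not the full volume mesh.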
startingWithCFD is offline   Reply With Quote

Old   December 7, 2012, 14:55
Default
  #3
Senior Member
 
kmooney's Avatar
 
Kyle Mooney
Join Date: Jul 2009
Location: San Francisco, CA USA
Posts: 323
Rep Power: 18
kmooney is on a distinguished road
Download the newest release candidate from Kitware (3.98.0). I no longer had this issue once I updated, and I used the out-of-the-box binary as well.
kmooney is offline   Reply With Quote

Old   December 10, 2012, 04:18
Default
  #4
Senior Member
 
Pablo Higuera
Join Date: Jan 2011
Location: Auckland
Posts: 627
Rep Power: 19
Phicau is on a distinguished road
No issues for now with this version, thanks a lot!

However, it is about 10 times slower, so I guess I will end up compiling this version for my system.
Phicau is offline   Reply With Quote

Old   December 10, 2012, 12:04
Default
  #5
Senior Member
 
kmooney's Avatar
 
Kyle Mooney
Join Date: Jul 2009
Location: San Francisco, CA USA
Posts: 323
Rep Power: 18
kmooney is on a distinguished road
That's too bad about the performance hit. I was actually surprised that the vanilla binary ran as fast as it does. Perhaps I'll try my own compilation and see if it improves.
kmooney is offline   Reply With Quote
