Performance and scalability: Analysing large HPROF file

  1. Analysing large HPROF file (4 messages)

    I have an HPROF file of 1.2 GB which I'd like to analyse, but every tool I try ends with an OutOfMemoryError. I tried jhat and the IBM HeapAnalyzer, without success. Is there a way to analyse such large HPROF files?
  2. 1) Just a basic question first: have you tried modifying the jhat start-up script to include a -Xmx argument so it can use more memory, say '-Xmx2048m'? http://java.sun.com/j2se/1.4.2/docs/tooldocs/windows/java.html
     2) If that doesn't work, perhaps you can try VisualVM: https://visualvm.dev.java.net/ Documentation on importing heap dumps: https://visualvm.dev.java.net/heapdump.html
     Good luck, - Andy
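     The -Xmx tip above can also be applied without editing the start-up script: jhat accepts -J-prefixed options and forwards them to its own JVM. A minimal sketch, where 'heapdump.hprof' and the 2 GB figure are illustrative values, not anything from the thread:

     ```shell
     # Raise jhat's heap before parsing a large dump. -J options are passed
     # through to the JVM running jhat itself; size -Xmx to comfortably exceed
     # the dump file (a 1.2 GB HPROF file can need well over 2 GB of heap,
     # which also requires a 64-bit JVM on most platforms).
     jhat -J-Xmx2048m heapdump.hprof

     # When parsing finishes, browse the object graph at http://localhost:7000
     # (use jhat's -port option to pick a different port).
     ```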
  3. I have used the SAP Memory Analyzer successfully with heap dumps over 1 GB: https://www.sdn.sap.com/irj/sdn/wiki?path=/display/Java/Java+Memory+Analysis
     Also, you could try the custom jhat.jar on Edward Chou's blog: http://blogs.sun.com/edwardchou/entry/javaone_bof_on_memory_leaks
     Charlie http://www.performanceengineer.com/
  4. The heap inspector created by SAP and donated to the Eclipse Foundation is far better than most other HPROF-reading applications out there. It is fast and isn't a memory hog. Check out http://www.eclipse.org/proposals/memory-analyzer/ and http://www.eclipse.org/mat/
  5. Re: Analysing large HPROF file

    Please try the SAP Memory Analyzer.