A quick tip from a battle this morning… I’m still hunting down the memory leak that was the origin of my guide on finding memory leaks in the JVM a couple of weeks ago. I’m trying to analyze a heap dump, using the Eclipse Memory Analyzer (MAT), from our ColdFusion 8 server, which runs on Linux with a 2GB heap. Most of the tutorials about “large heaps” are talking about 500MB. That’s not the issue here; we run our server with the full 2GB heap.
The problem comes when you try to analyze a heap dump taken at a java.lang.OutOfMemoryError, or when the heap is close to its max size. With the Old Gen at 100%, I generated a 2.3GB heap dump. MAT (or really, Eclipse and Java on 32-bit Windows) can’t analyze a heap that big – it runs out of heap space itself and crashes.
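For context, a dump like this can be captured automatically at the moment of the OOM with the standard HotSpot flags (the path below is just an illustration – put it wherever you have disk space):

```
-XX:+HeapDumpOnOutOfMemoryError
-XX:HeapDumpPath=/opt/dumps
```

On ColdFusion 8 these would go in the JVM arguments of its jvm.config, alongside your existing -Xmx setting.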
The first thing I tried was adjusting the -Xmx parameter for MAT by editing the MemoryAnalyzer.ini file:
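The VM arguments live at the end of MemoryAnalyzer.ini, after the -vmargs line; something like this (the value here is just an example – this is the knob I kept turning):

```
-vmargs
-Xmx1350m
```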
I tried all kinds of numbers – 1350MB was the biggest I could get with Sun JVM 1.6.0_03 and _10 (this is on Windows XP Pro; on 64-bit Windows this would be a non-issue, but on 32-bit Windows a process can only get roughly 1.3–1.7GB of contiguous address space, which is what the JVM needs for its heap). So… I happen to have JRockit on my desk, also from my previous post on profiling the heap in real time, so I thought I would see how high I could push it. Through a bit of trial and error with the -Xmx parameter, I wound up with a MemoryAnalyzer.ini that allowed me to process the heap dump successfully:
-vm
C:/Program Files/Java/jrmc-3.1.0-1.6.0/jre/bin/jrockit/jvm.dll
-vmargs
-Xmx1700m
And this worked for me! (Note that in an Eclipse .ini file, -vm and its path must each sit on their own line, before -vmargs.) There is also a /3GB switch in Windows XP that might have let me push the heap even larger, but the above worked, so I stuck with it. Research the switch to determine whether it might help you.
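For the curious: the /3GB switch goes on the OS entry in C:\boot.ini. The disk/partition values below are illustrative – yours will differ, and note that a process only benefits if its executable was linked large-address-aware:

```
multi(0)disk(0)rdisk(0)partition(1)\WINDOWS="Microsoft Windows XP Professional" /fastdetect /3GB
```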