When Memory Leaks Attack

It seems Tomcat and Java are gushing memory on us. I’ve been kicking the tires on YourKit Java Profiler, which thus far has been running without incident on one of our production servers. Over the past 48 hours, the memory usage has grown by 34 MB, which is consistent with past evidence of memory leaks in our system. (Soundtrack for this post: “When Animals Attack” – Institute)

The results are fairly interesting – the number one culprit is org.xbill.DNS.*, which accounts for over half of the growth. Apparently Tomcat 3.3 (we’re upgrading to 5.5 soon) performs DNS lookups and caches the results – that cache seems to be unbounded, never expires entries, and cannot be turned off [in 3.3]. So in an application that receives callbacks from high-traffic corporate web sites, with a broad range of inbound IP addresses, this becomes a large memory leak.
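For what it’s worth, the shape of the fix is just a size cap with eviction. The sketch below is mine, not dnsjava’s actual API – the class name and cap are made up for illustration (newer dnsjava releases do let you cap org.xbill.DNS.Cache, if memory serves) – but a bounded LRU via LinkedHashMap is the idea:

    import java.util.LinkedHashMap;
    import java.util.Map;

    // Hypothetical bounded cache: once maxEntries is reached, the least
    // recently used entry is evicted instead of the map growing forever.
    public class BoundedLookupCache<K, V> extends LinkedHashMap<K, V> {
        private final int maxEntries;

        public BoundedLookupCache(int maxEntries) {
            super(16, 0.75f, true); // access-order iteration gives LRU behavior
            this.maxEntries = maxEntries;
        }

        protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
            return size() > maxEntries; // evict the oldest entry past the cap
        }
    }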

This is why JVMTI is such a leap over JVMPI – my first experience with a Java profiler was JProbe, and my main impression of it was how tediously slow it was. With JVMTI agents, our app runs pretty much at full speed and can be profiled on demand. Finding this leak with a JVMPI profiler would be nearly impossible: it’s too slow to run in production, and in a development or QA environment you just don’t tend to test with thousands of distinct IP addresses.
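Hooking the agent up is a single JVM flag. The library path below is an assumption – it varies by YourKit version and platform – but the -agentpath mechanism itself is standard JVMTI:

    # JVMTI agent attach (Java 5+); connect the profiler UI on demand
    java -agentpath:/opt/yourkit/bin/libyjpagent.so -jar ourapp.jar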

The other interesting leak is a JVM bug that will be fixed in Java 6. It seems that ObjectInputStream uses a SoftCache that ends up retaining references to classes that have been serialized – which explains the wealth of new HashMap$Entry and byte[] objects sitting in memory.
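If you want to see it for yourself, a churn loop like the sketch below (the class names are mine, purely for illustration) exercises the serialization machinery; watch the heap in a profiler while it runs and the retained HashMap$Entry and byte[] instances are what show up:

    import java.io.ByteArrayInputStream;
    import java.io.ByteArrayOutputStream;
    import java.io.ObjectInputStream;
    import java.io.ObjectOutputStream;
    import java.io.Serializable;

    // Hypothetical reproduction: serialize and deserialize in a loop to
    // exercise ObjectInputStream's internal class-descriptor caching.
    public class SerializationChurn {
        static class Payload implements Serializable {
            private static final long serialVersionUID = 1L;
            byte[] data = new byte[1024];
        }

        public static void main(String[] args) throws Exception {
            for (int i = 0; i < 100000; i++) {
                ByteArrayOutputStream bos = new ByteArrayOutputStream();
                ObjectOutputStream oos = new ObjectOutputStream(bos);
                oos.writeObject(new Payload());
                oos.close();

                ObjectInputStream ois = new ObjectInputStream(
                        new ByteArrayInputStream(bos.toByteArray()));
                ois.readObject();
                ois.close();
            }
            // Take a heap snapshot here and look for retained
            // HashMap$Entry and byte[] instances.
        }
    }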

In a way, the results are encouraging – over the past 48 hours, running at high volume, the memory used by classes we’ve written has actually dropped by 150 KB. So the upside is that the memory growth isn’t our fault. The downside is that the issues are much harder for us to resolve, since they’re not in our code.
