OutOfMemoryError - GC overhead limit exceeded
March 25, 2010
Someone asked me recently about the following exception on their ColdFusion server:
java.lang.OutOfMemoryError: GC overhead limit exceeded
This exception is thrown by the garbage collector (in the underlying JVM; it's not specific to ColdFusion) when it is spending far too much time collecting garbage. This error essentially means that you need to add more memory or reconfigure your garbage collection arguments. You can suppress the error by adding
-XX:-UseGCOverheadLimit to your JVM startup arguments.
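On a ColdFusion server the JVM startup arguments live in jvm.config. A sketch of what that line might look like (the path and the other flags shown are typical defaults, not taken from any particular install; note the flag only suppresses the error, it doesn't fix the underlying GC thrashing):

```shell
# jvm.config snippet; path varies by install, e.g. {cf_root}/runtime/bin/jvm.config
# -XX:-UseGCOverheadLimit disables the "GC overhead limit exceeded" check.
java.args=-server -Xmx512m -XX:-UseGCOverheadLimit
```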
Here's what Sun has to say about it:
The parallel / concurrent collector will throw an
OutOfMemoryError if too much time is being spent in garbage collection: if more than 98% of the total time is spent in garbage collection and less than 2% of the heap is recovered, an OutOfMemoryError will be thrown. This feature is designed to prevent applications from running for an extended period of time while making little or no progress because the heap is too small. If necessary, this feature can be disabled by adding the option
-XX:-UseGCOverheadLimit to the command line.
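The two thresholds in that quote can be made concrete with a tiny Java sketch. The class and method names below are mine (the JVM does this check internally; there is no such public API); it just restates the 98% / 2% rule:

```java
// Illustration of the heuristic described above: the collector gives up
// when more than 98% of total time goes to GC and less than 2% of the
// heap is recovered. Names here are illustrative, not a real JVM API.
public class GcOverheadCheck {
    static boolean wouldThrow(double gcTimeFraction, double heapRecoveredFraction) {
        return gcTimeFraction > 0.98 && heapRecoveredFraction < 0.02;
    }

    public static void main(String[] args) {
        // Thrashing: 99% of time in GC, only 1% of heap freed.
        System.out.println(wouldThrow(0.99, 0.01));
        // Healthy: half the time in GC, 30% of heap freed.
        System.out.println(wouldThrow(0.50, 0.30));
    }
}
```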
Pete, depending on the actual situation there are a couple of other things that have worked for me. 1) Increase the max memory. The default CF install sets it to 512 MB; if you have CF Standard you can bump this to 1024 MB, and with Enterprise you can go beyond that, e.g.: -Xmx4096m 2) Use a more aggressive GC methodology. This has been a saver in several situations. Remove the -XX:+UseParallelGC directive and use -XX:+UseConcMarkSweepGC instead.
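Combining both of those suggestions, the jvm.config line might look something like this (a sketch only; the path, the heap size, and the exact flag set will vary by CF version and hardware, and -Xmx values above roughly 1 GB need a 64-bit JVM):

```shell
# {cf_root}/runtime/bin/jvm.config (location varies by install)
# Larger heap plus the concurrent mark-sweep collector in place of the
# parallel collector; -XX:+UseParNewGC is the young-gen collector
# commonly paired with CMS.
java.args=-server -Xmx4096m -XX:+UseConcMarkSweepGC -XX:+UseParNewGC
```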
I have also used other tweaks before, including playing with the young/new/old generation ratios. Before going there, profiling GC memory using open-source tools such as GCViewer has been helpful.
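To produce a log that GCViewer can read, verbose GC logging can be turned on with standard HotSpot flags (a sketch; these flag names are for the Java 6-era HotSpot JVMs that ColdFusion shipped with, and the log file name is my own choice):

```shell
# Append to java.args in jvm.config to write a GC log for GCViewer.
java.args=-server -Xmx1024m -verbose:gc -Xloggc:cfusion-gc.log -XX:+PrintGCDetails -XX:+PrintGCTimeStamps
```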
@Bilal Thanks, yes increasing the heap size should be the first step, if possible.
Thank you for this post. My hosting provider has been having this issue lately (I get nice error messages from BlogCFC hehe). I forwarded this link to them in hopes of a resolution.
I ran into "java.lang.OutOfMemoryError: GC overhead limit exceeded" on a CF 8.0.1 development server but occasionally the exception was "java.lang.OutOfMemoryError: Java heap space" instead. It turned out the problem was that Server Monitoring was enabled with Memory Tracking turned on.
This post I found has more details: http://www.numtopia.com/terry/blog/archives/2007/06/coldfusion_8_monitoring_heisenberg_errors.cfm The solution in that post is: "In the CF administrator: Go to Server Monitoring, Launch Server Monitor. Up at the top there should be 3 options that say Stop Monitoring, Stop Profiling, Stop Memory Tracking. Turn them off. However, these are turned off by default ...."
The link to java.sun.com has now changed to:
RE: ...if you have CF standard you can bump this to 1024 MB
Argh! What?! I'm running into more and more issues like this, and I'm really becoming disappointed with Adobe. It appears they effectively have two versions of CF now, Developer and Enterprise, because Standard is all but useless in a production environment.
@Steve that comment you are referring to is a bit outdated... ColdFusion 8 Standard did not support 64-bit, but ColdFusion 9 Standard does, and thus you can go above 1 GB heap sizes. The limit on the heap size is due to the 32-bit architecture; if you run Enterprise on 32-bit you run into the same limitations.
Whew! I did quite a bit of searching after I read that post and could not find any info that confirmed or denied the limit. It seems strange that this information is not more readily available in the specs or documentation.
@Steve I agree it is a bit hard to find this info... There is a technote here: http://kb2.adobe.com/cps/193/tn_19359.html and I wrote a blog entry on it back in 2004: http://www.petefreitag.com/item/140.cfm
Thanks, this really helped me solve my problem when compiling sources with Ant.