Performance Engineer’s Guide to HotSpot JIT Compilation – Monica Beckwith – QCon

For more QCon posts, see my live blog table of contents. This presentation is about the compiler and also the runtime.

Major pieces

  • Execution engine
    • Heap management/garbage collection
    • JIT compilation
  • Runtime
    • VM Class loading
    • Interpreter
    • Bytecode verification, etc.

Runtime goal – convert from bytecode to native code and do optimizations along the way

Compilation Techniques and Notes

  • Pre-compiled/ahead of time
  • profile guided – based on critical hotspots
  • Adaptive optimization (Java uses Profile guided and Adaptive optimization)
  • Identify root of compilation
  • Replace the whole method, or replace it on the stack (OSR) – which applies depends on the number of times through the loop
  • The server compiler has a higher threshold than the client compiler for when a method is considered hot enough to optimize
  • Tiered compilation – tier 1 is the client compiler with no profiling info, tiers 2 and 3 are the client compiler with profiling info; then comes the server compiler
  • The CodeCache default is an order of magnitude larger when tiered compilation is enabled. If more is needed, set -XX:ReservedCodeCacheSize=<size>
  • Inlining – many different parameters when figuring out when to inline
  • Vectorization – SIMD (Single Instruction, Multiple Data). Can generate stubs and benefit from cache-size chunks. For SuperWord Level Parallelism, you need to unroll the loop, do analysis/pre-optimization, etc. Still in its infancy in HotSpot.
  • Escape analysis – checks whether an object is used only within a compiled method. Needs the entire graph to confirm the object isn’t stored in a static field, returned from the method, passed as a parameter, etc. If it’s truly local, it can be optimized by keeping it in registers.
  • Objects are 8 byte aligned by default. Fields are aligned by type.
  • OOP (ordinary object pointer) is a managed pointer. The size can be changed to optimize.
  • Compressed Class Pointers – part of the Metaspace. Class data is outside of heap.
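As a concrete sketch of the SuperWord case above: a simple counted loop over int arrays with no cross-iteration dependence is the loop shape HotSpot’s auto-vectorizer looks for (the class and method names here are made up for illustration):

```java
// Hypothetical example: a counted int loop where every iteration is
// independent, so after unrolling the SuperWord pass can pack the
// additions into SIMD instructions.
public class SlpDemo {
    static void addArrays(int[] a, int[] b, int[] c) {
        for (int i = 0; i < a.length; i++) {
            c[i] = a[i] + b[i];   // independent lanes -> SIMD candidate
        }
    }

    public static void main(String[] args) {
        int n = 1024;
        int[] a = new int[n], b = new int[n], c = new int[n];
        for (int i = 0; i < n; i++) { a[i] = i; b[i] = 2 * i; }
        addArrays(a, b, c);
        System.out.println(c[10]); // 10 + 20 = 30
    }
}
```

Loops with cross-iteration dependences (e.g. `c[i] = c[i - 1] + a[i]`) break this pattern and typically stay scalar.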
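The escape-analysis bullet can be illustrated with a sketch (names are hypothetical): the object below never leaves its method, so once the constructor is inlined the JIT can prove it doesn’t escape and scalar-replace it, keeping the fields in registers with no heap allocation:

```java
// Hypothetical example: Point never escapes distanceSquared(), so it is a
// candidate for scalar replacement after inlining + escape analysis.
public class EscapeDemo {
    static final class Point {
        final int x, y;
        Point(int x, int y) { this.x = x; this.y = y; }
    }

    static int distanceSquared(int x, int y) {
        Point p = new Point(x, y);      // not stored in a static field,
        return p.x * p.x + p.y * p.y;   // not returned, not passed on
    }

    public static void main(String[] args) {
        long sum = 0;
        for (int i = 0; i < 200_000; i++) {
            sum += distanceSquared(i, i + 1); // warm up so the JIT compiles it
        }
        System.out.println(sum);
    }
}
```

One way to see the effect is to compare allocation rates with escape analysis on vs. off (-XX:-DoEscapeAnalysis).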


Deoptimization can happen due to:

  1. dependency issues
  2. class unloading/redefinition
  3. uncommon path
  4. profiled info isn’t useful for the path [like with databases, when the db assumes something different than you want]
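The “uncommon path” case can be sketched as follows (a hypothetical example): if a branch is never taken during warmup, the JIT may compile the method with an uncommon trap in its place; the first time the branch is actually taken, the compiled code deoptimizes back to the interpreter.

```java
// Hypothetical example: during warmup v is never negative, so profiling
// says the "v < 0" branch is never taken. The JIT may replace it with an
// uncommon trap; process(-5) then triggers deoptimization and eventual
// recompilation with the branch included.
public class UncommonPathDemo {
    static int process(int v) {
        if (v < 0) {        // never true during warmup -> uncommon path
            return -v;
        }
        return v * 2;       // hot path
    }

    public static void main(String[] args) {
        for (int i = 0; i < 200_000; i++) {
            process(i);                      // warm up the hot path only
        }
        System.out.println(process(-5));     // first use of the cold branch
    }
}
```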

If you’re curious about the details, these flags show what the compiler thinks/did:

  • -XX:+PrintCompilation – e.g., shows what tier each method was compiled at
  • -XX:+PrintInlining – a diagnostic flag, so it also requires -XX:+UnlockDiagnosticVMOptions
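A small hot method is enough to see those flags in action. The class below is a made-up example; run it with the flags on the command line shown in the comment:

```java
// Hypothetical warm-up program for observing JIT diagnostics, e.g.:
//   java -XX:+PrintCompilation HotMethod
//   java -XX:+UnlockDiagnosticVMOptions -XX:+PrintInlining HotMethod
public class HotMethod {
    static int square(int x) { return x * x; }

    public static void main(String[] args) {
        long sum = 0;
        for (int i = 0; i < 1_000_000; i++) {
            sum += square(i);  // enough invocations to get square() compiled
        }
        System.out.println(sum);
    }
}
```

With tiered compilation, PrintCompilation logs square() being compiled more than once as it moves up the tiers.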
