jvm death match – live blogging from qcon

JVM Death Match
Speakers:
Daniel Heidinga – IBM
Gil Tene – Azul
Thomas Wuerthinger – Oracle

See the list of all blog posts from the conference

This was a joint session of the NY Java Sig and the ACGNJ group. Fun fact – they have the URLs javasig.com and javasig.org respectively.

Graal Vision and Architecture – Thomas at Oracle

  • Java is still the primary language on the JVM, but there are lots of others.
  • The Graal compiler runs on top of the JVM and can run JVM languages.
  • Truffle Framework – allows running Ruby, R, and JavaScript on the JVM
  • Sulong runs on top of Truffle and adds support for C/C++
  • Can mix and match languages
  • Vision: become more polyglot and more embeddable

Zing – Gil at Azul

  • Only company that builds nothing but JVMs
  • Zulu is OpenJDK. OpenJDK only produces source code, not binaries. Zulu is 100% open source. Differentiates for embedded platforms.
  • Zing is the differentiator, namely speed.
  • Gil went over the graph about optimization that we saw in his session earlier in the day
  • Falcon is the JIT compiler
  • Logic to pre-tune so the JVM runs at full speed right away

Open J9 – Daniel at IBM

  • Number 1 cloud runtime
  • In the cloud, memory costs more than CPU. Three times smaller than OpenJDK in their benchmark
  • Have stripped down the JDK so the image is smaller
  • Trace engine and dump engine. Free diagnostics tools – important to be able to see what the JVM is doing
  • Work with hardware vendors
  • Plan to open source J9 before Java 9 launches

Selection of the Q&A

  • Why use your JVM? IBM said it is the #1 cloud JVM. Azul said OpenJDK (Zulu) for compatibility and Zing for the best-tuned performance. Oracle said you can combine with other languages or compile to native code. Oracle also disputed the performance claim.
  • How important is polyglot? Azul said you have to be able to beat the existing runtime to be useful. IBM said they tried to create the universal bytecode and it didn’t work. Oracle said it is performing well and that there is interest because companies have a big investment in Java source code with business logic and want to use Node.JS for small apps, so they can reuse it. Azul said it is hard because people have current tools in place. I wanted to ask why this over microservices. Azul and IBM both brought up that they think that is the future. Oracle said microservices are painful compared to just calling the data structures; Graal allows calling Java data structures from other languages now (see the sketch after this list). Azul teased him that it is not in prod yet.
  • R becoming more popular due to machine learning. What about speed? Oracle noted that R is very slow and interpreted, so Graal helps a lot.
  • What about calling C from Java? Oracle said Project Panama does that. A future version of Truffle will do that as well.
  • Who is working on optimizing regular bytecode? Source code knows more than the bytecode does, such as generics. IBM looked at it, but it creates a new problem – exploding templates – which uses more memory. Azul mourned Java 5 not going that route.
  • Javac converts lambdas to a virtual call. All three panelists immediately said the JVM can tune that.
  • Do IBM clients have prod experience with J9? Yes. Been a product for 20 years and upgraded regularly.
  • How does Oracle manage different versions? You need to pick a version of the language, not mix and match. You can use the interoperability, with each running in its own space.
  • How does IoT affect the memory footprint? IBM said Java might not be the right choice for a very memory-constrained environment; beyond that, a stripped-down JDK could be a good choice. Azul said Zulu Embedded goes into things like routers and printers. The current boundary is 11-20 MB of storage and mid-to-high tens of MB to run. Happy the JVM can’t run on light bulbs given the recent hack on light bulbs. Oracle is looking at what parts of the JRE are being used and turning those parts into machine code. It does contain GC, but not many other things. It has restrictions, so you can’t use things like generics/reflection.
  • Do any JVMs have a hard limit on memory used? Azul said yes and again teased Oracle about their product not being in production. Azul also has an elastic garbage collector, so the kernel gets memory back as soon as GC happens. IBM has softmx so the JVM doesn’t exceed the limit for the heap. Azul noted the problem is that JVMs keep dedicated padding because they might need it later. Providing shared padding gives this confidence – dynamically expand and shrink “insurance memory”. IBM has detection for idle resources so other processes can use that memory as headroom.
  • Is Java the right language for things that appear and go away due to the warm-up period – serverless? Azul said it should be and they are working on that problem now. Even with front loading, a lot of CPU is used on startup. They are working on almost-instant startup, but that is in the future. IBM saves the JIT state and profiled code to decrease startup time as well. You need to keep the JVM around for some length of time to minimize the effect of cold starts. Oracle said they can produce a quick start if you restrict the functionality used. Moving around the program becomes less expensive compared to moving around data. Azul said they don’t want to limit features. IBM said AOT is a great band-aid to solve the startup problem.
  • What happens when we reach a limit on the number of cores? Azul disagreed with the question, noting we’ve been hearing about the end of Moore’s Law for ages; speed over time is still increasing. Oracle said it is never enough, so people will want more machines.
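
Since calling Java data structures from other languages came up several times, here is a minimal sketch of what that looks like with the GraalVM polyglot API. It is my own illustration, not code shown at the session, and it assumes you are running on GraalVM with the JavaScript language installed; the class name and snippets are mine.

import org.graalvm.polyglot.Context;
import org.graalvm.polyglot.Value;

public class PolyglotSketch {
    public static void main(String[] args) {
        // allowAllAccess lets the guest language (JavaScript) call into host Java objects
        try (Context context = Context.newBuilder("js").allowAllAccess(true).build()) {
            // Evaluate a JavaScript expression and read the result back as a Java int
            Value result = context.eval("js", "[1, 2, 3].map(x => x * 2).reduce((a, b) => a + b)");
            System.out.println("JS result seen from Java: " + result.asInt());

            // Expose a Java collection to JavaScript and call one of its methods from JS
            context.getBindings("js").putMember("javaList", java.util.Arrays.asList("a", "b", "c"));
            Value size = context.eval("js", "javaList.size()");
            System.out.println("Java list size seen from JS: " + size.asInt());
        }
    }
}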

java @ speed: making the most of modern hardware – live blogging from qcon

Java @Speed – Making the most of modern hardware
Speaker: Gil Tene
See the list of all blog posts from the conference

duct tape engineering should only be done when absolutely necessary

We think of speed as a number. But it’s not a quality without a context. Are you fast when you deploy? When at peak load? When the market opens? When you actually trade? How long can you be fast in a row?

In Java, speed starts out slow when the app starts and gets faster until it reaches a steady point, because the code changes over time. It starts out purely interpreted and then optimizes after profiling. There are also GC pauses.
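
One way to see that ramp-up for yourself is to time the same method over repeated rounds; the later rounds are much faster once the JIT has compiled the hot code. This is only an illustrative sketch (the class name and iteration counts are arbitrary, not something from the talk); running it with -XX:+PrintCompilation also shows the compiler kicking in partway through.

public class WarmupDemo {
    // Arbitrary work so the JIT has something to profile and optimize
    static long sum(int n) {
        long total = 0;
        for (int i = 0; i < n; i++) {
            total += i * 31L;
        }
        return total;
    }

    public static void main(String[] args) {
        for (int round = 0; round < 10; round++) {
            long start = System.nanoTime();
            long result = sum(1_000_000);
            long elapsed = System.nanoTime() - start;
            // Early rounds run interpreted; later rounds are JIT-compiled and much faster
            System.out.printf("round %d: %d ns (result %d)%n", round, elapsed, result);
        }
    }
}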

Modern servers

  • Number of cores per chip has tripled
  • Instruction window keeps increasing
  • More parallelism each generation
  • Cache also increasing

Compilers

  • Can reorder code
  • Can remove dead code – nobody knows if the code actually ran, so the compiler can claim it did, just really fast.
  • Values can be propagated – remove temporary variables
  • Can remove redundant code
  • Reads can be cached – as if you extracted a variable. Use volatile if you need to avoid this (see the sketch after this list).
  • Writes can be eliminated – can skip the calculation if the value doesn’t change
  • Can inline method call
  • Also does clever tricks like checking for nulls only after a SEGV happens. If you turn out to throw a lot of null pointers, it deoptimizes to add a guard clause.
  • Class Hierarchy Analysis (CHA) – looks at whole code base for optimizations
  • Inlining works without final because the JIT knows there is no subclass. If a new subclass shows up, it deoptimizes at that time.
  • If it thinks there is only one subclass, it adds a guard clause and optimizes. The guard clause will trigger deoptimization if it is wrong.
  • Deoptimizations create slowdown spikes in performance even during the optimized phase. Warmup isn’t always enough because warm-up code might not hit all scenarios. “The one thing you haven’t done is trade.” So the first real trade is slow because it triggers a deoptimization.
  • Azul has a product that logs optimizations and re-loads them on startup from prior runs.
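
To make the read-caching bullet above concrete, here is a small self-contained sketch (the class and field names are mine, not from the talk). Without volatile, the JIT is free to hoist the read of the flag out of the loop and cache it, so the worker thread may never see the update and spin forever; marking the field volatile forces a fresh read on every iteration.

public class VolatileVisibility {
    // Try removing volatile: the optimized loop may cache the read and never terminate
    private static volatile boolean running = true;

    public static void main(String[] args) throws InterruptedException {
        Thread worker = new Thread(() -> {
            long count = 0;
            while (running) {   // volatile forces a fresh read each iteration
                count++;
            }
            System.out.println("stopped after " + count + " iterations");
        });
        worker.start();

        Thread.sleep(100);
        running = false;        // visible to the worker because the field is volatile
        worker.join();
    }
}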

Microbenchmarking is hard because some things are optimized away (like basic math). Use JMH from OpenJDK to microbenchmark, but still suspect everything.
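
As an example of why everything stays suspect, here is a minimal JMH sketch (the class and method names are my own illustration, not from the talk). The first benchmark computes a value that is never used, so the JIT can eliminate the work entirely and report an impossibly fast time; sinking the result into a Blackhole keeps the computation alive.

import org.openjdk.jmh.annotations.Benchmark;
import org.openjdk.jmh.annotations.Scope;
import org.openjdk.jmh.annotations.State;
import org.openjdk.jmh.infra.Blackhole;

@State(Scope.Thread)
public class MathBenchmark {
    private int x = 42;

    // The result is unused, so dead-code elimination can remove the multiplication
    @Benchmark
    public void unusedResult() {
        int y = x * x;
    }

    // Consuming the result in a Blackhole prevents the compiler from optimizing it away
    @Benchmark
    public void consumedResult(Blackhole bh) {
        bh.consume(x * x);
    }
}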

I like that he showed the assembly code and explained the relationship to a simple for loop.

The 8 Nights of Java – Night 1

Given the holiday season, we thought it would be fun to share our favorite (or least favorite) features from all 8 versions of Java that have been released to date. Some features, like generics and autoboxing/unboxing, were met with a lot of fanfare and have since changed the way we as developers write code. Others, like NIO.1 and RMI, are not nearly as popular today as originally envisioned. With that in mind, we’ll be posting one entry each night on a different version of Java, starting tonight with Java 1.

Oh, we want to wish all of our readers a Happy and Healthy Holiday, as well as a Wonderful New Year!

Jump to: [Night 1 | Night 2 | Night 3 | Night 4 | Night 5 | Night 6 | Night 7 | Night 8]

Java 1.0/1.1 Notable Features
Sun introduced Java 1.0 (codename Oak) on January 23, 1996, with a more stable Java 1.1 released in February of the following year. These versions included:

  • Compiler and JVM runtime environment
  • JDBC
  • The beginnings of reflection
  • Inner classes
  • Thread class

From Scott:

Java 1.0 released... well, Java! We all take for granted the ability to execute Java code on nearly every platform, but back when Java was first released the idea of compiling source code into byte code and running inside a virtual machine was absolutely revolutionary. It was one of the first languages to allow developers to work freely in any environment and deploy to any other environment. Before Java, programmers tended to use the same operating system, same IDE software, and same compiler to write software that often could only run on a handful of environments. Java helped foster the open source movement, allowing developers to work in Linux, Windows, MacOS, etc. and deploy to any system. At a time when hardware and software systems were much more heterogeneous than they are today, Java helped improve productivity and sharing across a wide variety of landscapes.

Of course, early on there were some problems. Microsoft released Visual J++ alongside Sun’s implementation which contained only a subset of Sun’s features, as well as additional features Microsoft wrote themselves. These differences almost splintered the Java landscape within the first few years, since Visual J++ was so different from Sun’s implementation. Luckily for us, Sun intervened, successfully suing Microsoft on the basis that it violated Sun’s license agreement by releasing a version of Java not compatible with other versions of Java, helping to solidify Java as a standards-based language. In hindsight, articles like “Microsoft’s J/Direct called death of Java” written in 1997 (and comical now) would be the first of many (including a famous interview with Steve Jobs) to incorrectly predict Java’s demise. Today, Java is used in over 3 billion devices worldwide.

My favorite part of Java 1.0? The fact that the Thread class was included right away. Multi-threaded programming was still somewhat new, especially since multi-core processors were still in their infancy. Providing a new language in which developers could process tasks in parallel was pretty forward thinking, even if our thread-based implementations weren’t always perfect. Today, we tend to rely on the Concurrency API given its feature-rich convenience and stability, but never forget it is built entirely upon the Thread class.

From Jeanne:

Version 1.0 included Vector. While we no longer use Vector for new code, it paved the way for ArrayList and the Collections framework. JDBC is one of my favorite libraries. I use a mix of raw JDBC, ORM, and Spring JDBC template these days, but JDBC started all of this. And then we have the parts of the language that stood the test of time. Plus, since Sun/Oracle find it hard to actually get rid of anything from the language, we also have fond deprecated memories such as Date’s getHours() method. I was still in high school when Java 1 launched. I never actually worked with it directly, as Java 1.2 was out before I even started reading about Java. When Java turned 10, it was cool to read Hello World(s) – From Code to Culture and see how Java got started. Or should I say how Oak got started?