Remember when object instantiation was such a big deal that EJBs made sense? Allocating a new object cost so many CPU cycles that it made sense to cache objects in a pool and swap state in and out of them. I once saw a system that couldn’t stand up under load because of Class.newInstance.
So, I’ve done a lot with Java. Enough to have wandered into the depths of the JVM more than a few times. And friends, I’m never going back.
RAM and virtualized CPUs, cheaply and readily available through your cloud provider of choice, have made the JVM as obsolete as EJB. Node.js is one of a new generation of single-threaded execution environments helping to serve content.
Intelligent memory allocation strategies and horizontal CPU capacity allow for a stack of functions, each with everything it needs right in memory, maximizing CPU cycles across as many CPUs as it takes. Getting things done as fast as possible.
This runtime environment also accurately models what’s happening at the hardware level. Making an HTTP request in code queues up some state on the bus; the CPU flips a bit to notify the network adapter to make the call; the network adapter reads the packet off the bus and sends it on its way. Later, the network adapter raises an interrupt to notify the CPU, and the response is delivered back to the code.
Linux I/O has been asynchronous for many years, and that asynchrony has been making its way up the stack. Node.js pairs the V8 engine with libuv’s event loop to accurately model the underlying process, while taking away most concurrency concerns.
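The event-loop model is what removes those concurrency concerns: synchronous code always runs to completion before any queued callback gets a turn, so there is never another thread mutating your state mid-function. A minimal sketch:

```javascript
// Synchronous code runs to completion before any queued callback.
const order = [];

order.push('start');

// setTimeout queues a callback on the event loop; even with a 0 ms
// delay it cannot run until the current synchronous pass finishes.
setTimeout(() => order.push('timer callback'), 0);

order.push('end of synchronous code');
// At this point the timer callback has not run yet.
```

Only one piece of JavaScript executes at a time, so no locks, no races, no `synchronized`.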
Giving up threads isn’t necessarily a bad thing: add some continuous integration/continuous deployment on top of one of the auto-scaling cloud providers mentioned above, and we’re starting to achieve a very high-velocity tech environment.
Tim Fulmer can be reached at https://www.linkedin.com/in/timfulmer