Real-time memory moving in
First Published 16th July 2014
Java may not be an emerging technology, but a lot has been happening in the last few years to push the popular language towards lower latency.
Gil Tene, Azul Systems
"Java is fast, but with terrible, terrible pauses."
London - Panelists were on hand July 15 at an event hosted by Azul Systems to talk about how real-time business pressures are leading to the adoption of in-memory computing technologies.
Adding to those pressures are cheaper components against a backdrop of more efficient organisation and use of databases and storage, said panel moderator John Abbott, founder and analyst at 451 Research. "All of these things have got quite profound effects on all sorts of software layers," he added.
But does it make sense to consider Java within the scope of "real-time"?
Gil Tene, CTO of Azul Systems, said that the real-time embedded space is ultimately about reliability rather than speed. Business real-time, he added, is about using timely information that affects the way the business runs, for example in risk analytics. Moreover, decision processes built on the most current information are being adopted across sectors.
"We see it in trading, we see it in retail, we see it in inventory management, we see it across the board," he said. "What you are working on is based on current information."
Joining Tene on the panel were Peter Lawrey, CEO of technology advisory firm Higher Frequency Trading, and Mark Little, VP of Engineering at Red Hat and CTO of JBoss Middleware.
Lawrey works with hedge funds and said that predictability is extremely important to such clients.
"Real-time (in Java) just means that you concentrate on all seconds of the day. The worst second of the day is still acceptable (even if) it may be worse than the average," he said.
That might sound like anathema to the ultra-low-latency crowd. After all, FPGA technologies are prized in part for their deterministic outcomes. But not every trading firm requires light-speed trading, a nuance that extends into the big data space.
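Lawrey's emphasis on the worst second rather than the average is essentially a percentile view of latency. A minimal, hypothetical sketch of that framing (the class name, workload, and sample counts are illustrative, not anything the panel described) records every operation's latency and reports the tail alongside the median:

```java
import java.util.Arrays;

public class LatencyPercentiles {
    // Return the value at the given percentile (0.0 to 1.0) of a sorted copy.
    static long percentile(long[] samples, double p) {
        long[] sorted = samples.clone();
        Arrays.sort(sorted);
        int idx = Math.min(sorted.length - 1, (int) (p * sorted.length));
        return sorted[idx];
    }

    public static void main(String[] args) {
        long[] latenciesNanos = new long[10_000];
        for (int i = 0; i < latenciesNanos.length; i++) {
            long start = System.nanoTime();
            Math.sqrt(i);                          // placeholder workload
            latenciesNanos[i] = System.nanoTime() - start;
        }
        // The average hides the tail; the 99th percentile and the worst
        // observation are what a "worst second" view cares about.
        System.out.printf("p50=%dns p99=%dns worst=%dns%n",
                percentile(latenciesNanos, 0.50),
                percentile(latenciesNanos, 0.99),
                percentile(latenciesNanos, 1.00));
    }
}
```

In practice, latency-sensitive shops use histogram recorders rather than sorting raw arrays, but the question asked is the same: is the worst observation still acceptable?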
Red Hat's Mark Little said that in low latency, the more hops you have, the worse performance becomes. This, he added, is where a large in-memory solution helps: all the data is held in memory and can be serviced quickly -- a major departure from the cache mentality.
"Having the entire data set (available) means you do not expire your data and just keep the useful portion of it," he said.
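The contrast Little draws can be illustrated with a small, hypothetical Java sketch (the class name and capacities are invented for illustration): a bounded LRU cache silently expires entries once it fills, while a store sized for the entire data set never loses one.

```java
import java.util.HashMap;
import java.util.LinkedHashMap;
import java.util.Map;

public class CacheVsInMemory {
    // A capacity-bounded LRU cache: evicts the least-recently-used
    // entry whenever the size exceeds the capacity.
    static <K, V> Map<K, V> lruCache(int capacity) {
        return new LinkedHashMap<K, V>(16, 0.75f, true) {
            @Override
            protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
                return size() > capacity;
            }
        };
    }

    public static void main(String[] args) {
        Map<Integer, String> cache = lruCache(2);      // too small for the data
        Map<Integer, String> store = new HashMap<>();  // holds everything
        for (int i = 0; i < 3; i++) {
            cache.put(i, "value-" + i);
            store.put(i, "value-" + i);
        }
        System.out.println("cache has key 0? " + cache.containsKey(0)); // false: expired
        System.out.println("store has key 0? " + store.containsKey(0)); // true: retained
    }
}
```

With the whole data set resident, there is no eviction policy to tune and no miss path back to slower storage, which is the departure from cache mentality the panel described.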
It's these kinds of developments that are putting big data at the forefront as a major technological challenge to overcome. But Tene pointed out that "medium-sized data" shouldn't be ignored.
"I find these very interesting, middle-sized, respectably large but not too large data sets," Tene said. "Whether it is retail, or risk analysis, or order management systems - it's the same notion: the entire thing fits into your memory."
Getting developers to think in real-time, no matter the language, will require a shift in mindset, however. "How do we provide to developers the ability to think about time? Whether soft or hard, it needs to be there from the get-go," said Little.
Speaking to Automated Trader afterwards, Gil Tene said that Java deserves some of its bad reputation. "Java is fast, but with terrible, terrible pauses," he said. But when it comes to considerations such as cost and time-to-market, it beats the alternatives at a time when solutions for reliability are available, he added.
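One way to observe the pauses Tene describes is the "hiccup" technique popularised by tools such as Azul's own open-source jHiccup: a thread that does nothing but sleep should wake roughly on time, so any extra delay it measures is a stall the whole JVM experienced (garbage collection or otherwise). A minimal sketch, with an illustrative class name and timings:

```java
public class PauseWatcher {
    // Sleep for 1 ms repeatedly; any wake-up much later than expected
    // reflects a JVM-wide stall. Returns the worst excess delay seen.
    static long maxStallMillis(int iterations) throws InterruptedException {
        long worst = 0;
        for (int i = 0; i < iterations; i++) {
            long start = System.nanoTime();
            Thread.sleep(1);                                  // expect ~1 ms
            long actualMillis = (System.nanoTime() - start) / 1_000_000;
            worst = Math.max(worst, actualMillis - 1);        // excess over 1 ms
        }
        return worst;
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println("worst observed stall: " + maxStallMillis(500) + " ms");
    }
}
```

On a collector with long stop-the-world phases the worst stall can reach hundreds of milliseconds under allocation pressure, which is exactly the behaviour pauseless collectors aim to eliminate.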