Get ready for a revolution in enterprise computing. Until now, we've had flash memory and DRAM as distinct entities: one for system storage, one for system memory. Both might now use solid-state technologies, but in terms of architecture, capabilities, costs and applications, they couldn't be more different. DRAM has always been fast, highly resilient and connected to the CPU by the fastest-available interconnect, but it has also been expensive, capacity-constrained and volatile: the moment the power goes off, all the data is lost. Flash memory is cheaper per GB and wins on capacity, but even the fastest SSD technology can't compete with DRAM for bandwidth or overall read/write performance.
In the past, this wasn't a problem. When processors, applications and platform architecture were the compute bottlenecks, neither memory constraints nor SSD performance really held us back. Now, however, businesses are working with bigger datasets than ever before and striving to pull insights from that data in near-real time. They're investing in virtualisation, building flexibility and scalability into their IT, and working to maximise the utilisation of every CPU core and system resource. They're building private and hybrid clouds designed to run next-generation business applications; applications that transform customer relationships and optimise every aspect of their operations. The streams of data running through these systems are growing exponentially, and organisations need all the compute resources they can get their hands on to extract the maximum value.
That's not such a problem for the world's largest enterprises, which have the capital to splash out on huge pools of DRAM, running vast multi-terabyte databases in-memory or maxing out the capacity of their hardware. However, medium-sized and smaller enterprises don't have that choice. DRAM is too expensive. Flash isn't nearly fast enough. In the new era of computing, the gap between flash memory and DRAM is preventing them from getting where they need to go.
Intel Optane DC Persistent Memory fills that gap. Designed to work with Intel's next-generation Xeon Scalable processors, this new form of memory isn't just a big step forward, but potentially a bona fide game changer.
Developed from Intel and Micron's cutting-edge 3D XPoint memory technology, Intel Optane DC Persistent Memory combines many of the advantages of DRAM and flash. The revolutionary architecture gives you higher densities and capacities than DRAM at a significantly lower cost, yet it offers performance that's close to DRAM levels and far beyond anything conventional NAND-based flash technologies can deliver. The modules look and feel like DDR4 DIMMs, albeit high-end performance modules with an enterprise-grade heatsink. However, where DDR4 capacities would normally stop at 64GB (vastly expensive) or 128GB (close to extortionate), Intel Optane DC Persistent Memory DIMMs start at 128GB and go up to 512GB, at a fraction of the price.
What's more, as the name suggests, this new kind of memory is persistent. Run it in the first of its operating modes, Memory Mode, and it acts just like a large pool of DRAM: the Intel Xeon Scalable processor intelligently uses the actual DRAM to cache the most frequently accessed data while the Optane DC Persistent Memory delivers the bulk capacity. This is transparent at the application level, giving you the benefits of large memory capacity at a far more affordable cost. In this mode the memory behaves as ordinary, volatile system RAM; it's the second mode where persistence comes into play.
Run it in the second operating mode, App Direct Mode, and the operating system and applications work with the two types of memory as separate pools, directing operations that need the lowest latency to DRAM and data structures that need sheer capacity to the Optane DC Persistent Memory. In this case, the data held in the Optane DC Persistent Memory is retained even when the system is powered off or rebooted. The OS and applications will still need to be reloaded after a restart, but the data sitting in persistent memory, an in-memory database or an analytics working set, for example, does not have to be rebuilt from storage.
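To give a flavour of what App Direct Mode programming looks like, here is a minimal sketch in C using libpmem from Intel's Persistent Memory Development Kit (PMDK). The article doesn't prescribe a toolkit, so the library choice, the DAX mount point /mnt/pmem and the file name are assumptions for illustration only: the point is that the application writes to persistent memory with ordinary loads and stores, then flushes the data to make it durable.

/*
 * Illustrative sketch only: assumes PMDK's libpmem is installed and the
 * persistent memory is exposed as a DAX-mounted filesystem at /mnt/pmem
 * (hypothetical path). Build, for example: cc appdirect.c -o appdirect -lpmem
 */
#include <libpmem.h>
#include <stdio.h>
#include <string.h>

#define POOL_PATH "/mnt/pmem/example"   /* hypothetical DAX-backed file */
#define POOL_SIZE (64 * 1024 * 1024)    /* 64 MiB region for this demo */

int main(void)
{
    size_t mapped_len;
    int is_pmem;

    /* Map a file on the DAX filesystem straight into the address space:
     * no block-device I/O path, no page cache. */
    char *addr = pmem_map_file(POOL_PATH, POOL_SIZE, PMEM_FILE_CREATE,
                               0666, &mapped_len, &is_pmem);
    if (addr == NULL) {
        perror("pmem_map_file");
        return 1;
    }

    /* Write with a plain memory operation, as if this were DRAM. */
    const char *msg = "survives a reboot";
    strcpy(addr, msg);

    /* Flush CPU caches so the data is durable on the persistent media. */
    if (is_pmem)
        pmem_persist(addr, strlen(msg) + 1);
    else
        pmem_msync(addr, strlen(msg) + 1);

    pmem_unmap(addr, mapped_len);
    return 0;
}

In Memory Mode, by contrast, none of this is needed; the Optane DC Persistent Memory simply appears to software as a large pool of conventional RAM.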
What does this mean for enterprise computing? For a start, applications that involve both large datasets and heavyweight compute power can take advantage of an architecture that gives you huge capacity at the system memory level without the latency of moving data to and from the storage level. Artificial Intelligence, Machine Learning and High-Performance Computing applications will all see the benefits. We've already seen how organisations like the University of Pisa have harnessed the performance of Intel Optane storage, using memory-intensive undersampling techniques to shrink MRI examinations from forty minutes down to two. Intel Optane DC Persistent Memory should take this to another level.
It's a similar story with in-memory databases and big-data analytics, or any application where you want to operate on a large dataset in close to real time. Intel has claimed a 7x to 9.1x performance advantage when handling in-memory databases on an Intel Xeon Platinum 8170 with Optane DC Persistent Memory, as opposed to the same system with a more conventional DRAM-plus-SSD architecture. With Intel Optane DC Persistent Memory, you can put an entire Redis database in memory and get sub-millisecond latency without the vast expense of doing so with DRAM alone. This has led Redis CTO and co-founder Yiftach Shoolman to say: "We believe the next-generation server architecture will be all persistent memory. This is going to change the entire database market."
And the impact will be every bit as big on virtualisation and, in turn, private and hybrid cloud. At the moment, hardware servers offer a wealth of raw compute power but are held back by the constraints of the DRAM that supports them; you might have the cores and cycles to deploy more VMs or containers, but that means little if you don't have the RAM to do so. With Intel Optane DC Persistent Memory, you can add system memory in much greater densities, running up to three times as many virtual machines on the same hardware, or as many as four times the number of database containers. For organisations deploying their own private or hybrid cloud, it becomes much, much easier to create pools of compute and storage resources that can be provisioned and scaled on demand. And because this memory is persistent, shutting down or rebooting a server doesn't have to mean serious downtime: reboot times shrink from a matter of minutes to a matter of seconds.
This isn't simply changing the way enterprises run mission-critical, high-performance applications or operate virtualised IT, but also changing which enterprises can do so. By bringing down the cost of running some of the world's most memory-intensive applications, Intel Optane DC Persistent Memory puts them within reach of a much wider range of organisations. Startups and medium-sized enterprises aren't shut out of the AI and Machine Learning revolutions; they can harness next-generation analytics to compete. If Intel's new memory tech lives up to expectations, we could be seeing the start of a new era for the datacentre.