Simplifying Persistent Container Storage for the Open Hybrid Cloud
This ESG Technical Validation documents remote testing of Red Hat OpenShift Container Storage with a focus on the ease of use and breadth of data services. Containers have become an important part of data center modernization. They simplify building, packaging, and deploying applications, and are hardware agnostic and designed for agility—they can run on physical, virtual, or cloud infrastructure and can be moved around as needed.
Radio Free HPC: The Persistence of Memory
In this episode, we drill down on what Intel is doing with their cool Optane memory tech, shooting for speeds that remind you of memory, sizes that look like storage, and costs that make it look like a deal, with real byte-addressable persistent memory right inside the server – or block-addressable, if you want. This is a space that was bound to get filled and we’ve been watching the industry’s progress.
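The byte-versus-block distinction the hosts mention maps directly onto how software touches the device. As a minimal sketch (not from the episode; /mnt/pmem/demo is a hypothetical path and assumes a DAX-capable filesystem), the same file can be updated either through the block I/O path with a write syscall or, once memory-mapped, with plain CPU stores:

/* pm_modes.c - contrast block-addressable (write syscall) and
 * byte-addressable (mmap + store) access to the same file.
 * Assumes a file on a DAX-capable filesystem, e.g. /mnt/pmem/demo
 * (hypothetical path); on ordinary storage this still runs, but
 * through the page cache. */
#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <sys/mman.h>
#include <unistd.h>

#define LEN 4096

int main(void)
{
    int fd = open("/mnt/pmem/demo", O_CREAT | O_RDWR, 0666);
    if (fd < 0) { perror("open"); return 1; }
    if (ftruncate(fd, LEN) != 0) { perror("ftruncate"); return 1; }

    /* Block-addressable path: data moves through the kernel I/O stack. */
    const char blk[] = "written as a block";
    if (pwrite(fd, blk, sizeof(blk), 0) < 0) { perror("pwrite"); return 1; }

    /* Byte-addressable path: map the file and update it with CPU stores. */
    char *p = mmap(NULL, LEN, PROT_READ | PROT_WRITE, MAP_SHARED, fd, 0);
    if (p == MAP_FAILED) { perror("mmap"); return 1; }
    strcpy(p, "updated with a store");   /* no syscall per access */
    msync(p, LEN, MS_SYNC);              /* force the update to media */

    munmap(p, LEN);
    close(fd);
    return 0;
}

On genuine persistent memory, a MAP_SYNC mapping plus cache-line flushes would replace the msync call, but the contrast in access granularity is the same.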
Characteristics of Remote Persistent Memory – Performance, Capacity, or Locality?
Paul Grun from Cray gave this talk at the OpenFabrics Workshop in Austin. “Persistent Memory exhibits several interesting characteristics including persistence, capacity, and others. These (sometimes) competing characteristics may require system and server architects to make tradeoffs in system architecture. In this session, we explore some of those tradeoffs and take an early look at the emerging use cases for Remote Persistent Memory and how those may impact network architecture and API design.”
Video: Reimagining the Datacenter Memory and Storage Hierarchy
In this video from the DDN booth at SC18, Andrey Kudryavtsev from Intel presents: Reimagining the Data Centre Memory and Storage Hierarchy. “Intel Optane DC persistent memory represents a new class of memory and storage technology architected specifically for data center usage. One that we believe fundamentally breaks through some of the constricting methods for using data that have governed computing for more than 50 years.”
Video: Will Persistent Memory solve your Performance Problems?
In this video, Thomas Wilhelm shows software developers how they can analyze their applications to identify whether they will benefit from persistent memory. “Persistent memory is adding a completely novel memory tier to the memory hierarchy that fits between DRAM and SSDs. In this video, we will show software developers how they can analyze their application in order to identify if they will benefit from persistent memory.”
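The video covers that analysis with Intel's profiling tools; as a rough stand-in, one crude heuristic is to watch how per-access latency grows with working-set size, since workloads whose hot data spills far past cache and DRAM capacity are the ones a large persistent memory tier can help. A hypothetical pointer-chasing probe along those lines (an illustrative sketch, not the methodology from the video):

/* chase.c - crude working-set latency probe. If average access time
 * jumps sharply once the buffer outgrows the caches, the workload is
 * memory-bound and a larger memory tier may help. Illustrative only. */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

static double chase_ns(size_t n)
{
    size_t *next = malloc(n * sizeof(size_t));
    if (next == NULL) { perror("malloc"); exit(1); }

    /* Sattolo's shuffle builds a single random cycle, so each load
     * depends on the previous one and the prefetcher cannot help. */
    for (size_t i = 0; i < n; i++) next[i] = i;
    for (size_t i = n - 1; i > 0; i--) {
        size_t j = rand() % i;
        size_t t = next[i]; next[i] = next[j]; next[j] = t;
    }

    struct timespec a, b;
    size_t p = 0;
    clock_gettime(CLOCK_MONOTONIC, &a);
    for (size_t i = 0; i < n; i++) p = next[p];   /* dependent loads */
    clock_gettime(CLOCK_MONOTONIC, &b);
    volatile size_t sink = p; (void)sink;          /* keep the loop */

    free(next);
    return ((b.tv_sec - a.tv_sec) * 1e9 + (b.tv_nsec - a.tv_nsec)) / n;
}

int main(void)
{
    for (size_t n = 1 << 16; n <= (size_t)1 << 24; n <<= 2)
        printf("%10zu elements: %.1f ns/access\n", n, chase_ns(n));
    return 0;
}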
Realizing Exabyte-scale PM Centric Architectures and Memory Fabrics
Zvonimir Bandic from Western Digital gave this talk at the SNIA Persistent Memory Summit. “Much has been debated about what it would take to scale a system to exabyte main memory with the right levels of latencies to address the world’s growing and diverse data needs. This presentation will explore legacy distributed system architectures based on traditional CPU and peripheral attachment of persistent memory, scaled out through the use of RDMA networking.”
Persistent Memory Programming: The Current State of the Ecosystem
“In this presentation, Andy Rudoff from Intel reports on the latest developments around persistent memory programming. He’ll describe current discussions in the SNIA NVM Programming Technical Work Group, the current state of operating system support, recent tool and library development, and finally he’ll describe some of the upcoming challenges for high performance persistent memory use.”
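Much of that library development has landed in Intel's open source PMDK. A minimal sketch using its libpmem layer (assuming PMDK is installed; /mnt/pmem/hello is a hypothetical path on a DAX mount):

/* pmdk_hello.c - minimal libpmem example: map a persistent memory
 * file, write to it with CPU stores, and flush the stores to media.
 * Build: cc pmdk_hello.c -lpmem */
#include <libpmem.h>
#include <stdio.h>
#include <string.h>

#define POOL_LEN 4096

int main(void)
{
    size_t mapped_len;
    int is_pmem;

    /* Create (or open) the file and map it into our address space. */
    char *addr = pmem_map_file("/mnt/pmem/hello", POOL_LEN,
                               PMEM_FILE_CREATE, 0666,
                               &mapped_len, &is_pmem);
    if (addr == NULL) { perror("pmem_map_file"); return 1; }

    strcpy(addr, "hello, persistent memory");

    /* On true pmem, pmem_persist flushes the CPU caches; otherwise
     * fall back to msync-based persistence. */
    if (is_pmem)
        pmem_persist(addr, mapped_len);
    else
        pmem_msync(addr, mapped_len);

    pmem_unmap(addr, mapped_len);
    return 0;
}

pmem_map_file reports via is_pmem whether the mapping is real persistent memory, letting the same code fall back to msync-based durability on ordinary storage.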
Video: How Persistent Memory Will Bring an Entirely New Structure to Large Data Computing
“As data proliferation continues to explode, computing architectures are struggling to get the right data to the processor efficiently, both in terms of time and power. But what if the best solution to the problem is not faster data movement, but new architectures that can essentially move the processing instructions into the data? Persistent memory arrays present just such an opportunity. Like any significant change, however, there are challenges and obstacles that must be overcome. Industry veteran Steve Pawlowski will outline a vision for the future of computing and why persistent memory systems have the potential to be more revolutionary than perhaps anyone imagines.”