HPC News Bytes Podcast for 20230731: AWS’s GPU-Laden P5 Instance; TACC’s Stampede-3; Micron’s 24GB HBM3; Cineca’s ‘White Space’ Infrastructure

As August beckons, let’s take a quick (4:14) look at the highlights of the latest news in HPC, AI, quantum and other advanced technologies. This week, Shahin and Doug discuss: the AWS EC2 P5 cloud instance with Nvidia H100 GPUs and AMD Milan CPUs; TACC’s Stampede-3, a “mini” Intel Aurora with Cornelis Networks’ Omni-Path Express fabric; Micron’s 8-high 24GB HBM3; and Cineca’s “white space” supercomputing infrastructure strategy.
SK hynix Says It Has Developed First 12-Layer HBM3
Seoul, April 20, 2023 – SK hynix Inc. announced today that it has become the industry’s first to develop a 12-layer HBM3 product with a 24 gigabyte (GB) memory capacity, currently the largest in the industry, and said customers’ performance evaluation of samples is underway. HBM (High Bandwidth Memory) is a high-performance memory that vertically interconnects multiple […]
SK hynix to Supply HBM3 DRAM to NVIDIA
Seoul, June 9, 2022 – SK hynix announced that it has begun mass production of HBM3, a high-performance memory that vertically interconnects multiple DRAM chips “and dramatically increases data processing speed in comparison to traditional DRAM products,” the company said. HBM3 DRAM is the 4th-generation HBM product, succeeding HBM (1st generation), HBM2 (2nd generation) and HBM2E […]