At its virtual Accelerated Data Center Premiere, AMD launched the new AMD Instinct™ MI200 series, billed as the first exascale-class GPU accelerators. The flagship of the line, the AMD Instinct™ MI250X, is rated as the world's fastest accelerator for high-performance computing (HPC) and artificial intelligence (AI).

Built on the AMD CDNA™ 2 architecture, the Instinct MI200 series accelerators promise application-level gains across a broad set of HPC workloads. The Instinct MI250X accelerator delivers up to a 4.9X performance improvement over competitive accelerators for double-precision (FP64) HPC applications, and surpasses 380 teraflops of peak theoretical half-precision (FP16) performance for AI workloads, enabling disruptive approaches to accelerating data-driven research.

"AMD Instinct MI200 accelerators deliver leadership HPC and AI performance, helping scientists make generational leaps in research that can dramatically shorten the time between initial hypothesis and discovery. With key innovations in architecture, packaging, and system design, the AMD Instinct MI200 series accelerators are the most advanced data center GPUs ever, providing exceptional performance for supercomputers and data centers to solve the world's most complex problems."

by Forrest Norrod, Senior Vice President and General Manager, Data Center and Embedded Solutions Business Group, AMD.

Exascale with AMD

In collaboration with the U.S. Department of Energy, Oak Ridge National Laboratory, and HPE, AMD designed the Frontier supercomputer, which is expected to deliver more than 1.5 exaflops of peak computing power. The system is powered by optimized 3rd Gen AMD EPYC™ CPUs and AMD Instinct MI250X accelerators. Frontier will push the boundaries of scientific discovery by dramatically enhancing AI, analytics, and simulation performance at scale, helping scientists pack in more calculations, identify new patterns in data, and develop innovative data analysis methods that accelerate the pace of scientific discovery.

Powering the Future of HPC

The AMD Instinct MI200 series accelerators, combined with 3rd Gen AMD EPYC CPUs and the ROCm™ 5.0 open software platform, are designed to propel new discoveries for the exascale era and tackle our most pressing challenges, from climate change to vaccine research.

Key features of the AMD Instinct MI200 series accelerators include:

  • AMD CDNA™ 2 architecture – 2nd Gen Matrix Cores accelerate FP64 and FP32 matrix operations, delivering up to 4X the peak theoretical FP64 performance of AMD's previous-gen GPUs.
  • Leadership packaging technology – An industry-first multi-die GPU design with 2.5D Elevated Fanout Bridge (EFB) technology delivers 1.8X more cores and 2.7X higher memory bandwidth than AMD's previous-gen GPUs, offering the industry's best aggregate peak theoretical memory bandwidth at 3.2 terabytes per second.
  • 3rd Gen AMD Infinity Fabric™ technology – Up to 8 Infinity Fabric links connect the AMD Instinct MI200 with 3rd Gen EPYC CPUs and other GPUs in the node to enable unified CPU/GPU memory coherency and maximize system throughput, allowing an easier on-ramp for CPU codes to tap the power of accelerators (a usage sketch follows this list).
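
As a rough illustration of what that on-ramp looks like in practice, the short HIP sketch below (not taken from AMD's announcement; the kernel, array size, and scale factor are invented for illustration) uses a single managed allocation that both the CPU and the GPU touch, the access pattern that hardware-coherent CPU/GPU memory is meant to make simple and fast:

```cpp
// Minimal sketch: one managed allocation shared by CPU and GPU, no explicit copies.
#include <hip/hip_runtime.h>
#include <cstdio>

__global__ void scale(double* data, double factor, size_t n) {
    size_t i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] *= factor;   // GPU updates memory the CPU also touches
}

int main() {
    const size_t n = 1 << 20;
    double* data = nullptr;
    // Single allocation visible to both CPU and GPU.
    hipMallocManaged(reinterpret_cast<void**>(&data), n * sizeof(double));
    for (size_t i = 0; i < n; ++i) data[i] = 1.0;     // CPU writes

    scale<<<(n + 255) / 256, 256>>>(data, 2.0, n);    // GPU reads and writes
    hipDeviceSynchronize();

    printf("data[0] = %f\n", data[0]);                // CPU reads the result: 2.0
    hipFree(data);
    return 0;
}
```

On nodes where the CPU and GPU share coherent memory over Infinity Fabric, this kind of code avoids the staging copies that a discrete-memory port would otherwise require, which is the "easier on-ramp" the feature list refers to.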

Software for Enabling Exascale Science

AMD ROCm™ is an open software platform that allows researchers to tap the power of AMD Instinct™ accelerators to drive new scientific discoveries. The ROCm™ platform is built on a foundation of open portability, supporting environments across multiple accelerator vendors and architectures. With ROCm 5.0, AMD extends its open platform powering top HPC and AI applications to the AMD Instinct MI200 series accelerators, increasing ROCm's accessibility for developers and delivering leadership performance across key workloads.
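
As a minimal sketch of what working against ROCm looks like (the file name and build line below are illustrative, not from AMD), a HIP program can simply enumerate whatever accelerators the runtime exposes; the same source also builds against HIP's CUDA backend, which is the portability the platform is built around:

```cpp
// Minimal sketch, assuming a machine with the ROCm/HIP runtime installed:
// list every accelerator the HIP runtime can see.
#include <hip/hip_runtime.h>
#include <cstdio>

int main() {
    int count = 0;
    if (hipGetDeviceCount(&count) != hipSuccess || count == 0) {
        printf("No HIP-capable devices found.\n");
        return 1;
    }
    for (int i = 0; i < count; ++i) {
        hipDeviceProp_t prop;
        hipGetDeviceProperties(&prop, i);
        printf("Device %d: %s, %d compute units, %.1f GiB memory\n",
               i, prop.name, prop.multiProcessorCount,
               prop.totalGlobalMem / (1024.0 * 1024.0 * 1024.0));
    }
    return 0;
}
```

Built with hipcc (for example, `hipcc device_query.cpp -o device_query`), this would report each visible GPU with its compute-unit count and memory size; on MI200 OAM parts, each graphics compute die typically shows up as its own device.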

"We are in a high-performance computing megacycle that is driving demand for more compute to power the services and devices that impact every aspect of our daily lives. We are building significant momentum in the data center with our leadership product portfolio, including Meta's adoption of AMD EPYC to power their infrastructure and the buildout of Frontier, the first U.S. exascale supercomputer, which will be powered by EPYC and AMD Instinct processors. In addition, today we announced a breadth of new products that build on that momentum in next-generation EPYC processors with new innovations in design, leadership 3D packaging technology, and 5nm high-performance manufacturing to further extend our leadership for cloud, enterprise, and HPC customers."

by Dr. Lisa Su, President and CEO, AMD.

Researchers, data scientists, and end users can use AMD's Infinity Hub to access, download, and install containerized HPC apps and ML frameworks that are optimized for AMD Instinct accelerators and ROCm. Currently, the hub provides access to a range of containers supporting Radeon Instinct™ MI50, AMD Instinct™ MI100, or AMD Instinct MI200 accelerators, including applications such as Chroma, CP2K, LAMMPS, NAMD, and OpenMM, along with the popular ML frameworks TensorFlow and PyTorch. New containers are added to the hub over time.

The following table shows the MI200 series specifications:

| Model | Compute Units | Stream Processors | FP64/FP32 Vector (Peak) | FP64/FP32 Matrix (Peak) | FP16/bf16 (Peak) | INT4/INT8 (Peak) | HBM2e ECC Memory | Memory Bandwidth | Form Factor |
|---|---|---|---|---|---|---|---|---|---|
| AMD Instinct MI250X | 220 | 14,080 | Up to 47.9 TF | Up to 95.7 TF | Up to 383.0 TF | Up to 383.0 TOPS | 128 GB | 3.2 TB/sec | OCP Accelerator Module |
| AMD Instinct MI250 | 208 | 13,312 | Up to 45.3 TF | Up to 90.5 TF | Up to 362.1 TF | Up to 362.1 TOPS | 128 GB | 3.2 TB/sec | OCP Accelerator Module |
Table 1: AMD MI200 Specifications.
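
As a rough check on where the vector figures in Table 1 come from (assuming the publicly listed peak engine clock of roughly 1.7 GHz, which this article does not state), peak theoretical throughput is simply stream processors × 2 FLOPs per fused multiply-add per clock × clock rate:

\[
14{,}080 \times 2 \times 1.7\,\text{GHz} \approx 47.9\ \text{TFLOPS (MI250X, FP64/FP32 vector)},
\qquad
13{,}312 \times 2 \times 1.7\,\text{GHz} \approx 45.3\ \text{TFLOPS (MI250)}.
\]

The remaining columns are then fixed ratios of the vector peak, consistent with the table above: the FP64/FP32 matrix figures are 2X the vector peak, and the FP16/bf16 (and INT4/INT8) figures are 8X, giving the 95.7 TF and 383.0 TF entries for the MI250X.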

The AMD Instinct MI250X and Instinct MI250 are available in the open-hardware compute accelerator module, or OCP Accelerator Module (OAM), form factor. The AMD Instinct MI210 will be available in a PCIe card form factor in OEM servers. The MI250X accelerator is available from HPE in the HPE Cray EX Supercomputer, and additional AMD Instinct MI200 series accelerators are expected in systems from major OEM and ODM partners in enterprise markets in Q1 2022, including ASUS, ATOS, Dell Technologies, Gigabyte, Hewlett Packard Enterprise (HPE), Lenovo, Penguin Computing, and Supermicro.