AMD’s next-generation Instinct ‘Aldebaran’ compute graphics cards, based on the CDNA 2 architecture, have begun shipping, as detailed in AMD’s Q2 earnings report presentation.
The Power Of Up To 256 Compute Units & 128 GB Of HBM2E Is In The Hands Of Customers With The AMD Instinct MI200 ‘Aldebaran’ GPU
The AMD Instinct MI200 ‘Aldebaran’ is the successor to the Instinct MI100 ‘Arcturus’ and is already making its way into customers’ hands. At the bottom of the slide, AMD says,
Initial shipments of next-generation AMD Instinct accelerators featuring 2nd Gen CDNA architecture.
The AMD CDNA 2 architecture will power the next-generation AMD Instinct HPC accelerators. We know that one of those accelerators will be the MI200, which will feature the Aldebaran GPU. It is a very powerful chip and the first GPU to feature an MCM design. The Instinct MI200 competes against Intel’s 7nm Ponte Vecchio and NVIDIA’s refreshed Ampere parts. Intel and NVIDIA are also following the MCM route for their next-generation HPC accelerators, but it looks like Ponte Vecchio is going to be available in 2022, and the same can be said for NVIDIA’s next-gen HPC accelerator, as their own roadmap showed.
Inside the AMD Instinct MI200 is an Aldebaran GPU featuring two dies, a secondary and a primary. Each of the two dies consists of 8 shader engines, for a total of 16 SEs. Each shader engine packs 16 CUs with full-rate FP64, packed FP32, and a 2nd Generation Matrix Engine for FP16 & BF16 operations. Each die, as such, consists of 128 compute units, or 8,192 stream processors. That adds up to a total of 256 compute units, or 16,384 stream processors, for the entire chip. The Aldebaran GPU is also powered by a new XGMI interconnect. Each chiplet features a VCN 2.6 engine and the main IO controller.
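The shader-engine arithmetic above can be sanity-checked with a short sketch. The die and SE counts are the figures stated in this article; the 64 stream processors per CU is the conventional GCN/CDNA compute-unit width and is an assumption here, not something the slide spells out.

```python
# Back-of-the-envelope check of the Aldebaran shader math described above.
DIES = 2                      # primary + secondary die (per the article)
SHADER_ENGINES_PER_DIE = 8    # 16 SEs total across both dies
CUS_PER_SHADER_ENGINE = 16
SPS_PER_CU = 64               # assumption: standard GCN/CDNA CU width

cus_per_die = SHADER_ENGINES_PER_DIE * CUS_PER_SHADER_ENGINE
sps_per_die = cus_per_die * SPS_PER_CU
total_cus = DIES * cus_per_die
total_sps = DIES * sps_per_die

print(f"{cus_per_die} CUs / {sps_per_die} SPs per die")
print(f"{total_cus} CUs / {total_sps} SPs for the full chip")
```

Running this reproduces the article’s per-die (128 CUs, 8,192 SPs) and full-chip (256 CUs, 16,384 SPs) totals.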
As for DRAM, AMD has gone with an 8-channel design consisting of 1024-bit interfaces for an 8192-bit wide bus. Each interface can support 2 GB HBM2e DRAM modules. This would give us up to 16 GB of HBM2e memory capacity per stack, and since there are eight stacks in total, the total capacity would be a whopping 128 GB. That is 48 GB more than the A100, which houses 80 GB of HBM2e memory. The full visualization of the Aldebaran GPU on the Instinct MI200 is available here.
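The memory figures can be tallied the same way. The stack count, interface width, and 2 GB module size come from the article; the eight modules per stack (an 8-Hi stack) is an assumption inferred from the stated 16 GB-per-stack capacity.

```python
# Sketch of the HBM2e bus-width and capacity arithmetic from the article.
STACKS = 8
BITS_PER_STACK_INTERFACE = 1024
MODULE_CAPACITY_GB = 2
MODULES_PER_STACK = 8   # assumption: 8-Hi stacks, giving 16 GB per stack
A100_CAPACITY_GB = 80

bus_width_bits = STACKS * BITS_PER_STACK_INTERFACE      # 8192-bit bus
gb_per_stack = MODULES_PER_STACK * MODULE_CAPACITY_GB   # 16 GB per stack
total_gb = STACKS * gb_per_stack                        # 128 GB total
advantage_gb = total_gb - A100_CAPACITY_GB              # vs. the A100

print(f"{bus_width_bits}-bit bus, {total_gb} GB total, +{advantage_gb} GB over A100")
```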
With customers already receiving the AMD Instinct MI200 ‘Aldebaran’, AMD has cemented itself as the first to ship graphics cards with a Multi-Chip-Module design and is expected to extend this design approach to consumer cards with RDNA 3 in 2022.