U.S. memory giant Micron Technology has begun mass production of its high-bandwidth memory "HBM3e," which will be used in NVIDIA's latest AI chips.
Micron stated on February 26th that its HBM3e consumes 30% less power than competing products, meeting the demands of generative AI applications. Micron's 24GB 8H HBM3e will be part of NVIDIA's "H200" Tensor Core GPUs, ending SK hynix's previous position as the sole HBM supplier, which it held for the H100.
TrendForce's earlier research into the HBM market indicates that NVIDIA plans to diversify its HBM suppliers for more robust and efficient supply chain management. The progress of HBM3e, as outlined in the timeline below, shows that Micron provided its 8hi (24GB) samples to NVIDIA by the end of July, SK hynix in mid-August, and Samsung in early October.
According to NVIDIA's announcement last year, the H200 is scheduled to ship in the second quarter of 2024, surpassing the H100 as the company's most powerful GPU in terms of computing power. Micron's February 26th press release confirmed that it will begin shipping its 24GB 8H HBM3e in the second calendar quarter of 2024.
In the same press release, Micron's Chief Business Officer, Sumit Sadana, stated: "AI workloads are heavily reliant on memory bandwidth and capacity, and Micron is very well-positioned to support the significant AI growth ahead through our industry-leading HBM3e and HBM4 roadmap, as well as our full portfolio of DRAM and NAND solutions for AI applications."
HBM is one of Micron's most profitable products, due in part to its complex manufacturing process. Micron previously predicted that HBM revenue could reach hundreds of millions of dollars in 2024, with further growth expected in 2025.
Micron has also announced that it will share more about its AI memory portfolio and roadmaps at NVIDIA's GPU Technology Conference (GTC) on March 18th.
Previously, in a December 2023 conference call with investors, Micron indicated that generative AI could usher in a multi-year growth period for the company, with memory industry revenue projected to reach historic highs in 2025. The company also mentioned at the time that HBM3e, developed for NVIDIA's H200, had entered its final quality control phase.
(Photo credit: Micron)