News
Amid the intensifying HBM race, Micron has secured a spot with its HBM3E 12H designed into NVIDIA’s GB300. Notably, according to its press release, the U.S. memory giant is also the only company shipping both HBM3E and SOCAMM memory for AI servers, reinforcing its leadership in low-power DDR for data centers.
According to The Korea Herald, Micron surprised the industry by announcing mass production of SOCAMM—dubbed the “second HBM”—ahead of SK hynix. At NVIDIA’s GTC 2025, SK hynix showcased its SOCAMM prototype, while Samsung had previously stated in an interview that it was working with clients on validation, hinting at future commercialization, the report adds.
Micron notes that its SOCAMM LPDDR5X, developed with NVIDIA, supports the GB300 Grace Blackwell Ultra Superchip. Meanwhile, its HBM3E 12H 36GB powers NVIDIA’s HGX B300 NVL16 and GB300 NVL72, while the 8H 24GB is used in the HGX B200 and GB200 NVL72.
The Korea Herald report notes that SOCAMM is an LPDDR-based memory module optimized for AI servers, with NVIDIA leading its standardization. Unlike the traditional DDR used in servers, SOCAMM adopts LPDDR for greater power efficiency, cutting energy consumption by two-thirds, per the report.
According to Micron’s press release, SOCAMM outperforms traditional RDIMMs with over 2.5x higher bandwidth, one-third the power consumption, and a significantly smaller 14 x 90 mm form factor, enabling compact and efficient server designs. It also offers the highest LPDDR5X capacity, at 128GB per module, enhancing AI model training and inference performance.
(Photo credit: Micron)