While expanding HBM capacity at its Cheongju M15X fab, SK hynix may have locked in another major client in addition to NVIDIA. According to South Korean media outlet The Elec, the memory giant has reportedly landed a substantial order to supply HBM to Broadcom.
Sources cited by The Elec indicate that SK hynix is set to deliver HBM to Broadcom in the latter half of 2025, and the U.S. chipmaker will likely use the memory chips in AI computing solutions for a Big Tech client.
According to the report, SK hynix initially planned to increase its 1b DRAM production capacity to 140K–150K 300mm wafers per month by 2025. However, the recent deal with Broadcom is expected to push this target higher, to 160K–170K 300mm wafers per month.
Notably, The Elec adds that SK hynix may postpone the installation of equipment for 1c DRAM, the successor to 1b DRAM, in order to address the surge in demand driven by Broadcom.
Major CSPs have become increasingly ambitious about building their own AI infrastructure, driving demand for AI-specific ASICs and HBM. For instance, Google has reportedly integrated HBM3E into Trillium, its self-developed 6th-generation TPU, while AWS uses HBM in its self-developed Trainium chips for AI training.
Earlier this month, Broadcom announced it is collaborating with three major cloud service providers – likely Google, Meta, and ByteDance – on the development of AI chips, as noted by The Elec.
(Photo credit: SK hynix)