
[News] Latest Updates on HBM from the Leading Three Global Memory Manufacturers


2024-01-30 Semiconductors editor

Amid the AI trend, the significance of high-value-added DRAM represented by HBM continues to grow.

HBM (High Bandwidth Memory) is a high-performance DRAM built from vertically stacked dies, offering higher bandwidth, larger capacity, lower latency, and lower power consumption than traditional DRAM chips. It accelerates AI data processing and is particularly well suited to high-performance computing workloads such as ChatGPT, which is why memory giants have prized it in recent years.
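
To put the bandwidth advantage in perspective, a rough back-of-the-envelope sketch is shown below. The 1,024-bit per-stack interface and the per-pin data rates are commonly cited figures for HBM2e/HBM3/HBM3e-generation parts, not numbers taken from this article; actual products vary by vendor and speed bin.

# Rough sketch: peak per-stack bandwidth estimate for an HBM-class device.
# Assumes a 1,024-bit interface per stack (standard across HBM generations)
# and commonly cited per-pin data rates; real parts differ by vendor/bin.

def hbm_stack_bandwidth_gbs(interface_width_bits: int, pin_rate_gbps: float) -> float:
    """Return approximate peak per-stack bandwidth in GB/s."""
    return interface_width_bits * pin_rate_gbps / 8  # bits -> bytes

if __name__ == "__main__":
    print(hbm_stack_bandwidth_gbs(1024, 3.6))  # HBM2e: ~461 GB/s per stack
    print(hbm_stack_bandwidth_gbs(1024, 6.4))  # HBM3:  ~819 GB/s per stack
    print(hbm_stack_bandwidth_gbs(1024, 9.6))  # HBM3e: ~1.2 TB/s per stack (top bins)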

Memory is also one of Korea’s pillar industries, and to seize the AI opportunity and drive the development of its memory industry, Korea has recently designated HBM as a national strategic technology.

The designation brings tax incentives for companies such as Samsung Electronics: small and medium-sized enterprises in Korea are eligible for tax reductions of 40% to 50%, while large enterprises such as Samsung Electronics qualify for reductions of 30% to 40%.

Overview of HBM Development Progress Among Top Manufacturers

The HBM market is currently dominated by three memory giants: Samsung, SK hynix, and Micron. Since the first silicon-interposer HBM product was introduced in 2014, the technology has advanced through successive generations, from HBM and HBM2 to HBM2E, HBM3, and HBM3e.

According to research by TrendForce, the mainstream HBM in the market in 2023 was HBM2e, the specification used in NVIDIA’s A100/A800, AMD’s MI200, and most CSPs’ self-developed accelerator chips. To meet the evolving demands of AI accelerator chips, manufacturers plan to launch new products such as HBM3e in 2024, with HBM3 and HBM3e expected to become the market mainstream.

As for HBM3e progress, Micron provided its 8hi (24GB) samples to NVIDIA at the end of July 2023, SK hynix in mid-August, and Samsung in early October.

As for the higher-spec HBM4, TrendForce expects it to launch in 2026. With the push for higher computational performance, HBM4 is set to expand from the current 12-layer (12hi) stacks to 16-layer (16hi) stacks, spurring demand for new hybrid bonding techniques. HBM4 12hi products are slated for a 2026 launch, with 16hi models following in 2027.

Meeting Demand, Manufacturers Actively Expand HBM Production

As companies like NVIDIA and AMD continue to introduce high-performance GPU products, the three major manufacturers are actively planning the mass production of HBM with corresponding specifications.

Previously, media reports highlighted Samsung’s efforts to expand HBM production capacity by acquiring certain buildings and equipment within Samsung Display’s Cheonan facility.

Samsung plans to establish a new packaging line at the Cheonan plant dedicated to large-scale HBM production. The company has already invested KRW 10.5 trillion to acquire the aforementioned buildings and equipment, and plans an additional investment of KRW 700 billion to KRW 1 trillion.

Micron Technology’s Taichung Fab 4 in Taiwan was officially inaugurated in early November 2023. Micron stated that Taichung Fab 4 would integrate advanced probing and packaging testing functions to mass-produce HBM3e and other products, thereby meeting the increasing demand for various applications such as artificial intelligence, data centers, edge computing, and the cloud. The company plans to start shipping HBM3e in early 2024.

In its latest financial report, SK hynix stated that in its 2023 DRAM business, revenue from its main products, DDR5 DRAM and HBM3, grew more than fourfold and fivefold, respectively, compared with the previous year.

At the same time, in response to the growing demand for high-performance DRAM, SK hynix will proceed with mass production of HBM3e for AI applications and continue research and development of HBM4.


(Photo credit: SK Hynix)
