Recently, the South Korean media outlet Alphabiz reported that Samsung may become the exclusive supplier of 12-layer HBM3e to NVIDIA.
The report indicates that NVIDIA is set to commence large-scale purchases of Samsung Electronics’ 12-layer HBM3e as early as September, with Samsung acting as the sole provider of the 12-layer HBM3e to NVIDIA.
According to the Alphabiz report, NVIDIA CEO Jensen Huang left his signature “Jensen Approved” on a physical 12-layer HBM3e product from Samsung Electronics at GTC 2024, which appears to signal NVIDIA’s recognition of Samsung’s HBM3e product.
HBM is characterized by high bandwidth, high capacity, low latency, and low power consumption. With the surge of the artificial intelligence (AI) industry, the accelerating adoption of large AI models has driven continuous growth in demand for high-performance memory.
According to TrendForce’s data, HBM accounted for approximately 8.4% of total DRAM industry value in 2023, a share projected to expand to 20.1% by the end of 2024.
TrendForce Senior Vice President Avril Wu notes that by the end of 2024, the DRAM industry is expected to allocate approximately 250K wafers per month (about 14% of total capacity) to HBM TSV production, with estimated annual supply bit growth of around 260%.
HBM3e: Three Major Original Manufacturers Kick off Fierce Rivalry
Since the debut of the world’s first TSV HBM product in 2014, HBM memory technology has iterated to HBM3e over nearly a decade of development.
From the perspective of original manufacturers, competition in the HBM3e market revolves primarily around Micron, SK Hynix, and Samsung. The three reportedly provided 8-hi (24GB) samples in late July, mid-August, and early October 2023, respectively, and this year they have kicked off fierce competition in the HBM3e market by introducing their latest products.
On February 27th, Samsung announced the launch of its first 12-layer stacked HBM3e DRAM, the HBM3e 12H, its largest-capacity HBM product to date with up to 36GB. Samsung stated that it has begun offering HBM3e 12H samples to customers and anticipates starting mass production in the second half of this year.
In early March, Micron announced that it had commenced mass production of its HBM3e solution. The company stated that the NVIDIA H200 Tensor Core GPU will adopt Micron’s 8-layer stacked HBM3e memory with 24GB capacity, with shipments set to begin in the second quarter of 2024.
On March 19th, SK Hynix announced that it had begun large-scale production of HBM3e, its new ultra-high-performance memory product designed for AI applications, marking the world’s first supply of HBM3e, currently the highest-performance DRAM available, to customers.
A previous report from TrendForce has indicated that, starting in 2024, the market’s attention will shift from HBM3 to HBM3e, with expectations for a gradual ramp-up in production through the second half of the year, positioning HBM3e as the new mainstream in the HBM market.
TrendForce reports that SK hynix led the way with its HBM3e validation in the first quarter, closely followed by Micron, which plans to start distributing its HBM3e products toward the end of the first quarter, in alignment with NVIDIA’s planned H200 deployment by the end of the second quarter.
Samsung, slightly behind in sample submissions, is expected to complete its HBM3e validation by the end of the first quarter, with shipments rolling out in the second quarter. With Samsung having already made significant strides in HBM3 and its HBM3e validation expected to be completed soon, the company is poised to significantly narrow the market share gap with SK Hynix by the end of the year, reshaping the competitive dynamics in the HBM market.
(Photo credit: SK Hynix)