News
During the “SEMICON Korea 2024” event held recently in Seoul, Chun-hwan Kim, Vice President of global memory giant SK hynix, revealed that the company’s HBM3e has entered mass production, with plans to commence large-scale production of HBM4 in 2026, according to a report from Business Korea.
He noted that with the advent of the AI computing era, generative AI is advancing rapidly, with the market expected to grow 35% annually. This rapid growth demands large volumes of higher-performance AI chips, further driving demand for higher-bandwidth memory.
He further commented that the semiconductor industry would face intense competition for survival this year as it works to meet growing customer demand for memory.
Kim also projected that the HBM market would grow by 40% by 2025, with SK hynix already strategically positioning itself in the market and planning to commence production of HBM4 in 2026.
Meanwhile, previous reports have indicated that SK hynix plans to establish an advanced packaging facility in the state of Indiana, USA, to meet the demands of American companies, including NVIDIA.
Driven by the wave of AI advancement and demand from China, South Korea’s Ministry of Trade, Industry and Energy recently announced that the country’s semiconductor exports rebounded in 2024. In January, exports reached approximately USD 9.4 billion, a year-on-year increase of 56.2% and the largest increase in 73 months.
TrendForce has previously reported on the progress of HBM3e, with its timeline showing that SK hynix provided 8hi (24GB) samples to NVIDIA in mid-August.
(Photo credit: SK hynix)
News
Amid the AI trend, the significance of high-value-added DRAM, exemplified by HBM, continues to grow.
HBM (High Bandwidth Memory) is a stacked DRAM technology that offers higher bandwidth, higher capacity, lower latency, and lower power consumption than traditional DRAM chips. It accelerates AI data processing and is particularly well suited to high-performance computing workloads such as ChatGPT, making it highly valued by memory giants in recent years.
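To illustrate where the bandwidth advantage comes from, here is a back-of-the-envelope sketch. The interface widths and per-pin rates below are assumed figures drawn from commonly published specs (a 1,024-bit HBM3e stack interface at roughly 9.6 Gbps per pin, versus a 32-bit DDR5 channel at 6.4 Gbps), not numbers from this article:

```python
def peak_bandwidth_gbs(bus_width_bits: int, pin_rate_gbps: float) -> float:
    """Peak theoretical bandwidth in GB/s: pin count x per-pin rate / 8 bits per byte."""
    return bus_width_bits * pin_rate_gbps / 8

# One HBM3e stack: 1,024-bit interface at ~9.6 Gbps per pin (assumed figures)
hbm3e_stack = peak_bandwidth_gbs(1024, 9.6)

# One DDR5 channel: 32 data bits at 6.4 Gbps per pin (assumed figures)
ddr5_channel = peak_bandwidth_gbs(32, 6.4)

print(f"HBM3e stack:  {hbm3e_stack:.1f} GB/s")
print(f"DDR5 channel: {ddr5_channel:.1f} GB/s")
print(f"Ratio: {hbm3e_stack / ddr5_channel:.0f}x")
```

The wide, slow-clocked interface made possible by die stacking and a silicon interposer is what lets a single HBM stack deliver on the order of a terabyte per second.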
Memory is also one of Korea’s pillar industries, and to seize the AI opportunity and drive the development of the memory industry, Korea has recently designated HBM as a national strategic technology.
The country will provide tax incentives to companies like Samsung Electronics: small and medium-sized enterprises can receive reductions of 40% to 50%, while large enterprises like Samsung Electronics can receive reductions of 30% to 40%.
Overview of HBM Development Progress Among Top Manufacturers
The HBM market is currently dominated by three major memory giants: Samsung, SK hynix, and Micron. Since the introduction of the first silicon-interposer HBM product in 2014, HBM technology has iterated from HBM through HBM2 and HBM2E to HBM3 and HBM3e.
According to research by TrendForce, the mainstream HBM in the market in 2023 was HBM2e, which includes the specifications used in NVIDIA’s A100/A800, AMD’s MI200, and most CSPs’ self-developed acceleration chips. To meet the evolving demands of AI accelerator chips, various manufacturers plan to launch new products such as HBM3e in 2024, with HBM3 and HBM3e expected to become the market norm.
TrendForce’s timeline of HBM3e progress shows that Micron provided its 8hi (24GB) samples to NVIDIA by the end of July, SK hynix in mid-August, and Samsung in early October.
As for the higher-spec HBM4, TrendForce expects its potential launch in 2026. With the push for higher computational performance, HBM4 is set to expand from the current 12-layer (12hi) to 16-layer (16hi) stacks, spurring demand for new hybrid bonding techniques. HBM4 12hi products are set for a 2026 launch, with 16hi models following in 2027.
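The capacity gain from taller stacks is straightforward to estimate. A minimal sketch, assuming 24 Gb (3 GB) per DRAM die, an HBM3e-class density rather than a confirmed HBM4 figure:

```python
def stack_capacity_gb(layers: int, gb_per_die: int = 3) -> int:
    """Total HBM stack capacity; assumes 24 Gb (3 GB) per DRAM die,
    an HBM3e-class density, not a confirmed HBM4 spec."""
    return layers * gb_per_die

for layers in (8, 12, 16):
    print(f"{layers}hi stack: {stack_capacity_gb(layers)} GB")
```

Under that assumption, an 8hi stack works out to 24 GB, matching the 8hi (24GB) samples mentioned above, while moving from 12hi to 16hi lifts per-stack capacity from 36 GB to 48 GB.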
Meeting Demand, Manufacturers Actively Expand HBM Production
As companies like NVIDIA and AMD continue to introduce high-performance GPU products, the three major manufacturers are actively planning the mass production of HBM with corresponding specifications.
Previously, media reports highlighted Samsung’s efforts to expand HBM production capacity by acquiring certain buildings and equipment within Samsung Display’s Cheonan facility.
Samsung plans to establish a new packaging line at the Cheonan plant dedicated to large-scale HBM production. The company has already invested KRW 10.5 trillion to acquire these buildings and equipment and plans an additional investment of KRW 700 billion to KRW 1 trillion.
Micron Technology’s Taichung Fab 4 in Taiwan was officially inaugurated in early November 2023. Micron stated that Taichung Fab 4 would integrate advanced probing and packaging testing functions to mass-produce HBM3e and other products, thereby meeting the increasing demand for various applications such as artificial intelligence, data centers, edge computing, and the cloud. The company plans to start shipping HBM3e in early 2024.
In its latest financial report, SK hynix stated that in 2023 its main DRAM products, DDR5 and HBM3, saw revenues grow more than fourfold and fivefold, respectively, compared to the previous year.
At the same time, in response to growing demand for high-performance DRAM, SK hynix will proceed with mass production of HBM3e for AI applications and with research and development of HBM4.
(Photo credit: SK Hynix)
News
Major Cloud Service Providers (CSPs) continue to drive increased demand for AI servers over the next two years. TrendForce’s latest projections indicate global shipments of approximately 1.18 million AI servers in 2023, year-on-year growth of 34.5%. The trend is expected to persist into the following year, with estimated annual growth of around 40.2%, at which point AI servers would constitute over 12% of total server shipments.
NVIDIA, with key products including AI-accelerating GPUs and the AI server reference architecture HGX, currently holds the highest market share in the AI sector. However, it is worth monitoring CSPs developing their own chips and, in the case of Chinese companies restricted by U.S. sanctions, expanding investment in self-developed ASICs and general-purpose AI chips.
According to TrendForce data, AI servers equipped with NVIDIA GPUs accounted for approximately 65.1% of the market this year, projected to decrease to 63.5% next year. In contrast, servers featuring AMD GPUs and CSP self-developed chips are expected to increase to 8.2% and 25.4%, respectively, in the coming year.
Another critical component, HBM (High Bandwidth Memory), is primarily supplied by Samsung, SK hynix, and Micron, with market shares of approximately 47.5%, 47.5%, and 5.0%, respectively, this year. As HBM is priced five to eight times higher than DDR4/DDR5, this premium is expected to contribute to a staggering 172% year-on-year revenue growth in the HBM market in 2024.
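Revenue growth like this decomposes into bit-shipment growth times average-selling-price change. The 172% figure is TrendForce’s; the split below into a doubling of bit shipments and a ~36% ASP rise is purely a hypothetical combination chosen to illustrate the arithmetic, not reported data:

```python
def revenue_growth(bit_growth: float, asp_change: float) -> float:
    """Year-on-year revenue growth implied by bit-shipment growth and ASP change."""
    return (1 + bit_growth) * (1 + asp_change) - 1

# Hypothetical split: bit shipments double (+100%) while ASP rises 36%
growth = revenue_growth(1.0, 0.36)
print(f"{growth:.0%}")  # -> 172%
```

The multiplicative form explains how a price premium of several times over DDR4/DDR5 can translate into triple-digit revenue growth even before shipment volumes fully ramp.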
Currently, the three major manufacturers are expected to complete HBM3e verification in the first quarter of 2024. However, the results of each manufacturer’s HBM3e verification will determine the final allocation of NVIDIA’s procurement among HBM suppliers in 2024. As the verifications are still underway, the market share for HBM in 2024 remains to be seen.
(Photo credit: NVIDIA)
News
Micron Technology, the U.S. memory giant, has surpassed Wall Street expectations in its projected revenue for the current quarter (December-February). The main factor contributing to this success is the robust demand from data centers, offsetting the sluggish recovery in the PC and smartphone markets.
According to Micron’s fiscal first-quarter report (September to November 2023), released on December 20th, revenue rose from USD 4.01 billion in the same period last year to USD 4.73 billion.
Looking ahead to the current quarter (Q2), Micron anticipates revenue of USD 5.3 billion ± USD 200 million and a diluted loss per share of USD 0.28 ± USD 0.07, both better than market consensus.
Micron CEO Sanjay Mehrotra noted that strong execution and pricing strategies contributed to Q1 financial results surpassing expectations. He further stated, “Demand for AI servers has been strong as data center infrastructure operators shift budgets from traditional servers to more content-rich AI servers.”
Mehrotra indicated that Micron is in the final stages of qualifying HBM3e to be used in NVIDIA’s next generation Grace Hopper GH200 and H200 platforms.
Micron now expects PC unit sales to grow by a low to mid-single-digit percentage in calendar 2024, after two years of double-digit declines, and smartphone unit shipments to grow modestly in 2024.
For the HBM market, TrendForce’s latest research indicates that NVIDIA plans to diversify its HBM suppliers for more robust and efficient supply chain management. Samsung’s HBM3 (24GB) is anticipated to complete verification with NVIDIA by December this year.
TrendForce’s timeline of HBM3e progress shows that Micron provided its 8hi (24GB) samples to NVIDIA by the end of July, SK hynix in mid-August, and Samsung in early October.
Given the intricacy of the HBM verification process—estimated to take two quarters—TrendForce expects that some manufacturers might learn preliminary HBM3e results by the end of 2023. However, it’s generally anticipated that major manufacturers including Samsung, SK Hynix and Micron will have definite results by 1Q24. Notably, the outcomes will influence NVIDIA’s procurement decisions for 2024, as final evaluations are still underway.
(Photo credit: Micron)
News
According to a report from Expreview, the surge in demand for AI applications and the market’s need for more powerful solutions have led NVIDIA to plan to shorten its product release cycle from two years to one. As for its HBM partners, although validation of various samples is still in progress, market indications suggest SK hynix will secure the ultimate HBM3e supply contract.
In a recent investor presentation, NVIDIA revealed its product roadmap, showcasing the data center plans for 2024 to 2025. The release time for the next-generation Blackwell architecture GPU has been moved up from Q4 2024 to the end of Q2 2024, with plans for the subsequent “X100” after its release in 2025.
According to Business Korea, NVIDIA has already signed a prioritized supply agreement with SK Hynix for HBM3e, intended for the upcoming GPU B100.
While NVIDIA aims for a diversified supply chain, it has received HBM3e samples from Micron and Samsung for verification testing, with formal agreements expected after successful validation. However, industry insiders anticipate that SK hynix will likely land the initial HBM3e supply contract and take the largest share.
With this transaction, SK hynix’s revenue for the fourth quarter of fiscal 2023 is poised to surpass KRW 10 trillion for the first time in a year and three months.
Among the NVIDIA products scheduled for release next year, the newly added H200 and B100 will incorporate six and eight HBM3e stacks, respectively. As NVIDIA’s product line transitions to the next generation, HBM3e usage is expected to increase, boosting SK hynix’s profit potential.
SK hynix is also actively developing HBM4, targeting completion by 2025 to sustain its competitive edge.
TrendForce’s earlier research into the HBM market indicates that NVIDIA plans to diversify its HBM suppliers for more robust and efficient supply chain management. Its timeline of HBM3e progress shows that Micron provided 8hi (24GB) samples to NVIDIA by the end of July, SK hynix in mid-August, and Samsung in early October.
Given the intricacy of the HBM verification process—estimated to take two quarters—TrendForce expects that some manufacturers might learn preliminary HBM3e results by the end of 2023. However, it’s generally anticipated that major manufacturers will have definite results by 1Q24. Notably, the outcomes will influence NVIDIA’s procurement decisions for 2024, as final evaluations are still underway.
(Photo credit: SK Hynix)