News
South Korean memory giant SK Hynix has reportedly capitalized on the surging demand in the artificial intelligence (AI) and high-performance computing (HPC) markets. As per a report by ComputerBase, SK Hynix has swiftly emerged from the 2023 memory market slump on the strength of its HBM and DDR5 products and anticipates further growth. Consequently, a new phase of expansion is underway.
The report further indicates that SK Hynix plans to invest at least KRW 120 trillion (approximately USD 89.4 billion) to construct a new semiconductor production complex in Yongin, in the central part of Gyeonggi Province, South Korea. The complex will comprise four independent fabs; site preparations are currently underway, with roughly one-third already completed.
The report indicates that SK Hynix announced plans to build the world’s largest chip production facility as early as 2019. However, due to various reasons, the project was delayed until 2022 when agreements were reached with the central and local governments of South Korea, allowing the project to progress.
SK Hynix intends to officially commence its expansion project in March 2025, with the first fab scheduled for completion in 2027 and the entire complex expected to be completed by 2046. It is not yet clear whether the first fab will produce DRAM or NAND Flash memory. However, given the significant demand for HBM products in the AI market and SK Hynix’s tight production capacity, DRAM is the likely choice.
HBM, a type of DRAM primarily used in AI servers, is experiencing a surge in demand worldwide, led by NVIDIA. Moreover, according to a previous TrendForce press release, the three major original HBM manufacturers held market shares as follows in 2023: SK Hynix and Samsung were both around 46-49%, while Micron stood at roughly 4-6%.
Additionally, the four planned fabs are expected to occupy about half of the complex, with SK Hynix also constructing numerous supporting facilities on the site, such as wastewater treatment plants and resource recycling centers. Apart from SK Hynix, Samsung has also opted to build a similar semiconductor production complex nearby, including research and development centers, to meet anticipated market demand.
(Photo credit: SK Hynix)
News
Micron, the major memory manufacturer in the United States, has benefited from AI demand, returning to profitability last quarter (ended in February) and issuing optimistic financial forecasts.
During its earnings call on March 20th, Micron CEO Sanjay Mehrotra stated that the company’s HBM (High Bandwidth Memory) capacity for this year has been fully allocated, with most of next year’s capacity already booked. HBM products are expected to generate hundreds of millions of dollars in revenue for Micron in the current fiscal year.
Per a report from The Washington Post, Micron expects revenue for the current quarter (ending in May) to be between USD 6.4 billion and USD 6.8 billion, with a midpoint of USD 6.6 billion, surpassing Wall Street’s expectation of USD 6 billion.
Last quarter, Micron’s revenue surged 58% year-on-year to USD 5.82 billion, exceeding Wall Street’s expectation of USD 5.35 billion. The company posted a net profit of USD 790 million last quarter, a turnaround from a loss of USD 2.3 billion in the same period last year. Excluding one-time charges, Micron’s EPS reached USD 0.42 last quarter. Mehrotra reportedly attributed Micron’s return to profitability last quarter to the company’s efforts in pricing, product, and operational costs.
Over the past year, memory manufacturers have reduced production. Coupled with the explosive growth of the AI industry, which has driven a surge in demand for NVIDIA’s AI processors, this has benefited upstream memory manufacturers.
Mehrotra stated, “We believe Micron is one of the biggest beneficiaries in the semiconductor industry of the multiyear opportunity enabled by AI.”
The projected growth rates for DRAM and NAND Flash bit demand in 2024 are close to 15% and in the mid-teens, respectively. However, the supply growth rates for DRAM and NAND Flash bits in 2024 are both lower than the demand growth rates.
Micron utilizes 176- and 232-layer processes for over 90% of its NAND Flash production. As for HBM3e, it is expected to begin contributing to revenue in the second quarter.
Per a previous TrendForce press release, the three major original HBM manufacturers held market shares as follows in 2023: SK Hynix and Samsung were both around 46-49%, while Micron stood at roughly 4-6%.
In terms of capital expenditures, the company maintains a budget of USD 7.5 to 8 billion (taking U.S. government subsidies into account), primarily allocated to expanding HBM-related capacity.
Micron stated that due to HBM’s more complex packaging, it consumes roughly three times the wafer capacity of DDR5 for the same bit output, indirectly constraining the capacity available for non-HBM products and thereby improving the overall supply-demand balance in the DRAM market.
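To make that arithmetic concrete, below is a minimal back-of-the-envelope sketch in Python. Only the roughly 3:1 trade ratio comes from Micron’s statement; the wafer-start and allocation figures are purely hypothetical, chosen to illustrate how shifting wafers to HBM shrinks the bit output left for conventional DRAM.

```python
# Hypothetical illustration of the HBM/DDR5 capacity trade-off described above.
# Only the ~3:1 trade ratio reflects Micron's statement; every other number is
# an assumption made up for this example.
total_wafer_starts = 100_000   # assumed monthly DRAM wafer starts
hbm_wafer_share = 0.20         # assumed share of wafers reallocated to HBM
trade_ratio = 3                # HBM consumes ~3x the wafer capacity of DDR5 per bit

# Express total output in "DDR5-equivalent" wafers: each wafer moved to HBM
# yields roughly one third of the bits a DDR5 wafer would.
ddr5_equivalent = (total_wafer_starts * (1 - hbm_wafer_share)
                   + total_wafer_starts * hbm_wafer_share / trade_ratio)

print(f"Effective output: {ddr5_equivalent:,.0f} DDR5-equivalent wafers "
      f"({ddr5_equivalent / total_wafer_starts:.0%} of a DDR5-only mix)")
```

Under these assumed numbers, reallocating 20% of wafer starts to HBM leaves only about 87% of the bit output of a DDR5-only mix, which is the supply-tightening effect Micron describes.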
As per Micron’s report, regarding 2024 growth outlooks for various end markets: the data center industry’s annual growth rate has been revised upward from mid-single digits to mid-to-high single digits, while the PC industry’s annual growth rate remains at low to mid-single digits, with AI PCs expected to capture some market share in 2025. The mobile phone industry’s annual growth rate has been adjusted upward from modest growth to low to mid-single digits.
(Photo credit: Micron)
News
Amid the AI boom driving a surge in demand for advanced packaging, South Korean semiconductor giant Samsung Electronics is aggressively entering the advanced packaging arena. On March 20th, it announced its ambition to achieve record-high revenue in advanced packaging this year, aiming to surpass the USD 100 million mark.
According to reports from Reuters and The Korea Times, Samsung’s annual shareholders’ meeting took place on March 20th.
During the meeting, Han Jong-hee, the company’s vice chairman, stated: “Although the macroeconomic environment is expected to be uncertain this year, we see an opportunity for increased growth through next-generation technology innovation.”
“Samsung plans to apply AI to all devices, including smartphones, foldable devices, accessories and extended reality (XR), to provide customers with a new experience where generative AI and on-device AI unfold,” Han added.
Samsung established the Advanced Package Business Team under the Device Solutions business group in December last year. Samsung Co-CEO Kye-Hyun Kyung stated that he expects the results of Samsung’s investment to come out in earnest from the second half of this year.
Kyung further noted that for a future generation of HBM chips called HBM4, likely to be released in 2025 with more customised designs, Samsung will take advantage of having memory chips, chip contract manufacturing and chip design businesses under one roof to satisfy customer needs.
According to a previous report from TrendForce, Samsung led the pack with the highest revenue growth among the top manufacturers in Q4, as its revenue jumped 50% QoQ to USD 7.95 billion, largely due to a surge in 1alpha nm DDR5 shipments that boosted server DRAM shipments by over 60%. In the fourth quarter of last year, Samsung secured a 45.5% market share in DRAM chips.
(Photo credit: Samsung)
News
With numerous cloud computing companies and large-scale AI model developers investing heavily in AI computing infrastructure, the demand for AI processors is rapidly increasing. As per a report from IJIWEI, demand for HBM (High Bandwidth Memory), a key component of this infrastructure, has been on the rise as well.
The surge in demand for computing power has, in turn, created a wave of opportunities in memory. Yet looking across the entire HBM industry chain, only a limited number of China’s local companies are able to enter the field.
Despite significant technological challenges, the prospects are vast: whether from the perspective of self-sufficiency or market competition, it is imperative for Chinese manufacturers to accelerate the pace of catching up.
HBM Demand Grows Against the Trend, Dominated by Three Giants
The first TSV-based HBM product debuted in 2014, but it wasn’t until after the release of ChatGPT in late 2022 that robust demand for AI servers drove rapid iterations of HBM technology, in the order of HBM1, HBM2, HBM2e, HBM3, and HBM3e.
The fourth-generation HBM3 has been mass-produced and deployed, with significant improvements in bandwidth, stack height, capacity, I/O speed, and more compared to the first generation. Currently, only three memory giants (SK Hynix, Samsung Electronics, and Micron) are capable of mass-producing HBM.
According to a previous TrendForce press release, the three major original HBM manufacturers held market shares as follows in 2023: SK Hynix and Samsung were both around 46-49%, while Micron stood at roughly 4-6%.
In 2023, the primary applications in the market were HBM2, HBM2e, and HBM3, with the penetration rate of HBM3 increasing in the latter half of the year due to the push from NVIDIA’s H100 and AMD’s MI300.
According to TrendForce’s report, SK Hynix led the way with its HBM3e validation in the first quarter, closely followed by Micron, which plans to start distributing its HBM3e products toward the end of the first quarter, in alignment with NVIDIA’s planned H200 deployment by the end of the second quarter.
Samsung, slightly behind in sample submissions, is expected to complete its HBM3e validation by the end of the first quarter, with shipments rolling out in the second quarter.
Driven by market demand, major players such as SK Hynix, Samsung, and Micron Technology are increasing their efforts to expand production capacity. SK Hynix revealed in February that all its HBM products had been fully allocated for the year, prompting preparations for 2025 to maintain market leadership.
Reportedly, Samsung, aiming to compete in the 2024 HBM market, plans to increase the maximum production capacity to 150,000 to 170,000 units per month before the end of the fourth quarter of this year. Previously, Samsung also invested KRW 10.5 billion to acquire Samsung Display’s factory and equipment in Cheonan, South Korea, with the aim of expanding HBM production capacity.
Micron Technology CEO Sanjay Mehrotra recently revealed that Micron’s HBM production capacity for 2024 is expected to be fully allocated.
Although the three major HBM suppliers continue to focus on iterating HBM3e, there is still room for improvement in single-die DRAM density and the number of stacked layers. Meanwhile, the development of HBM4 has already been put on the agenda.
TrendForce previously predicted that HBM4 will mark the first use of a 12nm process wafer for its bottommost logic die (base die), to be supplied by foundries. This advancement signifies a collaborative effort between foundries and memory suppliers for each HBM product, reflecting the evolving landscape of high-speed memory technology.
Continuous Surge in HBM Demand and Prices, Local Supply Chains in China Catching Up
In the face of a vast market opportunity, aside from the three giants’ continuous efforts to ramp up research and production, some second- and third-tier Chinese DRAM manufacturers have also entered the HBM race. As the capabilities of locally produced AI processors improve, the demand for an independent HBM supply chain in China has become increasingly urgent.
Top global manufacturers operate DRAM processes at the 1alpha and 1beta nodes, while China’s DRAM processes are at the 25-17nm level. China’s DRAM processes are thus approaching those overseas, and local advanced packaging technology and GPU customer resources are available, indicating strong demand for HBM localization. In the future, local DRAM manufacturers in China are reportedly expected to break through into HBM.
It is worth noting that the research and manufacturing of HBM involve complex processes and technical challenges, including wafer-level packaging, testing technology, design compatibility, and more. CoWoS is currently the mainstream packaging solution for AI processors, and AI chips built on CoWoS also integrate HBM.
CoWoS and HBM involve processes such as TSV (Through-Silicon Via), bumps, microbumps, and RDL (Redistribution Layer). Among these, TSV accounts for the highest proportion of HBM’s 3D packaging cost, at close to 30%.
Currently, China has only a few leading packaging companies, such as JCET Group, Tongfu Microelectronics, and SJSemi, that possess the technology (such as TSV) and equipment required to support HBM production.
However, despite these efforts, the number of Chinese companies truly involved in the HBM industry chain remains limited, with most focusing on upstream materials.
With GPU procurement restricted, breakthroughs in China’s AI processors are urgently needed, both from a self-sufficiency perspective and in terms of market competition. Therefore, parallel breakthroughs in HBM by Chinese manufacturers are also crucial.
(Photo credit: SK Hynix)
News
There are market rumors suggesting that Samsung Electronics plans to switch to the chip manufacturing technology used by SK Hynix in an effort to catch up with competitors in the increasingly heated high-bandwidth memory (HBM) race.
As per Reuters’ report on March 13th, demand for HBM has surged due to the popularity of Generative AI. However, while SK Hynix and Micron Technology have successively finalized supply agreements with NVIDIA Corp., Samsung unexpectedly missed out. It is reported that Samsung’s HBM3 has yet to pass NVIDIA’s quality tests.
The Reuters report further cited sources indicating that one reason for Samsung’s lagging progress is its insistence on using Non-Conductive Film (NCF) technology, which has led to some production issues. In contrast, SK Hynix took the lead by switching to mass reflow molded underfill (MR-MUF) technology, addressing the weaknesses of NCF and becoming the first supplier of HBM3 chips to NVIDIA.
The report states that Samsung is in talks with several material suppliers, including Nagase Corporation from Japan, in hopes of purchasing MUF materials. It is revealed that Samsung intends to utilize both NCF and MUF technologies in its latest HBM chips.
Regarding the matter, both NVIDIA and Nagase declined to comment.
As for the current landscape of the HBM market, starting in 2024, the market’s attention will shift from HBM3 to HBM3e, with expectations for a gradual ramp-up in production through the second half of the year, positioning HBM3e as the new mainstream in the HBM market.
According to TrendForce’s latest report, SK hynix led the way with its HBM3e validation in the first quarter, closely followed by Micron, which plans to start distributing its HBM3e products toward the end of the first quarter, in alignment with NVIDIA’s planned H200 deployment by the end of the second quarter.
Samsung, slightly behind in sample submissions, is expected to complete its HBM3e validation by the end of the first quarter, with shipments rolling out in the second quarter. With Samsung having already made significant strides in HBM3 and its HBM3e validation expected to be completed soon, the company is poised to significantly narrow the market share gap with SK hynix by the end of the year, reshaping the competitive dynamics in the HBM market.
(Photo credit: SK Hynix)