News
Samsung’s HBM, according to a report from TechNews, has yet to pass certification by GPU giant NVIDIA, causing it to fall behind its competitor SK Hynix. As a result, the head of Samsung’s semiconductor division was replaced. Although Samsung denies any issues with its HBM and emphasizes close collaboration with partners, TechNews, citing market sources, indicates that Samsung has indeed suffered a setback.
Samsung invested early in HBM development and collaborated with NVIDIA on HBM and HBM2, but sales were modest. Eventually, according to TechNews’ report, the HBM team moved to SK Hynix to develop HBM products. Unexpectedly, the surge in generative AI led to a sharp increase in HBM demand, and SK Hynix, benefiting from the trend, seized the opportunity with the help of that team.
In response to the rumors about changes in the HBM team, however, SK Hynix has denied both the claim that it developed HBM with the help of a Samsung team and the claim that Samsung’s HBM team transferred to SK Hynix, emphasizing that its HBM was developed solely by its own engineers.
Samsung’s misfortune is evident; despite years of effort, the company faced setbacks just as the market took off, and it must now find alternative ways to catch up. Still, the market needs Samsung, as noted by Wallace C. Kou, President of memory IC design giant Silicon Motion.
Kou reportedly stated that Samsung remains the largest memory producer, and as NVIDIA faces a supply shortage for AI chips, the GPU giant is keen to cooperate with more suppliers. In his view, it is therefore only a matter of time before Samsung supplies HBM to NVIDIA.
Furthermore, Samsung indicated in a recent statement that it is conducting HBM tests with multiple partners to ensure quality and reliability.
In the statement, Samsung indicates that it is in the process of optimizing its products through close collaboration with its customers, with testing proceeding smoothly and as planned. As HBM is a customized memory product, it requires optimization processes in line with customers’ needs.
Samsung also states that it is currently partnering closely with various companies to continuously test technology and performance, and to thoroughly verify the quality and performance of its HBM.
On the other hand, NVIDIA has various GPUs adopting HBM3e, including the H200, B200, B100, and GB200. Although all of them require HBM3e stacking, their power consumption and heat dissipation requirements differ. Samsung’s HBM3e may be more suitable for the H200, the B200, and AMD’s Instinct MI350X.
(Photo credit: SK Hynix)
News
According to a report from Japanese media outlet The Daily Industrial News, Micron Technology plans to build a new plant in Hiroshima Prefecture, Japan, for the production of DRAM chips, aiming to begin operations as early as the end of 2027.
The report estimates the total investment at between JPY 600 billion and 800 billion (roughly USD 5.1 billion at the upper end). Construction of the new plant is scheduled to begin in early 2026, with the installation of extreme ultraviolet (EUV) lithography equipment.
The Japanese government has approved subsidies of up to JPY 192 billion (roughly USD 1.3 billion) to support Micron’s production of next-generation chips at its Hiroshima plant. The Ministry of Economy, Trade and Industry stated last year that the funding would help Micron incorporate ASML’s EUV equipment, as these chips are crucial for powering generative AI, data centers, and autonomous driving technology.
Micron initially planned to have the new plant operational by 2024, but this schedule has evidently been adjusted due to unfavorable market conditions. Micron, which acquired Japanese DRAM giant Elpida in 2013, employs over 4,000 engineers and technicians in Japan.
Beyond 2025, Japan is set to witness the emergence of several new plants, including Micron Technology’s new 1-gamma (1γ) DRAM production facility in Hiroshima Prefecture.
JSMC, a foundry subsidiary of Powerchip Semiconductor Manufacturing Corporation (PSMC), is collaborating with Japan’s financial group SBI to complete construction by 2027 and begin chip production thereafter.
Additionally, Japanese semiconductor startup Rapidus plans to commence production of 2-nanometer chips in Hokkaido by 2027.
Japan’s resurgence in the semiconductor arena is palpable, with the Ministry of Economy, Trade and Industry fostering multi-faceted collaboration with the private sector. A favorable exchange rate policy is also aiding factory construction and investment, brightening the outlook for exports.
However, the looming shortage of semiconductor talent in Japan is a concern. In response, there are generous subsidy programs for talent development.
(Photo credit: Micron)
News
According to a report by Nikkei News, SK Hynix is considering expanding its investments into Japan and the US to increase HBM production and meet customer demand.
Reportedly, the demand for high-bandwidth memory (HBM) is surging thanks to the AI boom. SK Group Chairman and CEO Chey Tae-won stated at the Future of Asia forum in Tokyo on May 23rd that if overseas investment becomes necessary, the company would consider manufacturing these products in Japan and the United States.
Chey Tae-won also mentioned that SK will further strengthen its partnerships with Japanese chip manufacturing equipment makers and materials suppliers, considering increased investments in Japan. He emphasized that collaboration with Japanese suppliers is crucial for advanced semiconductor manufacturing.
When selecting chip manufacturing sites, Chey highlighted the importance of accessing clean energy, as customers are demanding significant reductions in supply chain greenhouse gas emissions.
Additionally, Chey stated that SK intends to enhance R&D collaboration with Japanese partners for next-generation semiconductor products.
Kwon Jae-soon, a senior executive at SK Hynix, stated in a report published by the Financial Times on May 21 that the yield rate of the company’s HBM3e is approaching the 80% target, and that production time has been reduced by 50%.
Kwon emphasized that the company’s goal this year is to produce 8-layer stacked HBM3e, as this is what customers need the most. He noted that improving yield rates is becoming increasingly important to maintain a leading position in the AI era.
SK Hynix’s HBM capacity is almost fully booked through next year. The company plans to collaborate with TSMC to mass-produce more advanced HBM4 chips starting next year.
(Photo credit: SK Hynix)
News
Samsung’s latest high bandwidth memory (HBM) chips have reportedly failed Nvidia’s tests, and the reasons have now been revealed for the first time. According to the latest report by Reuters, the failure was said to be due to issues with heat and power consumption.
Citing sources familiar with the matter, Reuters noted that the problems may affect Samsung’s HBM3 chips as well as its next-generation HBM3e chips, which the company and its competitors, SK hynix and Micron, plan to launch later this year.
In response to the concerns raised over heat and power consumption in its HBM chips, Samsung stated that its HBM testing is proceeding as planned.
In an official statement, Samsung noted that it is in the process of optimizing products through close collaboration with customers, with testing proceeding smoothly and as planned. The company said that HBM is a customized memory product, which requires optimization processes in tandem with customers’ needs.
According to Samsung, the tech giant is currently partnering closely with various companies to continuously test technology and performance, and to thoroughly verify the quality and performance of HBM.
Nvidia, on the other hand, declined to comment.
As Nvidia currently dominates the global GPU market for AI applications with a roughly 80% share, meeting Nvidia’s standards is doubtlessly critical for HBM manufacturers.
Reuters reported that Samsung has been attempting to pass Nvidia’s tests for HBM3 and HBM3e since last year, and that a test of Samsung’s 8-layer and 12-layer HBM3e chips was said to have failed in April.
According to TrendForce’s analysis earlier, NVIDIA’s upcoming B100 or H200 models will incorporate advanced HBM3e, while the current HBM3 supply for NVIDIA’s H100 solution is primarily met by SK hynix. SK hynix has been providing HBM3 chips to Nvidia since 2022, Reuters noted.
According to a report from the Financial Times in May, SK hynix has successfully reduced the time needed for mass production of HBM3e chips by 50%, while coming close to achieving the target yield of 80%.
Another memory giant, US-based Micron, stated in February that its HBM3e consumes 30% less power than competitors’ products, meeting the demands of generative AI applications. Moreover, the company’s 24GB 8H HBM3e will be part of NVIDIA’s H200 Tensor Core GPUs, breaking the previous exclusivity of SK hynix as the sole supplier for the H100.
Considering its major competitors’ progress on HBM3e, if Samsung fails to meet Nvidia’s requirements, the industry and investors may grow more concerned about whether the Korean tech heavyweight will fall further behind its rivals in the HBM market.
(Photo credit: Samsung)
News
SK hynix has disclosed yield details regarding the company’s 5th-generation High Bandwidth Memory (HBM), HBM3e, for the first time. According to a report from the Financial Times, citing Kwon Jae-soon, the head of yield at SK hynix, the memory giant has successfully reduced the time needed for mass production of HBM3e chips by 50%, while coming close to achieving the target yield of 80%.
This is better than the industry’s previous speculation, which estimated the yield of SK Hynix’s HBM3e to be between 60% and 70%, according to a report by Business Korea.
According to TrendForce’s analysis earlier, NVIDIA’s upcoming B100 or H200 models will incorporate advanced HBM3e, while the current HBM3 supply for NVIDIA’s H100 solution is primarily met by SK hynix, leading to a supply shortfall in meeting burgeoning AI market demands.
The challenge, however, is the supply bottleneck caused by both CoWoS packaging constraints and the inherently long production cycle of HBM—extending the timeline from wafer initiation to the final product beyond two quarters.
The report by Business Korea noted that HBM manufacturing involves stacking multiple DRAM dies vertically, which presents greater process complexity compared to standard DRAM. Specifically, the yield of the through-silicon via (TSV) process, a critical step for HBM3e, has been low, ranging from 40% to 60%, posing a significant challenge for improvement.
In terms of SK hynix’s future roadmap for HBM, CEO Kwak Noh-Jung announced on May 2nd that the company’s HBM capacity for 2024 and 2025 has almost been fully sold out. According to Business Korea, SK hynix commenced delivery of 8-layer HBM3e products in March and plans to supply 12-layer HBM3e products in the third quarter of this year. The 12-layer HBM4 (sixth-generation) is scheduled for next year, with the 16-layer version expected to enter production by 2026.
(Photo credit: SK hynix)