News
There are market rumors suggesting that Samsung Electronics plans to switch to the chip manufacturing technology used by SK Hynix in an effort to catch up with its competitors in the increasingly heated high-bandwidth memory (HBM) race.
As per Reuters’ report on March 13th, demand for HBM has surged due to the popularity of Generative AI. However, while SK Hynix and Micron Technology have successively finalized supply agreements with NVIDIA Corp., Samsung unexpectedly missed out. It is reported that Samsung’s HBM3 has yet to pass NVIDIA’s quality tests.
The Reuters report further cited sources indicating that one of the reasons for Samsung’s lagging progress is its insistence on using Non-Conductive Film (NCF) technology, which has led to some production issues. In contrast, SK Hynix took the lead by switching to mass reflow molded underfill (MR-MUF) technology, which addresses the weaknesses of NCF, and became the first supplier of HBM3 chips to NVIDIA.
The report states that Samsung is in talks with several material suppliers, including Nagase Corporation from Japan, in hopes of purchasing MUF materials. It is revealed that Samsung intends to utilize both NCF and MUF technologies in its latest HBM chips.
Regarding the matter, both NVIDIA and Nagase declined to comment.
As for the current landscape of the HBM market, attention is shifting from HBM3 to HBM3e in 2024, with production expected to ramp up gradually through the second half of the year, positioning HBM3e as the new mainstream.
According to TrendForce’s latest report, SK hynix led the way with its HBM3e validation in the first quarter, closely followed by Micron, which plans to start distributing its HBM3e products toward the end of the first quarter, in alignment with NVIDIA’s planned H200 deployment by the end of the second quarter.
Samsung, slightly behind in sample submissions, is expected to complete its HBM3e validation by the end of the first quarter, with shipments rolling out in the second quarter. With Samsung having already made significant strides in HBM3 and its HBM3e validation expected to be completed soon, the company is poised to significantly narrow the market share gap with SK hynix by the end of the year, reshaping the competitive dynamics in the HBM market.
Read more
(Photo credit: SK Hynix)
News
Amidst the AI frenzy, HBM has become a focal point for major semiconductor manufacturers, and another storage giant is making new moves.
According to Korean media “THE ELEC”, Samsung Electronics plans to establish an HBM development office to enhance its competitiveness in the HBM market. The size of the team has not been determined yet, but Samsung’s HBM task force is expected to undergo upgrades.
The report indicates that if the task force is upgraded to a development office, Samsung will then establish specialized design and solution teams for HBM development. The head of the development office will be appointed from among vice president-level personnel.
In terms of research and development progress, the three major manufacturers have all advanced to the stage of HBM3e.
As for Samsung, the company released its first 36GB HBM3e 12H DRAM in February, which is currently its largest-capacity HBM product. Samsung has begun providing samples of the HBM3e 12H to customers, with mass production expected to commence in the latter half of this year.
On the other hand, Micron Technology has announced the commencement of mass production of its high-bandwidth memory “HBM3e,” which will be utilized in NVIDIA’s latest AI chip, the “H200” Tensor Core graphics processing unit (GPU). The H200 is scheduled for shipment in the second quarter of 2024.
Another major player, SK Hynix, as per Business Korea, plans to begin mass production of HBM3e in the first half of 2024.
In terms of production capacity, both SK Hynix and Micron Technology have previously disclosed that HBM production capacity is fully allocated. This indicates a strong market demand for HBM, reaffirming manufacturers’ determination to expand production.
As per a previous report by Bloomberg, SK Hynix plans to invest an additional USD 1 billion in advanced packaging for HBM. The company stated in its recent financial report that it intends to increase capital expenditures in 2024 and shift its production focus to high-end memory products such as HBM.
HBM production capacity is expected to more than double compared to last year. On the demand side, over 60% of future HBM demand is anticipated to come from its primary application, AI servers.
Read more
(Photo credit: Samsung)
News
South Korean memory giant SK Hynix is significantly investing in advanced chip packaging, aiming to capture more demand for High Bandwidth Memory (HBM), a vital component driving the burgeoning AI market.
According to Bloomberg’s report, Lee Kang-Wook, currently leading SK Hynix’s packaging research and development, stated that the company is investing over USD 1 billion in South Korea to expand and enhance the final steps of its chip manufacturing process.
“The first 50 years of the semiconductor industry has been about the front-end, or the design and fabrication of the chips themselves,” Lee Kang-Wook expressed in an interview with Bloomberg. “But the next 50 years is going to be all about the back-end, or packaging.”
The same report further indicates that the packaging upgrade will help reduce power consumption, enhance performance, and maintain SK Hynix’s leadership position in the HBM market.
Recent market trends also highlight the crucial role of advanced packaging in the manufacturing of HBM products. According to a recent report by South Korean media DealSite, the complex architecture of HBM has resulted in difficulties for manufacturers like Micron and SK Hynix to meet NVIDIA’s testing standards.
The yield of HBM is closely tied to the complexity of its stacking architecture, which involves multiple memory layers and Through-Silicon Via (TSV) technology for inter-layer connections. These intricate techniques increase the probability of process defects, leading to lower yields than simpler memory designs. Moreover, in HBM’s multi-layer stacking, if any single die in the stack is defective, the entire stack must be discarded.
HBM, a type of DRAM primarily used in AI servers, is experiencing a surge in demand worldwide, led by NVIDIA. Moreover, according to a previous TrendForce press release, the three major original HBM manufacturers held market shares as follows in 2023: SK Hynix and Samsung were both around 46-49%, while Micron stood at roughly 4-6%.
Read more
(Photo credit: SK Hynix)
News
The surge in demand for NVIDIA’s AI processors has made High Bandwidth Memory (HBM) a key product that memory giants are eager to develop. However, according to South Korean media DealSite cited by Wccftech on March 4th, the complex architecture of HBM has resulted in low yields, making it difficult to meet NVIDIA’s testing standards and raising concerns about limited production capacity.
The report further pointed out that HBM manufacturers like Micron and SK Hynix are grappling with low yields and are engaged in fierce competition to pass NVIDIA’s quality tests for its next-generation AI GPU.
The yield of HBM is closely tied to the complexity of its stacking architecture, which involves multiple memory layers and Through-Silicon Via (TSV) technology for inter-layer connections. These intricate techniques increase the probability of process defects, potentially leading to lower yields compared to simpler memory designs.
Furthermore, if any of the dies in an HBM stack is defective, the entire stack is discarded, resulting in inherently low production yields. The source cited by Wccftech indicated that the overall yield of HBM currently stands at around 65%, and that attempts to improve yield may come at the cost of lower production volume.
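To make the compounding effect concrete, here is a minimal back-of-the-envelope sketch. The per-layer success rate and stack heights below are illustrative assumptions, not figures disclosed by any manufacturer; the point is simply to show how an overall yield near 65% can arise when every die and every TSV bond in a tall stack must be good for the package to survive.

```python
# Illustrative yield model for stacked HBM (assumed numbers, for intuition only).
# If one defective layer scraps the whole stack, yields multiply layer by layer.

def stack_yield(per_layer_yield: float, layers: int) -> float:
    """Overall stack yield when a single bad layer discards the entire stack."""
    return per_layer_yield ** layers

# Assumption: each layer (die plus its TSV bonding step) succeeds ~95% of the time.
per_layer = 0.95
for layers in (4, 8, 12):
    print(f"{layers}-high stack: {stack_yield(per_layer, layers):.0%}")
# Prints roughly 81%, 66%, and 54% -- taller stacks amplify even small
# per-layer defect rates, which is why 8- and 12-high HBM yields far less
# than conventional, non-stacked DRAM built on a similar process.
```

Under these assumed numbers, an 8-high stack lands in the mid-60s, broadly consistent with the roughly 65% figure cited above, while moving to 12-high stacks for HBM3e makes the yield challenge materially harder.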
Micron announced on February 26th the commencement of mass production of High Bandwidth Memory “HBM3e,” to be used in NVIDIA’s latest AI chip “H200” Tensor Core GPU. The H200 is scheduled for shipment in the second quarter of 2024, replacing the current most powerful H100.
On the other hand, Kim Ki-tae, Vice President of SK Hynix, stated on February 21st in an official blog post that while external uncertainties persist, the memory market is expected to gradually heat up this year. Reasons include the recovery in product demand from global tech giants. Additionally, the application of AI in devices such as PCs or smartphones is expected to increase demand not only for HBM3e but also for products like DDR5 and LPDDR5T.
Kim Ki-tae pointed out that all of the company’s HBM for this year has already been sold out. Although 2024 has only just begun, the company has already started preparations for 2025 to maintain its market-leading position.
Per a previous TrendForce press release, the three major original HBM manufacturers held market shares as follows in 2023: SK Hynix and Samsung were both around 46-49%, while Micron stood at roughly 4-6%.
Read more
(Photo credit: SK Hynix)
News
South Korean memory giant SK Hynix is reportedly exploring a collaboration with Japanese NAND flash memory manufacturer Kioxia to produce High Bandwidth Memory (HBM) for AI applications, as per MoneyDJ citing Jiji Press.
According to Jiji Press’ report on March 1st, it is estimated that production will take place at the Japanese plant jointly operated by Kioxia and Western Digital (WD). Kioxia, on the other hand, will evaluate the proposed collaboration based on semiconductor market conditions and its relationship with WD.
The report highlights that HBM, a type of DRAM primarily used in AI servers, is experiencing a surge in demand worldwide, led by NVIDIA. Moreover, according to a previous TrendForce press release, the three major original HBM manufacturers held market shares as follows in 2023: SK Hynix and Samsung were both around 46-49%, while Micron stood at roughly 4-6%.
For SK Hynix, leveraging Kioxia’s existing plants in Kitakami, Iwate Prefecture, and Yokkaichi, Mie Prefecture, Japan, to produce HBM would enable the rapid establishment of an expanded production system.
Meanwhile, the jointly operated Japanese plants of Kioxia and WD currently produce only NAND Flash. If they were to produce the most advanced DRAM in the future, it would also contribute to Japan’s semiconductor industry revitalization plan.
The report further notes that SK Hynix holds an indirect stake of approximately 15% in Kioxia through Bain Capital, a U.S.-based investment firm. Bain Capital is reportedly negotiating with SK Hynix behind the scenes, seeking to revive the Kioxia/WD merger. However, as per sources cited in Jiji Press’ report, “this collaboration and the merger are two separate discussion matters.”
According to a previous report from Asahi News on February 23, Kioxia and WD are expected to restart merger negotiations at the end of April. Although the merger negotiations between the two parties hit a roadblock last autumn, both are facing pressure to expand their scale for survival. However, whether the two parties can ultimately reach a merger agreement remains uncertain.
As per TrendForce’s data for 3Q23, Samsung maintained its position as the top global NAND flash memory manufacturer, commanding a significant market share of 31.4%. Following closely, SK Group secured the second position with a 20.2% market share. Western Digital occupied the third position with a market share of 16.9%, while Japan’s Kioxia held a 14.5% market share.
Asahi News further indicates that if Kioxia and WD, both of which manufacture NAND Flash products, were to merge, their combined scale would rival that of the global market leader, Samsung Electronics.
The Japanese government reportedly views the Kioxia/WD merger as a “symbol” of Japan-US semiconductor cooperation and has provided support. However, the merger negotiations hit an impasse last fall, reportedly due to opposition from SK Hynix, which holds an indirect stake in Kioxia.
Read more
(Photo credit: Kioxia)