News
Amidst the AI frenzy, HBM has become a focal point for major semiconductor manufacturers, and another memory giant is making new moves.
According to Korean media “THE ELEC”, Samsung Electronics plans to establish an HBM development office to enhance its competitiveness in the HBM market. The size of the team has not yet been determined, but Samsung’s existing HBM task force is expected to be upgraded.
The report indicates that if the task force is upgraded to a development office, Samsung will then establish specialized design and solution teams for HBM development. The head of the development office will be appointed from among vice president-level personnel.
In terms of research and development progress, all three major manufacturers have advanced to HBM3e.
As for Samsung, the company released its first 36GB HBM3e 12H DRAM in February, currently its highest-capacity HBM product. Samsung has begun providing HBM3e 12H samples to customers, with mass production expected to commence in the second half of this year.
On the other hand, Micron Technology has announced the start of mass production of its high-bandwidth memory “HBM3e,” which will be used in NVIDIA’s latest AI chip, the “H200” Tensor Core graphics processing unit (GPU). The H200 is scheduled for shipment in the second quarter of 2024.
Another major player, SK Hynix, as per Business Korea, plans to begin mass production of HBM3e in the first half of 2024.
In terms of production capacity, both SK Hynix and Micron Technology have previously disclosed that their HBM capacity is fully allocated. This reflects strong market demand for HBM and reinforces manufacturers’ determination to expand production.
As per a previous report by Bloomberg, SK Hynix plans to invest an additional USD 1 billion in advanced packaging for HBM. The company stated in its recent financial report that it intends to increase capital expenditures in 2024 and shift its production focus to high-end memory products such as HBM.
The capacity for HBM is expected to more than double compared to last year. From a demand perspective, it is anticipated that over 60% of future demand for HBM will stem from the primary application of AI servers.
(Photo credit: Samsung)
South Korean memory giant SK Hynix is significantly investing in advanced chip packaging, aiming to capture more demand for High Bandwidth Memory (HBM), a vital component driving the burgeoning AI market.
According to Bloomberg’s report, Lee Kang-Wook, currently leading SK Hynix’s packaging research and development, stated that the company is investing over USD 1 billion in South Korea to expand and enhance the final steps of its chip manufacturing process.
“The first 50 years of the semiconductor industry has been about the front-end, or the design and fabrication of the chips themselves,” Lee Kang-Wook expressed in an interview with Bloomberg. “But the next 50 years is going to be all about the back-end, or packaging.”
The same report further indicates that the packaging upgrade will help reduce power consumption, enhance performance, and maintain SK Hynix’s leadership position in the HBM market.
Recent market trends also highlight the crucial role of advanced packaging in the manufacturing of HBM products. According to a recent report by South Korean media DealSite, the complex architecture of HBM has made it difficult for manufacturers like Micron and SK Hynix to meet NVIDIA’s testing standards.
The reason lies in the lower yield of HBM chips compared to traditional memory chips. HBM’s stacking architecture involves multiple memory layers interconnected by Through-Silicon Via (TSV) technology, and these intricate techniques increase the probability of process defects. Moreover, if any die in the multi-layer stack is defective, the entire stack must be discarded, leading to lower yields than simpler memory designs.
HBM, a type of DRAM primarily used in AI servers, is experiencing a surge in demand worldwide, led by NVIDIA. Moreover, according to a previous TrendForce press release, the three major original HBM manufacturers held market shares as follows in 2023: SK Hynix and Samsung were both around 46-49%, while Micron stood at roughly 4-6%.
(Photo credit: SK Hynix)
The surge in demand for NVIDIA’s AI processors has made High Bandwidth Memory (HBM) a key product that memory giants are eager to develop. However, according to South Korean media DealSite cited by Wccftech on March 4th, the complex architecture of HBM has resulted in low yields, making it difficult to meet NVIDIA’s testing standards and raising concerns about limited production capacity.
The report further pointed out that HBM manufacturers like Micron and SK Hynix are grappling with low yields. They are engaged in fierce competition to pass NVIDIA’s quality tests for the next-generation AI GPU.
The yield of HBM is closely tied to the complexity of its stacking architecture, which involves multiple memory layers and Through-Silicon Via (TSV) technology for inter-layer connections. These intricate techniques increase the probability of process defects, potentially leading to lower yields compared to simpler memory designs.
Furthermore, if any of the HBM chips in a stack are defective, the entire stack is discarded, resulting in inherently low production yields. The source cited by Wccftech indicated that the overall yield of HBM currently stands at around 65%, and that attempts to improve yield may reduce production volume.
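To illustrate why the discard-the-whole-stack rule depresses yield, here is a minimal sketch. It assumes, purely for illustration, that die defects are independent; the per-die yield and the 8-high stack are hypothetical figures chosen so the result lands near the roughly 65% overall yield mentioned in the report.

```python
def stack_yield(per_die_yield: float, layers: int) -> float:
    """Overall yield of an HBM stack when a single defective die
    forces the whole stack to be discarded. Assuming independent
    defects, the probability that every die is good is the product
    of the per-die yields."""
    return per_die_yield ** layers

# A hypothetical 8-high stack: even a 94.7% per-die yield
# compounds down to roughly the ~65% overall figure.
print(round(stack_yield(0.947, 8), 2))  # prints 0.65
```

The exponent is what makes HBM unforgiving: each extra layer multiplies in another chance of failure, so small per-die improvements compound into large gains in overall stack yield.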
Micron announced on February 26th the start of mass production of its High Bandwidth Memory “HBM3e,” to be used in NVIDIA’s latest AI chip, the “H200” Tensor Core GPU. The H200 is scheduled for shipment in the second quarter of 2024, succeeding the current flagship H100.
On the other hand, Kim Ki-tae, Vice President of SK Hynix, stated on February 21st in an official blog post that while external uncertainties persist, the memory market is expected to gradually heat up this year, owing to the recovery in product demand from global tech giants. Additionally, the adoption of AI in devices such as PCs and smartphones is expected to increase demand not only for HBM3e but also for products like DDR5 and LPDDR5T.
Kim Ki-tae pointed out that all of their HBM inventory has been sold out this year. Although it’s just the beginning of 2024, the company has already begun preparations for 2025 to maintain its market-leading position.
Per a previous TrendForce press release, the three major original HBM manufacturers held market shares as follows in 2023: SK Hynix and Samsung were both around 46-49%, while Micron stood at roughly 4-6%.
(Photo credit: SK Hynix)
South Korean memory giant SK Hynix is reportedly exploring a collaboration with Japanese NAND flash memory manufacturer Kioxia to produce High Bandwidth Memory (HBM) for AI applications, as per MoneyDJ citing Jiji Press.
According to Jiji Press’ report on March 1st, production is expected to take place at the Japanese plants jointly operated by Kioxia and Western Digital (WD). Kioxia, for its part, will evaluate the proposed collaboration based on semiconductor market conditions and its relationship with WD.
The report highlights that HBM, a type of DRAM primarily used in AI servers, is experiencing a surge in demand worldwide, led by NVIDIA. Moreover, according to a previous TrendForce press release, the three major original HBM manufacturers held market shares as follows in 2023: SK Hynix and Samsung were both around 46-49%, while Micron stood at roughly 4-6%.
For SK Hynix, leveraging Kioxia’s existing plants in Kitakami, Iwate Prefecture, and Yokkaichi, Mie Prefecture, Japan, to produce HBM would enable the rapid establishment of an expanded production system.
Meanwhile, the jointly operated Japanese plants of Kioxia and WD currently produce only NAND Flash. If they were to produce the most advanced DRAM in the future, it would also contribute to Japan’s semiconductor industry revitalization plan.
The report further notes that SK Hynix holds an indirect stake of approximately 15% in Kioxia through Bain Capital, a U.S.-based investment firm. Bain Capital is reportedly negotiating with SK Hynix behind the scenes, seeking to revive the Kioxia/WD merger. However, as per sources cited in Jiji Press’ report, “this collaboration and the merger are two separate discussion matters.”
According to a previous report from Asahi News on February 23, Kioxia and WD are expected to restart merger negotiations at the end of April. Although the merger negotiations between the two parties hit a roadblock last autumn, both are facing pressure to expand their scale for survival. However, whether the two parties can ultimately reach a merger agreement remains uncertain.
As per TrendForce’s data for 3Q23, Samsung maintained its position as the top global NAND flash memory manufacturer, commanding a significant market share of 31.4%. Following closely, SK Group secured the second position with a 20.2% market share. Western Digital occupied the third position with a market share of 16.9%, while Japan’s Kioxia held a 14.5% market share.
Asahi News further indicates that if Kioxia and WD, both of which manufacture NAND Flash products, were to merge, their combined scale would rival that of the global market leader, Samsung Electronics.
The Japanese government reportedly views the Kioxia/WD merger as a “symbol” of Japan-US semiconductor cooperation and has provided support. However, the merger negotiations hit an impasse last fall, reportedly due to opposition from SK Hynix, an indirect investor in Kioxia.
(Photo credit: Kioxia)
South Korean memory giant SK Hynix has confirmed record-breaking sales of High Bandwidth Memory (HBM) over the past few months, driving profitability in the fourth quarter and predicting an industry-wide recovery.
According to Wccftech, SK Hynix Vice President Kim Ki-tae stated on February 21st that the demand for HBM, as an AI memory solution, is experiencing explosive growth as generative AI services become increasingly diverse and continue to evolve.
The report cited Kim Ki-tae, who stated, “HBM, with its high-performance and high-capacity characteristics, is a monumental product that shakes the conventional wisdom that memory semiconductors are only a part of the overall system.”
Kim Ki-tae also mentioned that despite ongoing external uncertainties, the memory market is expected to gradually warm up in 2024. This is attributed to the recovery in product demand from global tech giants.
Moreover, the adoption of AI in devices such as PCs and smartphones is expected to increase memory demand. This surge is anticipated to boost sales of HBM3e and potentially drive up demand for products like DDR5 and LPDDR5T.
Kim Ki-tae emphasized that their HBM products have already sold out for this year. Although it’s just the beginning of 2024, the company has already begun gearing up for 2025.
SK Hynix Plans to Establish Advanced Packaging Plant in the US
SK Hynix is reportedly set to establish an advanced packaging plant in Indiana, as the US government aims to reduce dependence on advanced chips from Taiwan.
As per the Financial Times on February 1st, citing unnamed sources, SK Hynix’s rumored new packaging facility in Indiana may specialize in 3D stacking processes to produce HBM, which will also be integrated into NVIDIA’s GPUs.
Currently, SK Hynix produces HBM in South Korea and then ships it to Taiwan for integration into NVIDIA GPUs by TSMC.
(Photo credit: SK Hynix)