HBM3e


2024-08-15

[News] Samsung Likely Emerges as the Pacemaker for the AI Market if It Secures HBM3e Supply to NVIDIA

Samsung Electronics, which has been struggling at the final stage of its HBM3e qualification with NVIDIA, may unexpectedly emerge as the pacemaker for the AI ecosystem: by helping balance the market, the company could ease the cost pressure of building AI servers while alleviating the tight HBM supply, according to a recent report by Korean media outlet Invest Chosun.

Samsung, in its second-quarter earnings call, confirmed that the company’s fifth-generation 8-layer HBM3e is undergoing customer evaluation. The product is reportedly set to enter mass production as early as the third quarter.

Invest Chosun analyzes that while anticipation is growing that NVIDIA could soon reach a conclusion on Samsung’s HBM3e verification, the market’s attitude toward AI has also been gradually shifting, with the main concern now being that semiconductors are becoming too expensive.

The report, citing remarks from a consultant, notes that a single NVIDIA chip may cost tens of thousands of dollars, leading to concerns that the industry’s overall capex cycle might not last more than three years.

In addition, the report highlights that the cost of building an AI training server is about 40 times that of a standard server, with over 80% of that cost attributed to NVIDIA’s AI accelerators. Under this cost pressure, big tech companies have been closely examining the cost structure of building AI servers.

NVIDIA therefore has to take its customers’ budgets into consideration when planning its roadmap. This has sparked speculation that NVIDIA, under pressure to lower product prices, might compromise and bring Samsung on board as an HBM3e supplier, the report states.

Citing an industry insider, the report highlights the dilemma facing NVIDIA and its HBM suppliers. As the AI giant shortens its product cycle, releasing the Blackwell (B100) series just two years after the Hopper (H100), HBM suppliers other than SK hynix, the most experienced player in the field, have been struggling to keep pace.

If Samsung doesn’t join the HBM lineup, the overall supply of NVIDIA’s AI accelerators could be limited, driving prices even higher, the report suggests.

Against this backdrop, Samsung may take on the role of pacemaker in the AI semiconductor market, helping to balance the market at a time when there are concerns about overheating in the AI industry. Moreover, if it can form a strong collaboration with NVIDIA by supplying 8-layer HBM3e, its technological gap with competitors will noticeably narrow.

TrendForce notes that Samsung’s recent progress on HBM3e qualification appears solid, and both the 8-hi and 12-hi products can be expected to be qualified in the near future. The company is eager to win HBM market share from SK hynix, and its 1alpha capacity has been reserved for HBM3e. TrendForce believes Samsung will become a very important supplier in the HBM category.


(Photo credit: Samsung)

Please note that this article cites information from Invest Chosun.
2024-08-12

[News] Korea’s Memory Exports to Taiwan Surge 225% in 1H 2024 Driven by Strong HBM Demand

According to a report from Korean media outlet Yonhap News Agency, South Korea’s memory exports to Taiwan surged by 225% in the first half of the year.

The increase is reportedly driven primarily by South Korean chipmaker SK hynix’s supply of HBM to U.S. AI chip giant NVIDIA, whose AI accelerators are packaged at Taiwan’s TSMC.

Kim Yang-paeng, a researcher at the Korea Institute for Industrial Economics & Trade, also noted that the sharp increase in exports is likely related to SK hynix’s supplies for TSMC’s final packaging of AI accelerators.

The report from Economic Daily News further highlights the strong momentum in NVIDIA’s AI chip shipments, with TSMC, as the key manufacturing partner, receiving steady advanced process orders.

The report from Yonhap News Agency also cited data from the industry ministry and the Korea International Trade Association released on August 11th, showing that South Korea’s memory exports to Taiwan in the first half of the year grew by 225.7% year-on-year, reaching USD 4.26 billion.

This growth significantly outpaces the overall increase in South Korea’s memory exports, which was 88.7%. Additionally, Taiwan has become South Korea’s third-largest market for memory exports in the first half of the year, climbing two spots to surpass Vietnam and the United States.

Another Korean media outlet, The Korea Herald, noted that since the 2010s, South Korea’s annual memory exports to Taiwan have ranged between USD 1 billion and 4 billion. The latest data indicates that this year’s export volume may set a new record, potentially reaching USD 8 billion.


(Photo credit: SK hynix)

Please note that this article cites information from Yonhap News Agency, Economic Daily News and The Korea Herald.
2024-08-08

[News] SK hynix CEO: Demand for Memory Chips to Remain Robust until 1H25, Driven by HBM

With demand for memory chips used in AI remaining strong, prompting major memory companies to accelerate their HBM3e and HBM4 qualification, SK hynix CEO Kwak Noh-jung stated on August 7 that the market is expected to stay robust until the first half of 2025, driven by high demand for products such as high-bandwidth memory (HBM), according to a report by the Korea Economic Daily.

However, Kwak noted that the momentum beyond 2H25 “remains to be seen,” indicating that the company needs to study market conditions and the supply-demand situation before commenting further. SK hynix clarified that this was not an indication of a possible downturn.

According to the analysis by TrendForce, HBM’s share of total DRAM bit capacity is estimated to rise from 2% in 2023 to 5% in 2024 and surpass 10% by 2025. In terms of market value, HBM is projected to account for more than 20% of the total DRAM market value starting in 2024, potentially exceeding 30% by 2025.

SK hynix, the current HBM market leader, said in its earnings call in July that its HBM3e shipments are expected to surpass those of HBM3 in the third quarter, with HBM3e accounting for more than half of total HBM shipments in 2024. In addition, it expects to begin supplying 12-layer HBM3e products to customers in the fourth quarter.

The report notes that the company’s major focus for now will be on its sixth-generation HBM, HBM4, which is under development in collaboration with foundry giant TSMC. The 12-layer HBM4 is expected to launch in the second half of next year, according to the report.

Samsung, on the other hand, has been working since last year to become a supplier of NVIDIA’s HBM3 and HBM3e. In late July, it was reported that Samsung’s HBM3 had passed NVIDIA’s qualification and would be used in the AI giant’s H20, which was developed for the Chinese market in compliance with U.S. export controls. On August 6, the company denied rumors that its 8-layer HBM3e chips had cleared NVIDIA’s tests.

Notably, per a previous report from the South Korean newspaper Korea Joongang Daily, Micron, which began mass production of HBM3e in February 2024, has recently secured an order from NVIDIA for the H200 AI GPU.


(Photo credit: SK hynix)

Please note that this article cites information from the Korea Economic Daily and Korea Joongang Daily.
2024-08-07

[News] Market Rumors Suggest Samsung’s HBM3e Passed NVIDIA Test, Though Samsung Denies

According to a report from Reuters citing industry sources, Samsung Electronics’ fifth-generation high-bandwidth memory (HBM3e) has passed tests by NVIDIA and could be used in NVIDIA’s AI processors.

The report further indicates that while no supply contract has been signed yet, one is expected soon, with potential deliveries starting in the fourth quarter of this year. It also notes that the tested HBM3e chips are the 8-layer version, while Samsung’s 12-layer HBM3e has yet to pass NVIDIA’s tests.

However, in response to the matter, Samsung Electronics stated in a BusinessKorea report on August 7 that it could not confirm stories related to its customers and that the report was not true.

The Samsung Electronics official cited by BusinessKorea also mentioned that, as previously stated during a conference call last month, the quality testing is still ongoing and there have been no updates since then.

Samsung has been working since last year to become a supplier of NVIDIA’s HBM3 and HBM3e. In late July, it was reported that Samsung’s HBM3 had passed NVIDIA’s qualification and would be used in the AI giant’s H20, which was developed for the Chinese market in compliance with U.S. export controls.


(Photo credit: Samsung)

Please note that this article cites information from Reuters and BusinessKorea.
2024-08-07

[News] SK hynix Secures up to USD 450 Million Funding for Indiana Packaging Facility under CHIPS Act

SK hynix, the current High Bandwidth Memory (HBM) market leader, announced on August 6th that it has signed a non-binding preliminary memorandum of terms with the U.S. Department of Commerce to receive up to USD 450 million in proposed direct funding and access to proposed loans of USD 500 million as part of the CHIPS and Science Act. The funding, according to its press release, will be used to build a production base for semiconductor packaging in Indiana.

Earlier in April, the other two memory giants, Samsung and Micron, also secured funding under the CHIPS and Science Act, receiving USD 6.4 billion and USD 6.1 billion, respectively.

SK hynix also noted in its press release that it plans to seek a tax benefit from the U.S. Department of the Treasury equivalent to up to 25% of qualified capital expenditures through the Investment Tax Credit program.

The South Korean memory chip maker also said that it will proceed with the construction of the Indiana production base as planned to provide AI memory products, and that it looks forward to contributing to a more resilient global semiconductor supply chain.

The signing follows SK hynix’s announcement in April that it plans to invest USD 3.87 billion to build a production base for advanced packaging in Indiana, a move expected to create around 1,000 jobs. According to a previous report by The Wall Street Journal, the advanced packaging fab is expected to commence operations by 2028.

As the major HBM supplier of AI giant NVIDIA, SK hynix has good reason to accelerate the pace of capacity expansion. The recent NVIDIA Blackwell B200, with each GPU utilizing 8 HBM3e chips, has also underscored SK hynix’s role in the critical components supply chain for the AI industry.

On the other hand, a week earlier, semiconductor equipment leader Applied Materials was reportedly rejected for CHIPS Act funding for an R&D center in Silicon Valley aimed at developing next-generation chipmaking tools. The company had sought U.S. funding for the USD 4 billion facility in Sunnyvale, California, which was slated to be completed in 2026.


(Photo credit: SK hynix)

Please note that this article cites information from SK hynix and The Wall Street Journal.
