HBM


2024-08-12

[News] Korea’s Memory Exports to Taiwan Surge 225% in 1H 2024 Driven by Strong HBM Demand

According to a report from Korean media outlet Yonhap News Agency, South Korea’s memory exports to Taiwan surged by 225% in the first half of 2024.

The primary driver of this increase is reportedly South Korean chipmaker SK hynix’s supply of HBM to U.S. AI chip giant NVIDIA, whose AI accelerators are packaged at Taiwan’s TSMC.

Kim Yang-paeng, a researcher at the Korea Institute for Industrial Economics and Trade, also noted that the sharp increase in exports is likely related to SK hynix’s supplies for TSMC’s final packaging of AI accelerators.

The report from Economic Daily News further highlights the strong momentum in NVIDIA’s AI chip shipments, with TSMC, as the key manufacturing partner, receiving steady advanced process orders.

The report from Yonhap News Agency also cited data from the industry ministry and the Korea International Trade Association released on August 11th, showing that South Korea’s memory exports to Taiwan in the first half of the year grew by 225.7% year-on-year, reaching USD 4.26 billion.

This growth significantly outpaces the overall increase in South Korea’s memory exports, which was 88.7%. Additionally, Taiwan has become South Korea’s third-largest market for memory exports in the first half of the year, climbing two spots to surpass Vietnam and the United States.

Another Korean media outlet, The Korea Herald, noted that since the 2010s, South Korea’s annual memory exports to Taiwan have ranged between USD 1 billion and 4 billion. The latest data indicates that this year’s export volume may set a new record, potentially reaching USD 8 billion.


(Photo credit: SK hynix)

Please note that this article cites information from Yonhap News Agency, Economic Daily News, and The Korea Herald.
2024-08-08

[News] SK hynix CEO: Demand for Memory Chips to Remain Robust until 1H25, Driven by HBM

As demand for memory chips used in AI remains strong, major memory companies are accelerating their pace on HBM3e and HBM4 qualification. Against this backdrop, SK hynix CEO Kwak Noh-jung stated on August 7 that, driven by high demand for memory chips such as high-bandwidth memory (HBM), the market is expected to stay robust until the first half of 2025, according to a report by the Korea Economic Daily.

However, Kwak noted that the momentum beyond 2H25 “remains to be seen,” indicating that the company needs to study market conditions and the supply-demand situation before commenting further. SK hynix clarified that this was not an indication of a possible downturn.

According to the analysis by TrendForce, HBM’s share of total DRAM bit capacity is estimated to rise from 2% in 2023 to 5% in 2024 and surpass 10% by 2025. In terms of market value, HBM is projected to account for more than 20% of the total DRAM market value starting in 2024, potentially exceeding 30% by 2025.

SK hynix, the current HBM market leader, said in its July earnings call that its HBM3e shipments are expected to surpass those of HBM3 in the third quarter, with HBM3e accounting for more than half of total HBM shipments in 2024. In addition, it expects to begin supplying 12-layer HBM3e products to customers in the fourth quarter.

The report notes that for now, the company’s major focus is the sixth-generation HBM chip, HBM4, which is under development in collaboration with foundry giant TSMC. Its 12-layer HBM4 is expected to launch in the second half of next year, according to the report.

Samsung, on the other hand, has been working since last year to become a supplier of NVIDIA’s HBM3 and HBM3e. In late July, Samsung’s HBM3 was reported to have passed NVIDIA’s qualification and to be slated for use in the AI giant’s H20, which was developed for the Chinese market in compliance with U.S. export controls. On August 6, however, the company denied rumors that its 8-layer HBM3e chips had cleared NVIDIA’s tests.

Notably, per a previous report from the South Korean newspaper Korea JoongAng Daily, Micron, which began mass production of HBM3e in February 2024, has recently secured an order from NVIDIA for the H200 AI GPU.


(Photo credit: SK hynix)

Please note that this article cites information from the Korea Economic Daily and Korea JoongAng Daily.
2024-08-07

[News] SK hynix Secures up to USD 450 Million Funding for Indiana Packaging Facility under CHIPS Act

SK hynix, the current High Bandwidth Memory (HBM) market leader, announced on August 6th that it has signed a non-binding preliminary memorandum of terms with the U.S. Department of Commerce to receive up to USD 450 million in proposed direct funding and access to proposed loans of USD 500 million as part of the CHIPS and Science Act. The funding, according to its press release, will be used to build a production base for semiconductor packaging in Indiana.

Earlier in April, the other two memory giants, Samsung and Micron, also secured funding under the CHIPS and Science Act, receiving USD 6.4 billion and USD 6.1 billion, respectively.

SK hynix also noted in its press release that it plans to seek from the U.S. Department of the Treasury a tax benefit equivalent to up to 25% of qualified capital expenditures through the Investment Tax Credit program.

The South Korean memory chip maker also said it will proceed with the construction of the Indiana production base as planned to provide AI memory products, through which it looks forward to contributing to a more resilient global semiconductor supply chain.

The signing follows SK hynix’s announcement in April that it plans to invest USD 3.87 billion to build a production base for advanced packaging in Indiana, a move expected to create around 1,000 jobs. According to a previous report by The Wall Street Journal, the advanced packaging fab is expected to commence operations by 2028.

As the major HBM supplier of AI giant NVIDIA, SK hynix has good reason to accelerate the pace of capacity expansion. The recent NVIDIA Blackwell B200, with each GPU utilizing 8 HBM3e chips, has also underscored SK hynix’s role in the critical components supply chain for the AI industry.
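As a rough illustration of what those eight stacks imply for a B200-class GPU, the arithmetic below assumes 24 GB per 8-layer HBM3e stack, a commonly cited figure that is not stated in the article:

```python
# Back-of-the-envelope HBM capacity for a GPU with 8 HBM3e stacks.
# The 24 GB per-stack figure is an assumption (typical for 8-layer
# HBM3e), not taken from the article.
stacks = 8
gb_per_stack = 24

total_gb = stacks * gb_per_stack
print(f"{stacks} stacks x {gb_per_stack} GB = {total_gb} GB of HBM3e")
# -> 8 stacks x 24 GB = 192 GB of HBM3e
```

Under these assumptions, a single accelerator consumes 192 GB of HBM3e, which helps explain why a ramp in NVIDIA shipments translates so directly into HBM capacity expansion at SK hynix.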

On the other hand, a week earlier, semiconductor equipment leader Applied Materials was reportedly denied funding under the CHIPS Act for an R&D center in Silicon Valley aimed at developing next-generation chipmaking tools. The company had sought U.S. funding for a USD 4 billion facility in Sunnyvale, California, slated to be completed in 2026.


(Photo credit: SK hynix)

Please note that this article cites information from SK hynix and The Wall Street Journal.
2024-08-07

[News] Chinese Tech Giants Reportedly Stockpile Samsung’s HBM ahead of Potential U.S. Restrictions

With the chip war between the two great powers heating up, the U.S. is reportedly mulling new measures to limit China’s access to AI memory. As the restrictions might be imposed as early as late August, rumor has it that Chinese tech giants like Huawei and Baidu, along with other startups, are stockpiling high bandwidth memory (HBM) semiconductors from Samsung Electronics, according to the latest report by Reuters.

Citing sources familiar with the matter, the report notes that these companies have increased their purchases of AI-capable semiconductors since early this year. One source states that, in line with this trend, China contributed around 30% of Samsung’s HBM revenue in 1H24.

Regarding the details of the potential restrictions, sources cited by Reuters said that the U.S. authority is anticipated to establish guidelines for restricting access to HBM chips. While the U.S. Department of Commerce declined to comment, it did state last week that the government is continually evaluating the evolving threat landscape and updating export controls.

The Big Three in the memory sector, Samsung, SK hynix and Micron, are all working on their 4th generation (HBM3) and 5th generation (HBM3e) products, while closely cooperating with AI giants such as NVIDIA and AMD in developing AI accelerators.

Reuters notes that the surging HBM demand from China has recently focused primarily on HBM2e, which is two generations behind HBM3e. However, as other manufacturers’ capacities are already fully booked by American AI companies, China has turned to Samsung for its HBM demand.

Sources cited by Reuters also indicate that a wide range of businesses, from satellite manufacturers to tech firms like Tencent, have been purchasing these HBM chips. Meanwhile, Huawei has been using Samsung HBM2e to produce its advanced Ascend AI chip, according to one of the sources. It is also reported that Chinese memory giant ChangXin Memory Technologies (CXMT) has started mass production of HBM2.

Samsung and SK hynix declined to comment, while Micron, Baidu, Huawei, and Tencent did not respond to requests for comment, Reuters notes.


(Photo credit: Samsung)

Please note that this article cites information from Reuters.
2024-08-06

[News] ChangXin Memory Technologies in China Has Reportedly Begun Mass Production of HBM2

According to a report from Tom’s Hardware citing industry sources, Chinese memory giant ChangXin Memory Technologies (CXMT) has started mass production of HBM2. If confirmed, this would be approximately two years ahead of the expected timeline, although the yield rate for HBM2 is still uncertain.

Earlier, Nikkei reported that CXMT had begun procuring equipment necessary for HBM production, estimating it would take one to two years to achieve mass production. CXMT has since ordered equipment from suppliers in the U.S. and Japan, with American companies Applied Materials and Lam Research having received export licenses.

Reportedly, HBM2 has a per-pin data transfer rate of approximately 2 GT/s to 3.2 GT/s. Producing HBM2 does not require the latest lithography techniques, but it does demand sufficient manufacturing capacity.
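Those per-pin rates translate into per-stack bandwidth with simple arithmetic, since each HBM2 stack exposes a 1024-bit interface under the JEDEC standard. The sketch below is illustrative; the function name is an assumption, and only the 2–3.2 GT/s range comes from the report:

```python
# Peak per-stack bandwidth for HBM2, assuming the standard
# 1024-bit interface per stack defined by JEDEC (JESD235).
BUS_WIDTH_BITS = 1024

def stack_bandwidth_gb_s(rate_gt_s: float) -> float:
    """Peak bandwidth in GB/s for one HBM stack at a given per-pin rate."""
    return rate_gt_s * BUS_WIDTH_BITS / 8  # divide by 8: bits -> bytes

for rate in (2.0, 3.2):  # the per-pin range cited in the report
    print(f"{rate} GT/s per pin -> {stack_bandwidth_gb_s(rate):.1f} GB/s per stack")
```

At 2.0 GT/s this works out to 256 GB/s per stack, and roughly 410 GB/s at 3.2 GT/s, which helps explain why even two-generations-old HBM2 remains attractive for AI and HPC processors.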

The process involves using through-silicon vias (TSVs) to vertically connect memory components, which is rather complex. Even so, packaging HBM KGSD (known good stack die) modules is still less intricate than manufacturing traditional DRAM devices on a 10nm-class process.

CXMT’s DRAM technology is said to be lagging behind that of Micron, Samsung, and SK hynix. These three companies have already started mass production of HBM3 and HBM3e and are preparing to advance to HBM4 in the coming years.

There are also reports indicating that Huawei, the Chinese tech giant subject to U.S. sanctions, aims to collaborate with other local companies to produce HBM2 by 2026. Per a previous report from The Information, a Huawei-led group aiming to produce HBM includes Fujian Jinhua Integrated Circuit.

Moreover, since Huawei’s Ascend 910 series processors use HBM2, it has made HBM2 a crucial technology for advanced AI and HPC processors in China. Therefore, local manufacturing of HBM2 is a significant milestone for the country.


(Photo credit: CXMT)

Please note that this article cites information from Tom’s Hardware, Nikkei, and The Information.
