News
According to a report from Reuters citing industry sources, Samsung Electronics’ fifth-generation high-bandwidth memory (HBM3e) has passed tests by NVIDIA and could be used in NVIDIA’s AI processors.
The report further indicates that while no supply contract has been signed yet, one is expected soon, with potential deliveries starting in the fourth quarter of this year. The news also notes that the tested HBM3e chips are 8-layer, while Samsung’s 12-layer HBM3e has yet to pass testing.
However, in response to the matter, Samsung Electronics stated in a report from BusinessKorea on August 7 that it could not confirm stories related to its customers and that the report was not true.
The Samsung Electronics official cited by BusinessKorea also mentioned that, as previously stated during a conference call last month, the quality testing is still ongoing and there have been no updates since then.
Samsung has been working since last year to become a supplier of HBM3 and HBM3e to NVIDIA. In late July, Samsung’s HBM3 was said to have passed NVIDIA’s qualification and is expected to be used in the AI giant’s H20, which has been developed for the Chinese market in compliance with U.S. export controls.
(Photo credit: Samsung)
News
With the chip war between the two great powers heating up, the U.S. is reportedly mulling new measures to limit China’s access to AI memory. As the restrictions might be imposed as early as late August, rumor has it that Chinese tech giants like Huawei and Baidu, along with startups, are stockpiling high-bandwidth memory (HBM) semiconductors from Samsung Electronics, according to the latest report by Reuters.
Citing sources familiar with the matter, the report notes that these companies have increased their purchases of AI-capable semiconductors since early this year. One source states that, in line with this trend, China accounted for around 30% of Samsung’s HBM revenue in 1H24.
Regarding the details of the potential restrictions, sources cited by Reuters said that U.S. authorities are anticipated to establish guidelines for restricting access to HBM chips. While the U.S. Department of Commerce declined to comment, it did state last week that the government is continually evaluating the evolving threat landscape and updating export controls.
The Big Three in the memory sector, Samsung, SK hynix and Micron, are all working on their 4th generation (HBM3) and 5th generation (HBM3e) products, while closely cooperating with AI giants such as NVIDIA and AMD in developing AI accelerators.
Reuters notes that the recent surge in HBM demand from China has primarily focused on HBM2e, which is two generations behind HBM3e. However, as other manufacturers’ capacities are already fully booked by American AI companies, China has turned to Samsung for its HBM demand.
Sources cited by Reuters also indicate that a wide range of businesses, from satellite manufacturers to tech firms like Tencent, have been purchasing these HBM chips. Meanwhile, Huawei has been using Samsung HBM2e to produce its advanced Ascend AI chip, according to one of the sources. It is also reported that Chinese memory giant ChangXin Memory Technologies (CXMT) has started mass production of HBM2.
Samsung and SK hynix declined to comment, while Micron, Baidu, Huawei, and Tencent did not respond to requests for comment, Reuters notes.
(Photo credit: Samsung)
News
As AI giant NVIDIA is said to have delayed its upcoming Blackwell series chips by months, with the products now expected to hit the market around early 2025, the related semiconductor supply chain is experiencing a reshuffle. According to a report by the Korea Economic Daily, Samsung Electronics, which is eager to expand its market share for HBM3 and HBM3e, is likely to emerge as a major beneficiary in addition to AMD.
In March, NVIDIA introduced the Blackwell series, claiming it could enable customers to build and operate real-time generative AI on trillion-parameter large language models at up to 25 times less cost and energy consumption compared to its predecessor.
However, according to The Information, NVIDIA informed major customers, including Google and Microsoft, last week that shipments of its Blackwell AI accelerators would be delayed by at least three months due to design flaws.
Blackwell Delayed Potentially due to Design Flaws and TSMC’s Capacity Constraints
Tech media Overclocking points out that the defect is related to the part connecting the two GPUs, creating problems for NVIDIA’s dual-GPU versions, including the B200 and the GB200.
The delay has prompted tech companies to look for alternatives from NVIDIA’s competitors, such as AMD, according to the Korea Economic Daily. Microsoft and Google have already been working on next-generation products with AMD. For instance, Microsoft has purchased the MI300X, an AI accelerator from the US fabless semiconductor designer, the report says.
Samsung to Benefit Thanks to Collaboration with AMD
Samsung is expected to benefit, as its HBM3 received certification for AMD’s MI300 series in 1Q24 and it is likely to supply HBM3e chips to AMD afterwards. Citing a semiconductor industry source, the Korea Economic Daily notes that as it is very risky for a single company to dominate the AI chip supply chain, the situation will create opportunities for Samsung and AMD.
It is also worth noting that Samsung’s HBM3 reportedly passed NVIDIA’s qualification earlier and is expected to be used in the AI giant’s H20, which has been developed for the Chinese market in compliance with U.S. export controls.
According to TrendForce’s forecast in mid-July, the shipment share of AI servers equipped with self-developed ASIC chips in 2024 is expected to exceed 25%, with NVIDIA holding the lion’s share at 63.6%. AMD’s market share, on the other hand, is projected to reach 8.1% in 2024.
(Photo credit: NVIDIA)
News
According to a report from Bloomberg, Jun Young-hyun, head of Samsung’s chip business, recently sent a stern warning to employees about the need to reform the company’s culture to avoid falling into a vicious cycle.
Jun stated that the recent improvement in Samsung’s performance was due to a rebound in the memory market. To sustain this progress, Samsung must take measures to eliminate communication barriers between departments and stop concealing or avoiding problems.
Earlier this week, Samsung announced its Q2 earnings, showcasing the fastest net profit growth since 2010. However, Jun Young-hyun highlighted several issues which may undermine Samsung’s long-term competitiveness.
He emphasized the need to rebuild the semiconductor division’s culture of vigorous debate, warning that relying solely on market recovery without restoring fundamental competitiveness would lead to a vicious cycle and repeating past mistakes.
Samsung is still striving to close the gap with its competitors. The company is working to improve the maturity of its 2nm process to meet the high-performance, low-power demands of advanced nodes. Samsung’s first-generation 3nm GAA process has achieved yield maturity and is set for mass production in the second half of the year.
In memory, Samsung is beginning to narrow the gap with SK Hynix in high-bandwidth memory (HBM). According to Bloomberg, Samsung has received certification for HBM3 chips from NVIDIA and expects to gain certification for the next-generation HBM3e within two to four months.
Jun emphasized that although Samsung is in a challenging situation, he is confident that with accumulated experience and technology, the company can quickly regain its competitive edge.
(Photo credit: Samsung)
News
Samsung Electronics, which has been surrounded by concerns that its HBM3e products are still struggling to pass NVIDIA’s qualifications, confirmed in its second-quarter earnings call that the company’s fifth-generation 8-layer HBM3e is currently undergoing customer evaluation and is scheduled to enter mass production in the third quarter, according to a report by Business Korea.
TrendForce notes that Samsung’s recent progress on HBM3e qualification appears solid, and both the 8-hi and 12-hi products can be expected to be qualified in the near future. The company is eager to win a larger share of the HBM market from SK hynix, so its 1alpha capacity has been reserved for HBM3e. TrendForce believes Samsung will become a very important supplier in the HBM category.
Driven by this momentum, the report from Business Korea, citing an official speaking at the conference call on July 31st, states that the share of HBM3e chips within Samsung’s HBM shipments is anticipated to surpass the mid-10 percent range in the third quarter. Moreover, it is projected to grow rapidly to 60% by the fourth quarter.
According to Samsung, its HBM sales in the second quarter already grew by around 50% from the previous quarter. Ambitious about its HBM3 and HBM3e sales, Samsung projects that its HBM sales will increase three to five times in the second half of 2024, driven by a steep rise of roughly twofold each quarter.
Samsung has already taken a big leap on HBM as its HBM3 chips are said to have been cleared by NVIDIA last week. According to a previous report by Reuters, Samsung’s HBM3 will initially be used exclusively in the AI giant’s H20, which is tailored for the Chinese market.
On the other hand, the South Korean memory giant notes that it has completed the preparations for volume production of its 12-layer HBM3e chips. The company plans to expand the supply in the second half of 2024 to meet the schedules requested by multiple customers, according to Business Korea. The progress of its sixth-generation HBM4 is also on track, scheduled to begin shipping in the second half of 2025, Business Korea notes.
Samsung Electronics reported higher-than-expected financial results in the second quarter, with a roughly six-fold year-on-year increase in net income, soaring from KRW 1.55 trillion (USD 1.12 billion) to KRW 9.64 trillion (USD 6.96 billion), as demand for its advanced memory chips crucial for AI training remained strong.
SK hynix, the current HBM market leader, has expressed optimism about defending its throne as well. The company reportedly expects its HBM3e shipments to surpass those of HBM3 in the third quarter, with HBM3e accounting for more than half of total HBM shipments in 2024. In addition, it expects to begin supplying 12-layer HBM3e products to customers in the fourth quarter.
Micron, on the other hand, reportedly started mass production of 8-layer HBM3e as early as February. The company reportedly plans to complete preparations for mass production of 12-layer HBM3e in the second half and supply it to major customers like NVIDIA in 2025.
(Photo credit: Samsung)