HBM3e


2024-08-06

[News] Samsung to Benefit from NVIDIA’s Reported Blackwell Delay as Tech Giants Turn to AMD

With AI giant NVIDIA reportedly delaying its upcoming Blackwell series chips by months, pushing their market debut to around early 2025, the related semiconductor supply chain is experiencing a reshuffle. According to a report by the Korea Economic Daily, Samsung Electronics, which is eager to expand its market share for HBM3 and HBM3e, is likely to emerge as a major beneficiary in addition to AMD.

In March, NVIDIA introduced the Blackwell series, claiming it could enable customers to build and operate real-time generative AI on trillion-parameter large language models at up to 25 times lower cost and energy consumption than its predecessor.

However, according to The Information, NVIDIA informed major customers, including Google and Microsoft, last week that shipments of its Blackwell AI accelerators would be delayed by at least three months due to design flaws.

Blackwell Delayed Potentially Due to Design Flaws and TSMC's Capacity Constraints

Tech media Overclocking points out that the defect is related to the component connecting the two GPUs, creating problems for NVIDIA's dual-GPU products, including the B200 and the GB200.

The delay has prompted tech companies to look for alternatives from NVIDIA’s competitors, such as AMD, according to the Korea Economic Daily. Microsoft and Google have already been working on next-generation products with AMD. For instance, Microsoft has purchased the MI300X, an AI accelerator from the US fabless semiconductor designer, the report says.

Samsung to Benefit Thanks to Its Collaboration with AMD

Samsung is expected to benefit, as its HBM3 received certification for AMD's MI300 series in 1Q24 and the company is likely to supply HBM3e chips to AMD going forward. Citing a semiconductor industry source, the Korea Economic Daily notes that since it is very risky for a single company to dominate the AI chip supply chain, the situation will create opportunities for Samsung and AMD.

It is also worth noting that Samsung's HBM3 passed NVIDIA's qualification earlier and is expected to be used in the AI giant's H20, which was developed for the Chinese market in compliance with U.S. export controls.

According to TrendForce's forecast in mid-July, the shipment share of AI servers equipped with self-developed ASIC chips is expected to exceed 25% in 2024, while NVIDIA holds the lion's share at 63.6%. AMD's market share, on the other hand, is projected to reach 8.1% in 2024.


(Photo credit: NVIDIA)

Please note that this article cites information from The Korea Economic Daily, The Information, SemiAnalysis, and Overclocking.
2024-08-06

[News] ChangXin Memory Technologies in China Has Reportedly Begun Mass Production of HBM2

According to a report from Tom’s Hardware citing industry sources, Chinese memory giant ChangXin Memory Technologies (CXMT) has started mass production of HBM2. If confirmed, this would put the company approximately two years ahead of the expected timeline, although the yield rate for its HBM2 remains uncertain.

Nikkei earlier reported that CXMT had begun procuring the equipment necessary for HBM production, estimating it would take one to two years to achieve mass production. Currently, CXMT has ordered equipment from suppliers in the U.S. and Japan, with American companies Applied Materials and Lam Research having received export licenses.

Reportedly, HBM2 has a per-pin data transfer rate of approximately 2 GT/s to 3.2 GT/s. Producing HBM2 does not require the latest lithography techniques, but it does demand sufficient manufacturing capacity.
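
For a rough sense of what those per-pin rates translate to, per-stack bandwidth can be estimated by multiplying the pin rate by the interface width. The sketch below assumes the standard 1024-bit HBM interface per stack; the figures and function are illustrative estimates, not numbers from the cited reports.

```python
# Back-of-the-envelope HBM2 per-stack bandwidth from the per-pin data rate.
# Assumption: a standard 1024-bit HBM interface per stack (not stated in the article).

def hbm2_stack_bandwidth_gb_s(pin_rate_gt_s: float, bus_width_bits: int = 1024) -> float:
    """Peak bandwidth per stack in GB/s: pin rate (GT/s) * bus width (bits) / 8 bits per byte."""
    return pin_rate_gt_s * bus_width_bits / 8

for rate in (2.0, 3.2):
    print(f"{rate} GT/s per pin -> ~{hbm2_stack_bandwidth_gb_s(rate):.0f} GB/s per stack")
# Output: 2.0 GT/s -> ~256 GB/s per stack; 3.2 GT/s -> ~410 GB/s per stack
```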

The process involves using through-silicon vias (TSVs) to vertically connect the memory dies, which is rather complex. However, packaging HBM KGSD (known good stack die) modules is still less intricate than manufacturing traditional DRAM devices on a 10nm-class process.

CXMT’s DRAM technology is said to be lagging behind that of Micron, Samsung, and SK hynix. These three companies have already started mass production of HBM3 and HBM3e and are preparing to advance to HBM4 in the coming years.

There are also reports indicating that Huawei, the Chinese tech giant subject to US sanctions, is looking to collaborate with other local companies to produce HBM2 by 2026. Per a previous report from The Information, the Huawei-led group aiming to produce HBM includes Fujian Jinhua Integrated Circuit.

Moreover, since Huawei’s Ascend 910 series processors use HBM2, the technology has become crucial for advanced AI and HPC processors in China, making local manufacturing of HBM2 a significant milestone for the country.


(Photo credit: CXMT)

Please note that this article cites information from Tom’s Hardware, Nikkei, and The Information.

2024-08-02

[News] Samsung’s Chip Head Raises the Urgency to Reform Company Culture to Avoid Vicious Cycles

According to a report from Bloomberg, Jun Young-hyun, head of Samsung’s chip business, recently sent a stern warning to employees about the need to reform the company’s culture to avoid falling into a vicious cycle.

Jun stated that the recent improvement in Samsung’s performance was due to a rebound in the memory market. To sustain this progress, Samsung must take measures to eliminate communication barriers between departments and stop concealing or avoiding problems.

Earlier this week, Samsung announced its Q2 earnings, showcasing its fastest net profit growth since 2010. However, Jun Young-hyun highlighted several issues that may undermine Samsung’s long-term competitiveness.

He emphasized the need to rebuild the semiconductor division’s culture of vigorous debate, warning that relying solely on market recovery without restoring fundamental competitiveness would lead to a vicious cycle and repeating past mistakes.

Samsung is still striving to close the gap with its competitors. The company is working to improve the maturity of its 2nm process to meet the high-performance, low-power demands of advanced nodes. Samsung’s first-generation 3nm GAA process has achieved yield maturity and is set for mass production in the second half of the year.

In memory, Samsung is beginning to narrow the gap with SK Hynix in high-bandwidth memory (HBM). According to Bloomberg, Samsung has received certification for HBM3 chips from NVIDIA and expects to gain certification for the next-generation HBM3e within two to four months.

Jun emphasized that although Samsung is in a challenging situation, he is confident that with accumulated experience and technology, the company can quickly regain its competitive edge.


(Photo credit: Samsung)

Please note that this article cites information from Samsung and Bloomberg.
2024-08-01

[News] Samsung’s 8-layer HBM3e to Start Mass Production in Q3, Driving HBM Sales to Soar 3-5 Times in 2H24

Samsung Electronics, which has been surrounded by concerns that its HBM3e products are still struggling to pass NVIDIA’s qualifications, confirmed in its second-quarter earnings call that the company’s fifth-generation 8-layer HBM3e is currently undergoing customer evaluation and is scheduled to enter mass production in the third quarter, according to a report by Business Korea.

TrendForce notes that Samsung’s recent progress on HBM3e qualification appears solid, and both its 8-hi and 12-hi products can be expected to pass qualification in the near future. The company is eager to win HBM market share from SK hynix, so its 1-alpha capacity has been reserved for HBM3e. TrendForce believes Samsung will become a very important supplier in the HBM category.

Driven by this momentum, the Business Korea report, citing an official speaking on the July 31st conference call, states that the share of HBM3e chips within Samsung’s HBM shipments is anticipated to surpass the mid-10 percent range in the third quarter and is projected to grow rapidly to 60% by the fourth quarter.

According to Samsung, its HBM sales in the second quarter already grew by around 50% from the previous quarter. Being ambitious about its HBM3 and HBM3e sales, Samsung projects its HBM sales will increase three to five times in the second half of 2024, roughly doubling each quarter.

Samsung has already taken a big leap in HBM, as its HBM3 chips are said to have been cleared by NVIDIA last week. According to a previous report by Reuters, Samsung’s HBM3 will initially be used exclusively in the AI giant’s H20, which is tailored for the Chinese market.

On the other hand, the South Korean memory giant notes that it has completed the preparations for volume production of its 12-layer HBM3e chips. The company plans to expand the supply in the second half of 2024 to meet the schedules requested by multiple customers, according to Business Korea. The progress of its sixth-generation HBM4 is also on track, scheduled to begin shipping in the second half of 2025, Business Korea notes.

Samsung Electronics reported higher-than-expected financial results in the second quarter, with a six-fold year-on-year increase in net income, soaring from KRW 1.55 trillion (USD 1.12 billion) to KRW 9.64 trillion (USD 6.96 billion), as demand for its advanced memory chips, which are crucial for AI training, remained strong.

SK hynix, the current HBM market leader, has expressed optimism about defending its throne as well. The company reportedly expects its HBM3e shipments to surpass those of HBM3 in the third quarter, with HBM3e accounting for more than half of its total HBM shipments in 2024. In addition, it expects to begin supplying 12-layer HBM3e products to customers in the fourth quarter.

Micron, on the other hand, reportedly started mass production of 8-layer HBM3e as early as February. The company reportedly plans to complete preparations for mass production of 12-layer HBM3e in the second half of the year and supply it to major customers like NVIDIA in 2025.


(Photo credit: Samsung)

Please note that this article cites information from Business Korea and Reuters.
2024-08-01

[News] US Reportedly Weighs Stricter Limits on AI Memory Access for China

According to a report from Bloomberg, the US is reportedly considering new measures and could unilaterally impose restrictions on China as early as late August. These measures would limit China’s access to AI memory and related equipment capable of producing them.

Moreover, another report from Reuters indicates that shipments from US allies, including semiconductor equipment manufacturers in Japan, the Netherlands, and South Korea, such as major Dutch equipment maker ASML and Tokyo Electron, will not be affected. The report also notes that the countries whose exports will be impacted include Israel, Taiwan, Singapore, and Malaysia.

Bloomberg, citing sources, revealed that the purpose of these measures is to prevent major memory manufacturers like Micron, SK hynix, and Samsung Electronics from selling high-bandwidth memory (HBM) to China.

These three companies dominate the global HBM market. Regarding the matter, Micron reportedly declined to comment, while Samsung and SK hynix did not immediately respond to requests for comment.

Bloomberg’s source also emphasized that the US has not yet made a final decision. The source also stated that, if implemented, the new measures would cover chips such as HBM2, HBM3, and HBM3e, as well as the equipment needed to manufacture them.

The source further revealed that Micron will essentially not be affected by the new regulations, as Micron stopped exporting HBM to China after China banned Micron’s memory from being used in critical infrastructure in 2023.

Reportedly, it is still unclear what methods the US will use to restrict South Korean companies. One possibility is the Foreign Direct Product Rule (FDPR). Under this rule, if a foreign-made product uses any US technology, even just a small amount, the US can impose restrictions.

Both SK hynix and Samsung are said to be relying on chip design software and equipment from US companies such as Cadence Design Systems and Applied Materials.


(Photo credit: SK hynix)

Please note that this article cites information from Bloomberg and Reuters.

