News
According to a Reuters report citing sources, NVIDIA is planning to design a new flagship AI chip tailored for the Chinese market that will still comply with current U.S. export control regulations.
NVIDIA, the global AI chip giant, unveiled its Blackwell chip series in March this year, with mass production expected to start later this year. The B200 chip in this series boasts powerful performance, capable of handling chatbot response tasks up to 30 times faster than the previous generation.
The sources cited by Reuters further point out that NVIDIA will collaborate with China’s Inspur to launch and sell this chip, tentatively codenamed B20. Inspur is one of NVIDIA’s primary distribution partners in China.
An NVIDIA spokesperson has declined to comment on the news, and Inspur has not issued any statement either.
The U.S. government, citing national security concerns, began strictly tightening controls on the export of advanced semiconductors to China in 2023. Since then, NVIDIA has released three chips specifically for the Chinese market.
Per a previous TechNews report citing industry sources, the US is also expected to significantly escalate the trade war after the presidential election, intensifying export restrictions on China.
It is noteworthy that the US government previously announced new or increased tariffs on Chinese electric vehicles, semiconductors, lithium batteries, and other products, with the semiconductor tariff rate set to rise from 25% to 50% by 2025. Given this trajectory, chips manufactured in Taiwan and South Korea may also face tariffs.
(Photo credit: NVIDIA)
According to a report from TechNews, Samsung’s HBM has yet to pass certification by GPU giant NVIDIA, leaving the company behind its competitor SK Hynix; as a result, the head of Samsung’s semiconductor division was replaced. Although Samsung denies any issues with its HBM and emphasizes close collaboration with partners, TechNews, citing market sources, indicates that Samsung has indeed suffered a setback.
Samsung invested early in HBM development and collaborated with NVIDIA on HBM and HBM2, but sales were modest. According to TechNews’ report, the HBM team eventually moved to SK Hynix to develop HBM products there. Unexpectedly, the surge in generative AI drove a sharp increase in HBM demand, and SK Hynix, benefitting from the trend, seized the opportunity with the team’s help.
Yet, in response to the rumors about the HBM team’s move, SK Hynix has denied both the claim that it developed HBM with the help of a Samsung team and the claim that Samsung’s HBM team transferred to SK Hynix, emphasizing that its HBM was developed solely by its own engineers.
Samsung’s misfortune is evident: despite years of effort, it faced setbacks just as the market took off, and it must now find other ways to catch up. Still, the market needs Samsung, as noted by Wallace C. Kou, President of memory IC design giant Silicon Motion.
Kou reportedly stated that Samsung remains the largest memory producer, and as NVIDIA faces a supply shortage for AI chips, the GPU giant is keen to cooperate with more suppliers. Therefore, it’s only a matter of time before Samsung supplies HBM to NVIDIA.
Furthermore, Samsung indicated in a recent statement that it is conducting HBM tests with multiple partners to ensure quality and reliability.
In the statement, Samsung notes that it is optimizing its products through close collaboration with customers, with testing proceeding smoothly and as planned. As HBM is a customized memory product, it requires optimization in line with customers’ needs.
Samsung also states that it is partnering closely with various companies to continuously test technology and performance, and to thoroughly verify the quality and performance of its HBM.
On the other hand, NVIDIA has various GPUs adopting HBM3e, including H200, B200, B100, and GB200. Although all of them require HBM3e stacking, their power consumption and heat dissipation requirements differ. Samsung’s HBM3e may be more suitable for H200, B200, and AMD Instinct MI350X.
(Photo credit: SK Hynix)