Nvidia


2024-05-31

[News] US Reportedly Slows NVIDIA and AMD from Selling AI Chips to the Middle East

According to a report from Bloomberg, US officials have slowed down the issuance of licenses for chip manufacturers like NVIDIA and AMD to export large quantities of AI accelerators to the Middle East. Meanwhile, officials are conducting a national security assessment of AI developments in the region.

As per Bloomberg’s report, it remains unclear how long the assessment will take and what exactly constitutes a large-scale export. The sources said officials are particularly concerned about large-scale sales because countries like the UAE and Saudi Arabia are looking to import significant quantities of chips for AI data centers.

AI accelerators can help data centers process the massive amounts of information required for developing AI chatbots and other tools. They have become essential equipment for companies and governments seeking to build AI infrastructure.

Sources reportedly revealed that the slowdown in exports is intended to give Washington time to formulate a comprehensive strategy on how advanced chips should be deployed overseas. Some of the sources said this includes negotiating who will manage and secure the facilities used to train AI models.

The US Department of Commerce said in a statement that “protecting national security” is its top priority.

“With regards to the most cutting edge technologies, we conduct extensive due diligence through an interagency process, thoroughly reviewing license applications from applicants who intend to ship these advanced technologies around the world,” a representative for the department said. “As always, we remain committed to working with our partners in the Middle East and around the world to safeguard our technological ecosystem.”

Addressing similar national security concerns, the U.S. government reportedly revoked licenses earlier this month that had allowed Intel and Qualcomm to supply semiconductor chips used in laptops and handsets to Huawei. According to Reuters, citing sources, some companies received notices on May 7th, and the revocation took immediate effect.


(Photo credit: NVIDIA)

Please note that this article cites information from Bloomberg and Reuters.

2024-05-30

[News] SK hynix Discloses Details Regarding HBM4e, Reportedly Integrating Computing and Caching Functionalities

As the demand for AI chips keeps booming, memory giants have been aggressive in their HBM roadmaps. SK hynix, with its leading market position in HBM3e, has now revealed more details regarding HBM4e. According to reports by Wccftech and ET News, SK hynix plans to further distinguish itself by introducing an HBM variant capable of supporting multiple functionalities including computing, caching, and network memory.

While this concept is still in the early stages, SK hynix has begun acquiring semiconductor design IPs to support its objectives, the aforementioned reports noted.

According to ET News, the memory giant intends to establish the groundwork for a versatile HBM with its forthcoming HBM4 architecture. The company reportedly plans to integrate a memory controller onboard, paving the way for new computing capabilities with its 7th-generation HBM4e memory.

By employing SK hynix’s technique, the package will become a unified unit. This will not only ensure faster transfer speeds due to significantly reduced structural gaps but also lead to higher power efficiencies, according to the reports.

Previously in April, SK hynix announced that it had been collaborating with TSMC to produce next-generation HBM and enhance logic-HBM integration through advanced packaging technology. Under the initiative, the company plans to proceed with the development of HBM4, slated for mass production from 2026.

With more HBM4 details now revealed, the memory heavyweight appears set to extend its leading market position in HBM3 by addressing the semiconductor aspect of the HBM structure, Wccftech said.

According to TrendForce’s earlier analysis, as of early 2024 the HBM market is primarily focused on HBM3. NVIDIA’s upcoming B100 and H200 models will incorporate the more advanced HBM3e, signaling the next step in memory technology. The current HBM3 supply for NVIDIA’s H100 solution is primarily met by SK hynix, leading to a supply shortfall in meeting burgeoning AI market demand.

In late May, SK hynix disclosed yield details regarding HBM3e for the first time. According to a report from the Financial Times, the memory giant has cut the time needed for mass production of HBM3e chips by 50%, while coming close to achieving its target yield of 80%.


(Photo credit: SK hynix)

Please note that this article cites information from Wccftech and ET News.

2024-05-29

[News] Rumors Hint at Samsung Losing HBM Edge Due to Talent Shift to SK Hynix; SK Hynix Denies the Claims

Samsung’s HBM, according to a report from TechNews, has yet to pass certification by GPU giant NVIDIA, causing it to fall behind its competitor SK Hynix. As a result, the head of Samsung’s semiconductor division was replaced. Although Samsung denies any issues with their HBM and emphasizes close collaboration with partners, TechNews, citing market sources, indicates that Samsung has indeed suffered a setback.

Samsung invested early in HBM development and collaborated with NVIDIA on HBM and HBM2, but sales were modest. Eventually, the HBM team, according to TechNews’ report, moved to SK Hynix to develop HBM products. Unexpectedly, the surge in generative AI led to a sharp increase in HBM demand, and SK Hynix, benefitting from the trend, seized the opportunity with the help of the team.

Yet, in response to the rumors about changes in the HBM team, SK Hynix has denied both the claim that it developed HBM with the help of a Samsung team and the claim that Samsung’s HBM team transferred to SK Hynix, emphasizing that its HBM was developed solely by its own engineers.

Samsung’s misfortune is evident; despite years of effort, it faced setbacks just as the market took off, and it must now find alternative ways to catch up. Still, the market needs Samsung, as noted by Wallace C. Kou, President of memory IC design giant Silicon Motion.

Kou reportedly stated that Samsung remains the largest memory producer, and as NVIDIA faces a supply shortage for AI chips, the GPU giant is keen to cooperate with more suppliers. Therefore, it’s only a matter of time before Samsung supplies HBM to NVIDIA.

Furthermore, Samsung indicated in a recent statement that it is conducting HBM tests with multiple partners to ensure quality and reliability.

In the statement, Samsung indicated that it is optimizing its products through close collaboration with customers, with testing proceeding smoothly and as planned. As HBM is a customized memory product, it requires optimization in line with customers’ needs.

Samsung also states that it is currently partnering closely with various companies to continuously test technology and performance, and to thoroughly verify the quality and performance of its HBM.

On the other hand, NVIDIA has various GPUs adopting HBM3e, including H200, B200, B100, and GB200. Although all of them require HBM3e stacking, their power consumption and heat dissipation requirements differ. Samsung’s HBM3e may be more suitable for H200, B200, and AMD Instinct MI350X.


(Photo credit: SK Hynix)

Please note that this article cites information from TechNews.

2024-05-24

[News] NVIDIA Reportedly Facing Price Cut Pressure of H20 Chip in China Amid Competition with Huawei

In response to US export bans, NVIDIA, the global leader in AI chips, began selling the H20, its AI chip tailored for the Chinese market, earlier this year. However, an oversupply has pushed the chip’s price below that of its rival Huawei’s offering, in some cases at a discount of more than 10%, according to the latest report by Reuters.

In late 2022, the US Department of Commerce restricted the export of NVIDIA AI chips to China due to concerns about their potential military use. In response, NVIDIA has repeatedly reduced product performance to comply with US regulations. The H20 chip, derived from the H800, is specifically designed as a “special edition” for the Chinese market.

However, citing sources familiar with the matter, Reuters noted that, due to abundant supply, H20 chips are being sold at a discount of over 10% to Huawei’s Ascend 910B, the most powerful AI chip from the Chinese tech giant.

The H20 reportedly sells for approximately 100,000 yuan per unit, while Huawei’s 910B sells for over 120,000 yuan per unit.

The decreasing prices underscore the difficulties NVIDIA encounters in its China operations amid U.S. sanctions on AI chip exports and rising competition from local rivals.

According to a previous report by The Information, citing sources, major tech companies such as Alibaba, Baidu, ByteDance, and Tencent have been instructed to reduce their spending on foreign-made chips like NVIDIA’s.

(Photo credit: Huawei)

Please note that this article cites information from Reuters and The Information.

2024-05-24

[News] NVIDIA’s Packaging Advancement Boosts Taiwanese Supply Chain with Emerging Opportunities in Panel-Level Fan-Out Packaging

To alleviate the capacity constraints of CoWoS advanced packaging, NVIDIA is reportedly planning to accelerate the introduction of its GB200 into panel-level fan-out packaging. According to a report from Economic Daily News, the shift, originally scheduled for 2026, has been moved up to 2025, sparking opportunities in the panel-level fan-out packaging sector.

Taiwanese companies like Powertech Technology Inc. (PTI) and AU Optronics (AUO) are said to have the necessary capabilities in place and are expected to seize this market opportunity.

The sources cited by the report from Economic Daily News explain that fan-out packaging has two branches: wafer-level fan-out packaging (FOWLP) and panel-level fan-out packaging (FOPLP). Among Taiwanese packaging and testing companies, PTI is reportedly the fastest in deploying panel-level fan-out packaging.

To capture the high-end logic chip packaging market, PTI has fully dedicated its Hsinchu Plant 3 to panel-level fan-out packaging and TSV CIS (CMOS image sensors) technologies, emphasizing that fan-out packaging can achieve heterogeneous integration of ICs.

PTI previously expressed optimism about the opportunities presented by the era of panel-level fan-out packaging, noting that it can produce chip areas two to three times larger than wafer-level fan-out packaging.

Innolux, a major panel manufacturer, is also optimistic, forecasting that 2024 will be the group’s inaugural year of advanced packaging mass production. The first-phase capacity of its fan-out panel-level packaging (FOPLP) production line has already been fully booked, with mass production and shipments scheduled to begin in the third quarter of this year.

Innolux Chairman Jim Hung emphasized that panel-level packaging (PLP) technology connects chips through redistribution layers (RDL), meeting the requirements for high reliability, high power output, and high-quality packaging products. The technology has secured process and reliability certifications from top-tier customers, its yield rates have been well received, and mass production is set to commence this year.


(Photo credit: NVIDIA)

Please note that this article cites information from Economic Daily News.
