The AI wave continues to fuel surging demand for AI chips. Following reports of HBM sellouts and manufacturers ramping up production to meet demand, recent news reveals that Nvidia’s Blackwell architecture GPUs are also in short supply.
Nvidia’s Blackwell GPUs Sold Out for the Next 12 Months
Although Nvidia’s Blackwell architecture GPUs have been delayed until Q4 of this year, the delay hasn’t dampened orders.
According to Tom’s Hardware, Morgan Stanley recently held a three-day meeting in New York with Nvidia CEO Jensen Huang, CFO Colette Kress, and other members of the chipmaker’s management team.
According to Morgan Stanley, Nvidia said that orders for Blackwell architecture GPUs are sold out for the next 12 months; new customers placing orders now won’t receive products until the end of 2025.
Existing customers, including AWS, CoreWeave, Google, Meta, Microsoft, and Oracle, have already purchased all of the Blackwell architecture GPUs that Nvidia and its partner TSMC can produce in the coming quarters.
Industry observers note that demand for high-performance GPUs, and the AI chip market behind them, remains frenetic, and that competition among major AI chip manufacturers such as Nvidia, AMD, and Intel will only intensify.
Three Memory Giants Seize HBM3e Opportunities, Highlighting the Importance of 12-hi Products
Driven by the continuous iteration of high-performance AI chips and the expansion of HBM capacity per system, the demand for HBM bits continues to grow.
At the same time, with the iteration of mainstream GPU products from Nvidia and AMD, as well as changes in HBM specifications, the market will gradually upgrade from HBM3 to HBM3e. The three major memory manufacturers will actively seize HBM3e opportunities.
According to TrendForce, the annual growth rate of HBM demand bits will be close to 200% in 2024 and will double again in 2025.
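Taken together, those two growth rates compound: a ~200% increase in 2024 triples demand, and doubling again in 2025 yields roughly six times the 2023 level. A minimal sketch, using a normalized index (2023 demand = 1.0 is an assumption for illustration, not TrendForce data):

```python
# Illustrative only: normalize 2023 HBM bit demand to 1.0 and apply
# the growth rates cited in the article.
demand_2023 = 1.0
demand_2024 = demand_2023 * (1 + 2.0)  # ~200% annual growth -> 3x
demand_2025 = demand_2024 * 2          # "double again" in 2025 -> 2x

print(demand_2024)  # 3.0
print(demand_2025)  # 6.0
```

In other words, if the cited rates hold, HBM bit demand in 2025 would be on the order of six times its 2023 level.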
TrendForce estimates that, driven by the active adoption of new-generation HBM products by AI platforms, more than 80% of HBM demand bits in 2025 will be for HBM3e-generation products. Of these, 12-hi stacks will account for more than half, making them the mainstream product that major AI manufacturers will compete for in the second half of next year, followed by 8-hi.
Samsung, SK Hynix, and Micron submitted their first batches of HBM3e 12-hi samples between the first half of 2024 and the third quarter, and these are currently undergoing verification. Among the three, SK Hynix and Micron are progressing faster and are expected to complete verification by the end of this year.
(Photo credit: Nvidia)