Despite recent rumors speculating that NVIDIA has canceled the B100 in favor of the B200A, TrendForce reports that NVIDIA remains on track to launch both the B100 and B200 in 2H24, targeting CSP customers. Additionally, a scaled-down B200A is planned for other enterprise clients, focusing on edge AI applications.
With the growing demand for high-speed computing, more effective cooling solutions for AI servers are gaining significant attention. TrendForce's latest report on AI servers reveals that NVIDIA is set to launch its next-generation Blackwell platform by the end of 2024. Major CSPs are expected to start building AI server data centers based on this new platform, potentially driving the penetration rate of liquid cooling solutions to 10%.
TrendForce’s latest report on the memory industry reveals that DRAM and NAND Flash revenues are expected to see significant increases of 75% and 77%, respectively, in 2024, driven by increased bit demand, an improved supply-demand structure, and the rise of high-value products like HBM.
TrendForce’s latest industry report on AI servers reveals that high demand for advanced AI servers from major CSPs and brand clients is expected to continue in 2024. Meanwhile, gradual production expansion by TSMC, SK hynix, Samsung, and Micron has significantly eased shortages in 2Q24. Consequently, the lead time for NVIDIA’s flagship H100 solution has decreased from the previous 40–50 weeks to less than 16 weeks.
The AI hardware boom is in full swing: TrendForce reports that the first half of this year saw a robust increase in AI server orders. Looking ahead, the latter half of the year promises more to come as NVIDIA’s Blackwell GB200 servers and WoA (Windows on Arm) AI-powered notebooks enter mass production and begin shipping in Q3. This surge is driving ODMs to ramp up inventory procurement and setting the stage for a spike in orders and shipments of high-capacitance MLCCs. The result is a welcome stabilization of market prices and a notable uptick in average selling prices (ASPs) for suppliers.