Industry sources report that TSMC expects AI orders to account for a significantly larger share of its 2024 revenue, driven by increased demand for wafer starts from its six key AI customer groups in the coming year.
These six major AI customer groups encompass NVIDIA, AMD, Tesla, Apple, Intel, and international giants that develop AI chips in-house and entrust TSMC with production. Orders in this domain continue to heat up, not only benefiting TSMC but also signaling a robust year ahead for AI server manufacturers such as Quanta and Wistron.
TSMC traditionally refrains from commenting on specific customers and remained silent on market speculation on October 10th. Meanwhile, AI server manufacturers, including Quanta and Wistron, hold a positive outlook for the upcoming year, expecting a continued upward trend in AI-related business.
As demand for AI wafer starts from key customers intensifies, market watchers are keenly focused on TSMC’s investor conference on October 19th. Attention centers on whether TSMC will revise its July forecast by further raising the compound annual growth rate (CAGR) projected for AI-related product revenue over the next five years.
TSMC defines server AI processors as those handling training and inference functions, including CPUs, GPUs, and AI accelerators; this category accounts for approximately 6% of TSMC’s total revenue. At its July investor conference, TSMC projected that demand for AI-related products would grow at a CAGR of nearly 50% over the next five years, pushing their share of revenue into the low-teens range.
(Photo credit: TSMC)