H100


2023-08-16

[News] CoWoS Production Surges at TSMC; UMC, Amkor, and ASE Hasten to Catch Up

According to a report by Taiwan’s Commercial Times, JPMorgan’s latest analysis reveals that AI demand will remain robust in the second half of the year. Encouragingly, TSMC’s CoWoS capacity expansion progress is set to exceed expectations, with production capacity projected to reach 28,000 to 30,000 wafers per month by the end of next year.

The pace of CoWoS capacity expansion is anticipated to accelerate notably in the second half of 2024. This trend isn’t limited to TSMC; other players are also actively expanding their CoWoS-like production capabilities to meet the soaring demands of AI applications.

Gokul Hariharan, Head of Research for JPMorgan Taiwan, highlighted that industry surveys indicate strong and unabated AI demand in the second half of the year. Supply shortfalls of 20% to 30% are being observed, with CoWoS capacity a key bottleneck and high-bandwidth memory (HBM) also in short supply.

JPMorgan’s estimates indicate that Nvidia will account for 60% of the overall CoWoS demand in 2023. TSMC is expected to produce around 1.8 to 1.9 million sets of H100 chips, followed by significant demand from Broadcom, AWS’ Inferentia chips, and Xilinx. Looking ahead to 2024, TSMC’s continuous capacity expansion is projected to supply Nvidia with approximately 4.1 to 4.2 million sets of H100 chips.
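To put these projections in perspective, here is a minimal back-of-envelope sketch in Python. The chips-per-wafer and packaging-yield values are illustrative assumptions, not figures from the article or JPMorgan; only the 28,000 to 30,000 wafers-per-month capacity range and Nvidia’s roughly 60% share of CoWoS demand come from the text above.

```python
# Back-of-envelope: monthly CoWoS wafer capacity -> annual packaged-chip output.
# chips_per_wafer and packaging_yield are illustrative assumptions, not reported
# figures; only the wafer capacity range and the ~60% Nvidia share of CoWoS
# demand come from the article.

def annual_chip_output(wafers_per_month: int,
                       chips_per_wafer: int = 23,     # assumed good dies per CoWoS wafer
                       packaging_yield: float = 0.9,  # assumed end-to-end packaging yield
                       demand_share: float = 0.6):    # Nvidia's ~60% share of CoWoS demand
    """Rough annual chip count attributable to one customer at a given capacity."""
    return wafers_per_month * 12 * chips_per_wafer * packaging_yield * demand_share

for wpm in (28_000, 30_000):
    print(f"{wpm:,} wafers/month -> ~{annual_chip_output(wpm):,.0f} chips/year")
```

With these placeholder values, the end-2024 capacity range lands near the 4.1 to 4.2 million sets cited above, though actual per-wafer die counts and yields will differ.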

Apart from TSMC’s proactive expansion of CoWoS capacity, Hariharan predicts that other assembly and test facilities are also accelerating their expansion of CoWoS-like capacities.

For instance, UMC is preparing to have a monthly capacity of 5,000 to 6,000 wafers for the interposer layer by the latter half of 2024. Amkor is expected to provide a certain capacity for chip-on-wafer stacking technology, and ASE Group will offer chip-on-substrate bonding capacity. However, these additional capacities might face challenges in ramping up for the latest products such as the H100, and may focus more on older-generation products such as the A100 and A800.

(Photo credit: TSMC)

2023-05-25

Server Specification Upgrade: A Bountiful Blue Ocean for ABF Substrates

ChatGPT’s debut has sparked a sweeping spec upgrade in the server market, breathing new life into the supply chain and unlocking major business opportunities. Amid all this, the biggest winners look set to be the suppliers of ABF (Ajinomoto Build-up Film) substrates.

In the previous article, “AI Sparks a Revolution Up In the Cloud,” we explored how the surge in data volumes is driving the spec of AI servers as well as the cost issue that comes with it. This time around, we’ll take a closer look at the crucial GPU and CPU platforms, focusing on how they can transform the ABF substrate market.

NVIDIA’s Dual-Track AI Server Chip Strategy Fuels ABF Consumption

In response to the vast data demands of fast-evolving AI servers, NVIDIA is leading the pack in defining the industry-standard specs.

NVIDIA’s AI servers, geared toward deep learning (DL) and machine learning (ML), typically support 2 CPUs and 4 to 8 GPUs, in contrast to standard GPU servers, where one CPU backs 2 to 6 GPUs. This roughly doubles ABF substrate usage compared with conventional GPU servers.

NVIDIA has devised a dual-track chip strategy, tailoring its offerings to the international and Chinese markets. The primary chip behind ChatGPT is NVIDIA’s A100. For China, in line with U.S. export regulations, the company introduced the A800, which reduces interconnect bandwidth from 600 GB/s (as on the A100) to 400 GB/s.

Its latest H100 GPU, manufactured on TSMC’s 4nm process, boasts AI training performance 9 times greater than its A100 predecessor and inference performance 30 times higher. To accompany the new H100, the H800 was also released for China, with interconnect bandwidth capped at 300 GB/s. Notably, Baidu’s pioneering AI model, Wenxin, employs the A800 chip.

To stay globally competitive in AI, Chinese manufacturers are expected to pursue computational prowess on par with the H100 and A100 by integrating more A800 and H800 chips. This move will boost overall ABF substrate consumption.

Fueled by the chatbot boom, TrendForce predicts a 38.4% YoY increase in AI server shipments for 2023 and a robust CAGR of 22% from 2022 to 2026, significantly outpacing the typical single-digit growth of the overall server market.
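As a quick check on how those two growth rates compound, here is a small Python sketch; the 2022 shipment baseline is normalized to 1.0, and only the growth rates come from TrendForce’s forecast.

```python
# Compound the cited growth rates; the 2022 shipment baseline is normalized to 1.0.
yoy_2023 = 0.384        # 38.4% YoY growth in AI server shipments, 2023
cagr_2022_2026 = 0.22   # 22% CAGR over 2022-2026

shipments_2023 = 1.0 * (1 + yoy_2023)
shipments_2026 = 1.0 * (1 + cagr_2022_2026) ** 4  # four years of compounding

print(f"2023 vs 2022: {shipments_2023:.2f}x")  # ~1.38x
print(f"2026 vs 2022: {shipments_2026:.2f}x")  # ~2.22x, i.e. more than double
```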

AMD, Intel Server Platforms Drive ABF Substrate Demand

Meanwhile, examining AMD and Intel’s high-end server platforms, we can observe how spec upgrades are propelling ABF substrate consumption forward.

  • AMD Zen 4:

Since 2019, AMD’s EPYC server processors, starting with Zen 2, have used chiplet-based multi-chip packaging, which, due to its higher conductivity and cooling demands, has consistently bolstered ABF substrate demand.

  • Intel Eagle Stream:

Intel’s advanced Eagle Stream platform with Sapphire Rapids boasts 40-50% higher computation speed than its predecessor, Whitley, and supports PCIe 5.0, which triggers a 20% uptick in substrate layer count. The platform employs Intel’s 2.5D EMIB (Embedded Multi-die Interconnect Bridge) silicon-bridge technology, integrating multiple chips to minimize signal transmission time.

The Sapphire Rapids lineup includes SPR XCC and the more advanced SPR HBM, with the latter’s ABF substrate area being 30% larger than the previous generation’s. The incorporation of EMIB’s Silicon Bridge within the ABF substrate increases lamination complexity and reduces overall yield. Simply put, for every 1% increase in Eagle Stream’s server market penetration, ABF substrate demand is projected to rise by 2%.

As the upgrades for server-grade ABF substrates continue to advance, production complexity, layer count, and area all increase correspondingly. This implies that the average yield rate might decrease from 60-70% to 40-50%. Therefore, the actual ABF substrate capacity required for future server CPU platforms will likely be more than double that of previous generations.
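To see why the requirement more than doubles, the short sketch below compounds the yield drop with the larger substrate area. The yield midpoints and the 30% area increase come from the figures above; the additional impact of higher layer counts is deliberately left out.

```python
# Lower yield and larger substrate area compound multiplicatively.
old_yield, new_yield = 0.65, 0.45   # midpoints of the 60-70% and 40-50% ranges
area_increase = 1.30                # SPR HBM substrate area ~30% larger

capacity_multiple = (old_yield / new_yield) * area_increase
print(f"Required substrate capacity per good unit: ~{capacity_multiple:.1f}x")  # ~1.9x
```

Adding the effect of the higher layer count on top of this roughly 1.9x figure is what pushes the total requirement past double.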

ABF Substrate Suppliers Riding the Tide

By our estimates, the global ABF substrate market size is set to grow from $9.3 billion in 2023 to $15 billion in 2026 – a CAGR of 17%, underscoring the tremendous growth and ongoing investment potential in the ABF supply chain.
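The implied growth rate can be verified directly from the two market-size figures cited:

```python
# Implied CAGR from the 2023 and 2026 market-size estimates above (USD billions).
market_2023, market_2026 = 9.3, 15.0
cagr = (market_2026 / market_2023) ** (1 / 3) - 1
print(f"Implied CAGR, 2023-2026: {cagr:.1%}")  # ~17%, matching the stated figure
```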

Currently, Taiwanese and Japanese manufacturers account for about 80% of global ABF substrate capacity. Major players such as Japan’s Ibiden and Shinko, Austria’s AT&S, and Taiwan’s Unimicron, Nan Ya, and Kinsus all regard expanding ABF substrate production capacity as a long-term strategy.

As we analyzed in another piece, “Chiplet Design: A Real Game-Changer for Substrates,” ABF substrate capacity expansion remains a solid trend despite recent economic headwinds, underpinned by the robust growth of high-end servers. Hence, the ability to forecast capacity needs precisely while improving production yields will be the key to competitiveness for substrate suppliers.


(Photo Credit: Google)

