Nvidia


2024-05-09

[COMPUTEX 2024] The Rise of Generative AI Sparks Innovation across Industries, with Taiwan-based Companies Leading as Essential Partners in the Global Supply Chain

“The Dawn of Generative AI Has Come!” This new chapter in human technological evolution was first proclaimed by NVIDIA’s founder, Jensen Huang. Qualcomm’s CEO, Cristiano Amon, shares this optimism: he believes generative AI is evolving rapidly, is already being adopted in applications such as mobile devices, and has the potential to radically transform the landscape of the smartphone industry. Similarly, Intel has declared the arrival of the “AI PC” era, signaling a major shift in computing technologies and applications.

COMPUTEX 2024, the global showcase of AIoT and startup innovations, will run from June 4th to June 7th. This year’s theme, “Connecting AI,” reflects the transformative power of generative AI and Taiwan’s pivotal role in driving innovation across industries.

This year, AI is transitioning from cloud computing to on-device computing. Various “AI PCs” and “AI smartphones” are entering the market, giving consumers a wide range of choices. 2024 is even being referred to as the “Year of the AI PC,” with brands such as Asus, Acer, Dell, Lenovo, and LG actively releasing new products to capture market share. With the rapid rise of AI PCs and AI smartphones, revolutionary changes are expected in workplaces and daily life. Furthermore, the PC and smartphone industries are expected to be reinvigorated by new sources of demand.

An AI PC refers to a laptop (notebook) computer capable of performing on-device AI computations. Its main difference from regular office or business laptops lies in its CPU, which includes an additional neural processing unit (NPU). Examples of AI CPUs include Intel’s Core Ultra series and AMD’s Ryzen 8040 series. Additionally, AI PCs come with more DRAM to meet the demands of AI computations, thereby supporting related applications like those involving machine learning.

Microsoft’s role is crucial in this context, as the company has introduced a conversational AI assistant called “Copilot” that aims to integrate itself seamlessly into tasks such as working on Microsoft Office documents, video calls, web browsing, and other collaborative activities. With Copilot, PCs can now include a dedicated AI shortcut key on the keyboard, giving users one-touch access to a holistic collaborative relationship with AI.

In the future, more computer functions will continue to be optimized with AI. Moreover, the barriers facing services such as ChatGPT, which still require an internet connection, are expected to fall, so AI-based apps on PCs could one day run entirely offline. This capability is among the most eagerly awaited features for PC users this year.

Surging Development of LLMs Worldwide Has Led to a Massive Increase in AI Server Shipments

AI-enabled applications are not limited to PCs and smartphones. For example, an increasing number of cloud companies have started providing services that leverage AI in various domains, including passenger cars, household appliances, home security devices, wearable devices, headphones, cameras, speakers, TVs, etc. These services often involve processing voice commands and answering questions using technologies like ChatGPT. Going forward, AI-enabled applications will become ubiquitous in people’s daily lives.

Not to be overlooked is the fact that, as countries and multinational enterprises continue to develop their large language models (LLMs), the demand for AI servers will increase and thus promote overall market growth. Furthermore, edge AI servers are expected to become a major growth contributor in the future as well. Smaller businesses are more likely to deploy more modestly scaled LLMs for their applications, and are therefore more likely to adopt lower-priced AI chips that offer excellent cost-to-performance ratios.

TrendForce projects that shipments of AI servers, including models equipped with GPUs, FPGAs, and ASICs, will reach 1.655 million units in 2024, marking a growth of 40.2% compared with the 2023 figure. Furthermore, the share of AI servers in the overall server shipments for 2024 is projected to surpass 12%.
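As a quick sanity check, the cited growth figures can be inverted to recover the implied 2023 baseline and the implied total server market. The sketch below derives those numbers from the article's figures; the derived values are illustrative back-of-the-envelope results, not figures stated by TrendForce.

```python
# Back-of-the-envelope check on the cited TrendForce figures (illustrative only;
# the 2023 baseline and total-market size below are derived, not from the article).
shipments_2024 = 1_655_000   # projected 2024 AI server shipments (units)
yoy_growth = 0.402           # stated growth vs. 2023

implied_2023 = shipments_2024 / (1 + yoy_growth)   # implied 2023 AI server shipments
implied_total_2024 = shipments_2024 / 0.12         # implied total market if AI share is ~12%

print(f"Implied 2023 AI server shipments: {implied_2023:,.0f}")       # ≈ 1.18 million
print(f"Implied 2024 total server shipments: {implied_total_2024:,.0f}")  # ≈ 13.8 million
```

In other words, a >12% AI share at 1.655 million units implies a total server market on the order of 13 to 14 million units for 2024.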

Regarding AI chip development in 2024, the focus is on the competition among the B100, MI300, and Gaudi series, released by NVIDIA, AMD, and Intel respectively. Beyond these chips, another significant highlight of the year is the emergence of in-house designed chips or ASICs from cloud service providers.

In addition to AI chips, the development of AI on PCs and smartphones is certainly another major driving force behind the technology sector in 2024. In the market for CPUs used in AI PCs, Intel’s Core Ultra series and AMD’s Ryzen 8000G series are expected to make a notable impact. The Snapdragon X Elite from Qualcomm has also garnered significant attention as it could potentially alter the competitive landscape in the near future.

Turning to the market for SoCs used in AI smartphones, the fierce competition between Qualcomm’s Snapdragon 8 Gen 3 and MediaTek’s Dimensity 9300 series is a key indicator. Another development that warrants attention is the adoption of AI chips in automotive hardware, such as infotainment systems and advanced driver assistance systems. The automotive market is undoubtedly one of the main battlegrounds among chip suppliers this year.

The supply chain in Taiwan has played a crucial role in providing the hardware that supports the advancement of AI-related technologies. When looking at various sections of the AI ecosystem, including chip manufacturing as well as the supply chains for AI servers and AI PCs, Taiwan-based companies have been important contributors.

Taiwan-based Companies in the Supply Chain Stand Ready for the Coming Wave of AI-related Demand

In the upstream of the supply chain, semiconductor foundries and OSAT providers such as TSMC, UMC, and ASE have always been key suppliers. As for ODMs or OEMs, companies including Wistron, Wiwynn, Inventec, Quanta, Gigabyte, Supermicro, and Foxconn Industrial Internet have become major participants in the supply chains for AI servers and AI PCs.

In terms of components, AI servers are notable for having a power supply requirement that is 2-3 times greater than that of general-purpose servers. The power supply units used in AI servers are also required to offer specification and performance upgrades. Turning to AI PCs, they also have higher demands for both computing power and energy consumption. Therefore, advances in the technologies related to power supply units represent a significant indicator this year with respect to the overall development of AI servers and AI PCs. Companies including Delta Electronics, LITE-ON, AcBel Polytech, CWT, and Chicony are expected to make important contributions to the upgrading and provisioning of power supply units.

Also, as computing power increases, heat dissipation has become a pressing concern for hardware manufacturers looking to further enhance their products. The advancements in heat dissipation made by solution providers such as Sunon, Auras, AVC, and FCN during this year will be particularly noteworthy.

Besides the aforementioned companies, Taiwan is also home to numerous suppliers for other key components related to AI PCs. The table below lists notable component providers operating on the island.

With the advent of generative AI, the technology sector is poised for a boom across its various domains. From AI PCs to AI smartphones and a wide range of smart devices, this year’s market for electronics-related technologies is characterized by diversity and innovation. Taiwan’s supply chain plays a vital role in the development of AI PCs and AI servers, including chips, components, and entire computing systems. As competition intensifies in the realm of LLMs and AI chips, this entire market is expected to encounter more challenges and opportunities.

Join the AI grand event at Computex 2024, alongside CEOs from AMD, Intel, Qualcomm, and ARM. Discover more about this expo! https://bit.ly/44Gm0pK

(Photo credit: Qualcomm)

2024-05-07

[News] South Korea Reportedly Develops AI Chips for Autonomous Vehicles, Challenging NVIDIA

Recently, a report from South Korean media outlet BusinessKorea has indicated that the South Korean government is actively advancing new research and development (R&D) projects, including the development of AI chips for autonomous vehicles, with the aim of surpassing the American semiconductor giant NVIDIA.

The report stated that on May 2nd, the South Korean Ministry of Trade, Industry, and Energy announced that the “Second Strategic Planning and Investment Council,” comprising representatives from research institutes, universities, and other organizations, approved 62 new R&D projects for 2025, including flagship projects and roadmaps in over 11 domains.

The council prioritizes investments in high-end strategic industries to achieve technological sovereignty and breakthrough growth, while also increasing funding for innovative research that carries a risk of failure. It is ending subsidies to individual companies and focusing instead on investments in core technologies shared across industries, such as artificial intelligence and compliance with global environmental regulations.

Following this investment strategy, the review council has selected 62 projects. Among them, 12 flagship projects are designed to be world-first and best-in-class, aiming to seize the opportunity of next-generation technologies.

In line with this, the review council plans to develop a universal, open next-generation artificial intelligence chip for Software-Defined Vehicles (SDVs), with a processing speed of up to 10 trillion operations per second (TOPS).

Currently, NVIDIA is advancing the development and commercialization of its next-generation autonomous driving chip rated at 1,000 TOPS. Meanwhile, South Korea is developing autonomous driving chips with performance ranging from tens to 300 TOPS.

The Ministry’s goal is to develop the world’s first commercially viable high-speed autonomous driving vehicle network system and a core semiconductor with a processing speed of 10 gigabits per second (Gbps), enabling full Level 4 and above autonomous driving.


(Photo credit: Pixabay)

Please note that this article cites information from BusinessKorea.

2024-05-06

[News] TSMC’s Advanced Packaging Capacity Fully Booked by NVIDIA and AMD Through Next Year

With AI applications flourishing, the two major AI giants, NVIDIA and AMD, are fully committed to the high-performance computing (HPC) market. According to the Economic Daily News, they have secured TSMC’s advanced CoWoS and SoIC packaging capacity through this year and the next, bolstering TSMC’s AI-related business orders.

TSMC holds a highly positive outlook on the momentum brought by AI-related applications. During the April earnings call, CEO C.C. Wei extended the company’s visibility into AI orders and their revenue contribution from the original expectation of 2027 out to 2028.

TSMC anticipates that revenue contribution from server AI processors will more than double this year, accounting for a low-teens percentage of the company’s total revenue in 2024. It also expects a 50% compound annual growth rate for server AI processors over the next five years, with these processors projected to contribute over 20% to TSMC’s revenue by 2028.
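TSMC's stated trajectory can be sanity-checked with a little compounding. The sketch below takes "low-teens" as 13% and the target share as 20%; both specific values are illustrative assumptions of mine, not figures from the earnings call.

```python
# Illustrative compounding check of TSMC's AI revenue projections.
# Assumptions (not from the article): "low-teens" share taken as 13%,
# 2028 target share taken as exactly 20%.
years = 4                                    # 2024 -> 2028
ai_cagr = 0.50                               # stated 50% CAGR for server AI processors
ai_multiple = (1 + ai_cagr) ** years         # AI revenue growth over 4 years, ≈ 5.06x

share_2024 = 0.13                            # assumed "low-teens" share
share_2028 = 0.20                            # assumed target share

# Implied growth of TSMC's *total* revenue for the share to land at 20%:
total_multiple = share_2024 * ai_multiple / share_2028   # ≈ 3.29x
total_cagr = total_multiple ** (1 / years) - 1           # ≈ 35% per year
print(f"AI revenue multiple: {ai_multiple:.2f}x, implied total-revenue CAGR: {total_cagr:.0%}")
```

Under these assumptions, AI processor revenue would grow roughly fivefold by 2028; for that to end up at only a bit over 20% of the total, overall revenue would itself have to grow substantially over the same period.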

Industry sources cited in the same Economic Daily News report indicate that strong AI demand has led to fierce competition among the four global cloud service giants (Amazon AWS, Microsoft, Google, and Meta) to bolster their AI server arsenals. This has resulted in a supply shortage of AI chips from major manufacturers like NVIDIA and AMD.

Consequently, these companies have heavily invested in TSMC’s advanced process and packaging capabilities to meet the substantial order demands from cloud service providers. TSMC’s advanced packaging capacity, including CoWoS and SoIC, for 2024 and 2025 has been fully booked.

To address the massive demand from customers, TSMC is actively expanding its advanced packaging capacity. Industry sources cited by the report estimate that by the end of this year, TSMC’s monthly CoWoS capacity could reach between 45,000 and 50,000 units, a significant increase from the 15,000 units in 2023. By the end of 2025, monthly CoWoS capacity is expected to reach a new peak of 50,000 units.

Regarding SoIC, it is anticipated that the monthly capacity by the end of this year could reach five to six thousand units, representing a multiple-fold increase from the 2,000 units at the end of 2023. Furthermore, by the end of 2025, the monthly capacity is expected to surge to a scale of 10,000 units.
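Taking the midpoints of the cited ranges, the implied capacity ramp works out roughly as follows; the midpoint values are my reading of the estimates, not figures stated in the report.

```python
# Rough capacity ramp implied by the cited estimates (midpoints are assumptions).
# CoWoS: 15k/month (end 2023) -> 45k-50k/month (end 2024) -> ~50k/month (end 2025)
cowos_2023, cowos_2024 = 15_000, 47_500   # 47.5k = midpoint of the 45k-50k range
# SoIC:  2k/month (end 2023) -> 5k-6k/month (end 2024) -> ~10k/month (end 2025)
soic_2023, soic_2024 = 2_000, 5_500       # 5.5k = midpoint of the 5k-6k range

print(f"CoWoS ramp 2023->2024: {cowos_2024 / cowos_2023:.1f}x")   # ≈ 3.2x
print(f"SoIC ramp 2023->2024:  {soic_2024 / soic_2023:.2f}x")     # = 2.75x
```

Both packaging lines roughly triple within a year under these readings, which is consistent with the report's characterization of a "multiple-fold increase."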

It is understood that NVIDIA’s mainstay H100 chip, currently in mass production, is built on TSMC’s 4-nanometer process with CoWoS advanced packaging, and integrates SK Hynix’s High Bandwidth Memory (HBM) in a 2.5D package.

As for NVIDIA’s next-generation Blackwell architecture AI chips, including the B100, B200, and the GB200 with Grace CPU, although they also utilize TSMC’s 4-nanometer process, they are produced using an enhanced version known as N4P. The production for the B100, per a previous report from TechNews, is slated for the fourth quarter of this year, with mass production expected in the first half of next year.

Additionally, the Blackwell chips are equipped with higher-capacity HBM3e high-bandwidth memory with updated specifications. Consequently, their computational capabilities will see a multiple-fold increase over the H100 series.

On the other hand, AMD’s MI300 series AI accelerators are manufactured using TSMC’s 5-nanometer and 6-nanometer processes. Unlike NVIDIA, AMD adopts TSMC’s SoIC advanced packaging to vertically integrate CPU and GPU dies before employing CoWoS advanced packaging with HBM. Hence, the production process involves an additional step of advanced packaging complexity with the SoIC process.


(Photo credit: TSMC)

Please note that this article cites information from Economic Daily News and TechNews.

2024-05-03

[News] NVIDIA Reportedly Fueling Samsung and SK Hynix Competition, Impacting HBM Pricing?

According to South Korean media outlet BusinessKorea’s report on May 2nd, NVIDIA is reported to be fueling competition between Samsung Electronics and SK Hynix, possibly in an attempt to lower the prices of High Bandwidth Memory (HBM).

The May 2nd report cited sources indicating that prices of the third-generation HBM3 DRAM have soared more than fivefold since 2023. For NVIDIA, such a significant increase in the price of HBM, a critical component, is bound to affect research and development costs.

The BusinessKorea report thus accused NVIDIA of intentionally leaking information to pit current and potential suppliers against each other, aiming to lower HBM prices. On April 25th, SK Group Chairman Chey Tae-won traveled to Silicon Valley to meet with NVIDIA CEO Jensen Huang, a visit potentially related to these strategies.

Although NVIDIA has been testing Samsung’s industry-leading 12-layer stacked HBM3e for over a month, it has yet to indicate a willingness to collaborate. Sources cited by BusinessKorea suggest this is a strategic move aimed at motivating Samsung Electronics, which only recently announced that it will commence mass production of 12-layer stacked HBM3e in the second quarter.

SK Hynix CEO Kwak Noh-Jung announced on May 2nd that the company’s HBM capacity for 2024 has already sold out, and that 2025 capacity is also nearly sold out. He mentioned that samples of the 12-layer stacked HBM3e will be sent out in May, with mass production expected to begin in the third quarter.

Kwak Noh-Jung further pointed out that although AI is currently primarily centered around data centers, it is expected to rapidly expand to on-device AI applications in smartphones, PCs, cars, and other end devices in the future. Consequently, the demand for memory specialized for AI, characterized by “ultra-fast, high-capacity and low-power,” is expected to skyrocket.

Kwak Noh-Jung also noted that SK Hynix possesses industry-leading technological capabilities across product areas such as HBM, TSV-based high-capacity DRAM, and high-performance eSSD. Going forward, SK Hynix aims to provide globally top-tier memory solutions tailored to customers’ needs through strategic partnerships with global collaborators.


(Photo credit: SK Hynix)

Please note that this article cites information from BusinessKorea.

2024-04-30

[News] Luxshare Reportedly Enters NVIDIA’s Supply Chain, Eyeing AI Chip Business

According to a report from Economic Daily News, Luxshare, a crucial player in the Chinese Apple supply chain, is said to be entering NVIDIA’s supply chain for the GB200, as it has announced the development of various components tailored for NVIDIA’s GB200 AI servers.

These components encompass connectors, power-related items, and cooling products. Sources cited in the same report note that Luxshare’s focus areas align closely with Taiwanese strengths, setting the stage for another direct showdown with Taiwanese manufacturers.

Luxshare, previously not prominent in the server domain, has now reportedly moved into NVIDIA’s top-tier AI products, attracting market attention, especially given its earlier swift entry into the iPhone supply chain, where it aggressively competed for orders with Taiwanese Apple suppliers.

As per the same report, Luxshare has revealed in its investor conference records that it has developed solutions corresponding to the NVIDIA GB200 AI server architecture, including products for electrical connection, optical connection, power management, and cooling. The company reportedly expects to offer solutions priced at approximately CNY 2.09 million and anticipates that the total market will reach hundreds of billions of CNY.

If Luxshare adopts a similar strategy of leveraging its latecomer advantage in entering the NVIDIA AI supply chain, it will undoubtedly encounter intense competition.

Industry sources cited by the report also point out that Luxshare’s claim to supply components for NVIDIA’s GB200 is in areas where Taiwanese suppliers excel.

For instance, while connectors are Luxshare’s core business, Taiwanese firms like JPC Connectivity and Lintes Tech also supply connectors for NVIDIA’s GB200 AI servers and are poised to compete directly with Luxshare in the future.

In terms of power supply, Delta Electronics leverages its expertise in integrating power, cooling, and passive components to provide a comprehensive range of AI power integration solutions, from the grid to the chip. They cater to orders for power supplies for NVIDIA’s Blackwell architecture series B100, B200, and GB200 servers, and will also compete with Luxshare in the future.

When it comes to thermal management, Asia Vital Components and Auras Technology are currently the anticipated players in the market, and they are also poised to compete with Luxshare.


(Photo credit: Luxshare)

Please note that this article cites information from Economic Daily News.

