Google


2024-05-27

[News] Google Reportedly Shifts Orders from Samsung to TSMC for Its Upcoming Tensor G5 in the Pixel 10 Series

Google has reportedly collaborated with TSMC on the upcoming Tensor G5 chip, slated for use in the Pixel 10 series smartphones to be released next year, according to media outlet Android Authority, which spotted the information in trade databases.

Google has been cooperating with Samsung on its self-developed Tensor processors since 2021, including the Tensor G4 used in the Pixel 9.

The US tech giant’s latest strategic move would reportedly make the Tensor G5 the first Google smartphone chip not produced by Samsung.

According to industry insiders cited in the report, despite Google’s relatively low smartphone market share, the move would underscore TSMC’s leading position in advanced nodes and is expected to foster closer collaboration between the two companies in the future.

According to market share data released by TrendForce in March, Apple ranked first in global smartphone production in 4Q 2023 with a 23.3% market share, while Samsung (15.9%) and Xiaomi (12.8%) ranked second and third, respectively. Google, on the other hand, did not make the top six.

Regarding other major smartphone players’ product roadmaps for next year, in addition to Google’s Pixel 10, Apple is also rumored to be cooperating with TSMC on the A19 Pro chip for the iPhone 17 Pro and iPhone 17 Pro Max, based on a previous report from Wccftech.

Samsung, on the other hand, is reportedly planning to use its 2nm process for the upcoming Exynos 2600 chip, which is expected to enter mass production in 2025 and be used in the Galaxy S26 series smartphones, according to a previous report by the Korean media outlet ET News.


(Photo credit: Google)

Please note that this article cites information from Android Authority and Wccftech.

2024-05-22

[News] Google to Invest an Additional EUR 1 Billion in AI Business in Reaction to CSPs’ Strong Demand

On May 20, a report by Reuters revealed that Google plans to invest an additional EUR 1 billion in its data center park in Finland, a move aimed at expanding the facility and boosting its AI business growth in Europe.

The report notes that in recent years, many data centers have been established in Nordic countries due to the cool climate, tax incentives, and ample supply of renewable energy. Finland’s wind power capacity has grown significantly over this period, up 75% to 5,677 megawatts by 2022, which has even pushed electricity prices down to negative values on particularly windy days.

Data center operators like Google have thus taken advantage of this renewable energy and have already signed long-term wind power purchase agreements in Finland.

Driven by the AI wave, cloud providers such as Microsoft, Google, Meta, and Amazon have an increasingly robust demand for AI servers and data centers.

According to a previous TrendForce forecast of global CSP demand for high-end AI servers (those equipped with NVIDIA, AMD, or other high-end ASIC chips) in 2024, the four major U.S. CSPs, Microsoft, Google, AWS, and Meta, are expected to account for 20.2%, 16.6%, 16%, and 10.8% of global demand, respectively, a combined share of 63.6% that dominates the global market.


(Photo credit: Google)

Please note that this article cites information from WeChat account DRAMeXchange and Reuters.

2024-05-15

[News] Google Unveils 6th Generation TPU, Scheduled to Launch Later This Year

At the Google I/O 2024 developer conference on Tuesday, Google unveiled its sixth-generation custom chip, the Trillium TPU, which is scheduled to hit the market later this year, according to a report by TechCrunch.

According to the information provided by Google on its website, Trillium delivers a 4.7x increase in peak compute performance per chip compared to TPU v5e. Google has also doubled the High Bandwidth Memory (HBM) capacity and bandwidth, as well as the Interchip Interconnect (ICI) bandwidth between chips.

Additionally, Trillium features the third-generation SparseCore, a dedicated accelerator for processing large embeddings, aimed at handling advanced ranking and recommendation workloads. Moreover, Trillium achieves a 67% higher energy efficiency compared to TPU v5e.

Trillium can scale up to 256 TPUs within a single high-bandwidth, low-latency pod. It also incorporates multislice technology, which allows Google to interlink thousands of chips into a supercomputer backed by a data center network capable of processing petabits of data per second.

In addition to Google, major cloud players such as AWS, Meta, and Microsoft have also moved to develop their own AI chips.

In late 2023, Microsoft unveiled two custom-designed chips: the Microsoft Azure Maia AI Accelerator, optimized for AI tasks and generative AI, and the Microsoft Azure Cobalt CPU, an Arm-based processor tailored to run general-purpose compute workloads on the Microsoft Cloud. The former is reportedly manufactured using TSMC’s 5nm process.

In May 2023, Meta also unveiled the Meta Training and Inference Accelerator (MTIA) v1, its first-generation AI inference accelerator designed in-house with Meta’s AI workloads in mind.

AWS has also jumped into the AI chip market. In November 2023, AWS released Trainium2, a chip for training AI models.


(Photo credit: Google)

Please note that this article cites information from TechNews.

2024-05-08

[News] Rise of In-House Chips: 5 Tech Giants at the Forefront

With the skyrocketing demand for AI, cloud service providers (CSPs) are hastening the development of in-house chips. Apple, making a surprising move, is actively developing a data center-grade chip codenamed “Project ACDC,” signaling its foray into the realm of AI accelerators for servers.

As per a report from The Wall Street Journal, Apple is developing an AI accelerator chip for data center servers under the project name “Project ACDC.” Sources familiar with the matter revealed that Apple is closely collaborating with TSMC, but the timing of the new chip’s release remains uncertain.

Industry sources cited in a report from Commercial Times disclosed that Apple’s AI accelerator chip will be developed using TSMC’s 3-nanometer process. Servers equipped with this chip are expected to debut next year, further enhancing the performance of Apple’s data centers and future cloud-based AI tools.

Industry sources cited in Commercial Times‘ report reveal that cloud service providers (CSPs) frequently choose TSMC’s 5 and 7-nanometer processes for their in-house chip development, capitalizing on TSMC’s mature advanced processes to enhance profit margins. Additionally, the same report also highlights that major industry players including Microsoft, AWS, Google, Meta, and Apple rely on TSMC’s advanced processes and packaging, which significantly contributes to the company’s performance.

Apple has consistently been an early adopter of TSMC’s most advanced processes, relying on their stability and technological leadership. Apple’s adoption of the 3-nanometer process and CoWoS advanced packaging next year is deemed the most reasonable solution, which will also help boost TSMC’s 3-nanometer production capacity utilization.

Generative AI models are rapidly evolving, enabling businesses and developers to address complex problems and discover new opportunities. However, large-scale models with billions or even trillions of parameters pose more stringent requirements for training, tuning, and inference.

Per Commercial Times, citing industry sources, Apple’s entry into the in-house chip arena comes as no surprise, given that giants like Google and Microsoft have long been deploying in-house chips and have successively launched iterative products.

In April, Google unveiled its next-generation AI accelerator, TPU v5p, aimed at accelerating cloud-based tasks and enhancing the efficiency of online services such as search, YouTube, Gmail, Google Maps, and Google Play Store. It also aims to improve execution efficiency by integrating cloud computing with Android devices, thereby enhancing user experience.

At the end of last year, AWS introduced two in-house chips, Graviton4 and Trainium2, to strengthen energy efficiency and computational performance to meet various innovative applications of generative AI.

Microsoft also introduced the Maia chip, designed for processing OpenAI models, Bing, GitHub Copilot, ChatGPT, and other AI services.

Meta, on the other hand, completed its second-generation in-house chip, MTIA, designed for tasks related to AI recommendation systems, such as content ranking and recommendations on Facebook and Instagram.


(Photo credit: Apple)

Please note that this article cites information from The Wall Street Journal and Commercial Times.

2024-05-06

[Insights] Big Four CSPs Continue to Shine in Q1 2024 Financial Reports, AI Returns Garnering Attention

The four major cloud service providers (CSPs), Google, Microsoft, Amazon, and Meta, successively released their financial results for the first quarter of 2024 (January to March 2024) at the end of April.

Each company achieved double-digit revenue growth, and their increased capital expenditures continue to underscore AI as the main development focus. From the previous quarter to date, the market’s attention has remained on whether these AI investments can successfully translate into revenue.

TrendForce’s Insights:

1. Strong Financial Performance of Top Four CSPs Driven by AI and Cloud Businesses

Alphabet, the parent company of Google, reported stellar financial results for the first quarter of 2024. Bolstered by growth in its search, YouTube, and cloud businesses, revenue surpassed USD 80 billion, while profit rose 57%. The company also announced its first-ever dividend payout, and with all metrics exceeding market expectations, its stock price rose further and its market capitalization surpassed USD 2 trillion for the first time.

For Google, the current development strategy revolves around its in-house LLM Gemini, aimed at strengthening its cloud services, search interaction interfaces, and dedicated hardware development.

Microsoft’s financial performance is equally impressive. This quarter, its revenue reached USD 61.9 billion, marking a year-on-year increase of 17%. Among its business segments, the Intelligent Cloud segment saw the highest growth, with a 21% increase in revenue to USD 26.7 billion. Notably, the Azure division grew a remarkable 31%, with Microsoft attributing 7 percentage points of that growth to AI demand.

In other words, the impact of AI on its performance is even more pronounced than in the previous quarter, prompting Microsoft to focus its future strategies more on the anticipated benefits from Copilot, both in software and hardware.

This quarter, Amazon surpassed USD 140 billion in revenue, a year-on-year increase of 17% that exceeded market expectations. Furthermore, its profit reached USD 10.4 billion, far exceeding the USD 3.2 billion recorded in the same period of 2023.

Double-digit growth in the advertising business and AWS (Amazon Web Services) drove this performance, with the latter particularly highlighted for its AI-related opportunities. AWS achieved a record-high operating profit margin of 37.6% this quarter, with annual revenue expected to exceed USD 100 billion and plans to invest USD 150 billion in expanding data centers.

On the other hand, Meta reported revenue of USD 36.46 billion this quarter, marking a significant year-on-year growth of 27%, the largest growth rate since 2021. Profit also doubled compared to the same period in 2023, reaching USD 12.37 billion.

Meta’s current strategy focuses on allocating resources to areas such as smart glasses and mixed reality (MR) in the short and medium term. The company continues to leverage AI to enhance the user value of the virtual world.

2. Increased Capital Expenditure to Develop AI is a Common Consensus, Yet Profitability Remains Under Market Scrutiny

The major cloud players’ financial reports show that increased capital expenditure to solidify their commitment to AI development is a continuation of last quarter’s focus.

In the first quarter of 2024, Microsoft’s capital expenditure surged by nearly 80% compared to the same period in 2023, reaching USD 14 billion. Google expects its quarterly expenditure to remain above USD 12 billion. Similarly, Meta has raised its capital expenditure guidance for 2024 to a range of USD 35 billion to USD 40 billion.

Amazon regards its USD 14 billion first-quarter expenditure as the minimum quarterly level for the year and anticipates a significant increase in capital expenditure this year, exceeding the USD 48.4 billion spent in 2023. However, how these increased investments in AI will translate into profitability remains a subject of market scrutiny.

While the major cloud players remain steadfast in their focus on AI, market expectations may have shifted. For instance, despite impressive financial reports last quarter, both Google and Microsoft saw their stock prices decline, unlike the significant gains seen this time. Those gains could partly be interpreted as reflecting expectations of short- to medium-term returns on AI investments from products and services like Gemini and Copilot.

In contrast, Meta, whose financial performance was similarly impressive, experienced a post-earnings stock drop of over 15%. This may be attributed partly to its conservative financial outlook and partly to the less-than-ideal returns so far from its focus areas of virtual wearable devices and AI value-added services.

Because Meta’s base of commercial end-user applications is relatively limited compared to the other three CSPs, its AI development efforts, such as the Llama 3 model and the value-added Meta AI virtual assistant built into its products, have not yet yielded significant benefits. Llama 3 is free and open-source, and Meta AI’s reach remains limited, so neither yet justifies the development costs.

Therefore, Meta still needs to expand its ecosystem to facilitate the promotion of its AI services, aiming to create a business model that can translate technology into tangible revenue streams.

For example, Meta recently opened up Horizon OS, the operating system of its Quest VR headsets, to brands like Lenovo and Asus, allowing them to produce their own branded VR/MR devices. The primary goal is to attract developers to enrich the content library and thereby promote industry development.


