News
AMD benefited from AI demand last quarter (January to March), posting revenue of USD 5.47 billion, surpassing Wall Street expectations and returning to profitability after a loss in the same period last year. However, its guidance for the current quarter fell slightly short of market expectations.
AMD achieved a net profit of USD 120 million last quarter, with an adjusted EPS of USD 0.62, surpassing Wall Street’s expected USD 0.61. AMD expects revenue for this quarter to be between USD 5.4 billion and USD 6 billion, with a midpoint of USD 5.7 billion, a 6% increase from the same period last year but slightly below Wall Street’s expected USD 5.73 billion.
After enduring a downturn in the semiconductor industry, AMD finally returned to profitability last quarter, largely due to strong sales of its MI300 series AI chips, which drove revenue in the data center division to grow by 80% year-on-year to USD 2.3 billion.
As per a report from the Wall Street Journal, AMD CEO Lisa Su stated that sales of the latest MI300X chip have surpassed USD 1 billion since its launch at the end of last year, with major customers including Microsoft, Meta, and Oracle, among other tech giants.
In January, Lisa Su forecast that AMD’s AI chip revenue for this year could reach USD 3.5 billion, a figure recently revised upward to USD 4 billion. The AMD MI300 series chips are seen as direct competitors to NVIDIA’s H100 chips. However, NVIDIA announced its new-generation AI chip architecture, Blackwell, in March this year, forcing AMD to accelerate its pace. Su stated that AMD is already developing its next generation of AI chips.
AMD’s client division, which sells PC chips, has also benefited from the AI wave, with revenue increasing by 85% year-on-year to USD 1.4 billion last quarter, further evidence of the global PC market’s recovery and growth. AMD’s PC chips can execute AI computations locally, targeting the growing demand for AI-enabled PCs.
Regarding the applications of AI PCs, Su previously stated in an interview with Sina that she found communication, productivity, and creativity particularly exciting. Many applications are still in their early stages, but she expects to see more developments in the coming years.
However, AMD’s businesses outside of AI chips are facing increasing challenges. Revenue from the gaming console chip division declined by 48% year-on-year to USD 920 million last quarter, falling short of Wall Street’s expectations of USD 970 million. Additionally, the revenue from the embedded chip division, established after AMD’s acquisition of Xilinx in 2022, also decreased by 46% year-on-year to USD 850 million last quarter, similarly below Wall Street’s expectations of USD 940 million.
TrendForce previously noted in a press release that the AI PC market is propelled by two key drivers. The first is demand for terminal applications, largely shaped by Microsoft through its Windows OS and Office suite; Microsoft is poised to integrate Copilot into the next generation of Windows, making Copilot a fundamental requirement for AI PCs.
The second is Intel, which, as a leading CPU manufacturer, is advocating for AI PCs that combine CPU, GPU, and NPU architectures to enable a variety of terminal AI applications.
(Photo credit: AMD)
News
According to a report from Korean media outlet viva100, Samsung has signed a new USD 3 billion agreement with processor giant AMD to supply HBM3e 12-layer DRAM for use in the Instinct MI350 series AI chips. Reportedly, Samsung has also agreed to purchase AMD GPUs in exchange for HBM products, although details regarding the specific products and quantities involved remain unclear.
Earlier market reports indicated that AMD plans to launch the Instinct MI350 series in the second half of the year as an upgraded version of the Instinct MI300 series. The MI350 series is reportedly expected to adopt TSMC’s 4-nanometer process, delivering improved computational performance with lower power consumption. The inclusion of 12-layer stacked HBM3e memory will enhance both bandwidth and capacity.
At Samsung Memory Tech Day in October 2023, Samsung announced a new HBM3e product codenamed “Shinebolt.” In February of this year, Samsung unveiled the industry’s first HBM3e 12H DRAM, featuring 12 layers and a capacity of 36GB, the highest-bandwidth and highest-capacity HBM product to date. Samsung has provided samples and plans to commence mass production in the second half of the year.
Samsung’s HBM3e 12H DRAM offers up to 1280GB/s of bandwidth and 36GB of capacity, a 50% increase over the previous generation of eight-layer stacked memory. Advanced Thermal Compression Non-Conductive Film (TC NCF) technology enables the 12-layer stack to meet HBM packaging requirements while keeping the same package height as eight-layer chips.
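As a rough sanity check of these figures, the sketch below recomputes the quoted per-stack bandwidth and capacity from per-pin speed and per-die density. The 1024-bit stack interface, 24Gb (3GB) die density, and 10Gb/s pin speed are assumptions inferred from the numbers above, not details given in the report.

```python
# Back-of-envelope check of the HBM3e 12H figures quoted above.
# Assumed (not stated in the article): a 1024-bit interface per HBM stack,
# 24Gb (3GB) per DRAM die, and a 10Gb/s per-pin data rate.

BUS_WIDTH_BITS = 1024        # typical per-stack HBM interface width
PER_DIE_CAPACITY_GB = 3      # assumed 24Gb die
PIN_SPEED_GBPS = 10          # implied by the quoted 1280GB/s figure

def stack_bandwidth_gb_s(pin_speed_gbps: float, bus_width_bits: int) -> float:
    """Peak per-stack bandwidth in GB/s: pin speed (Gb/s) x bus width / 8."""
    return pin_speed_gbps * bus_width_bits / 8

def stack_capacity_gb(layers: int, per_die_gb: int) -> int:
    """Stack capacity in GB: number of stacked dies x capacity per die."""
    return layers * per_die_gb

print(stack_bandwidth_gb_s(PIN_SPEED_GBPS, BUS_WIDTH_BITS))  # 1280.0 GB/s
print(stack_capacity_gb(12, PER_DIE_CAPACITY_GB))            # 36 GB (12-high)
print(stack_capacity_gb(8, PER_DIE_CAPACITY_GB))             # 24 GB (8-high)
# 36GB versus 24GB at the same die density is the 50% capacity increase.
```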
Additionally, optimizing the size of chip bumps improves HBM thermal performance, with smaller bumps located in signal transmission areas and larger bumps in heat dissipation areas, contributing to higher product yields.
Compared with eight-layer HBM3e DRAM, the 12-layer version delivers an average speed improvement of 34% in AI applications and allows the number of inference service users supported to increase by more than 11.5 times.
Meanwhile, industry sources cited in a report from TechNews have indicated that this deal is separate from negotiations between AMD and Samsung Foundry over wafer production; AMD’s plan to assign a portion of its new CPUs/GPUs to Samsung for manufacturing is unrelated to this specific transaction.
(Photo credit: Samsung)
News
Following the announcements of AI PC processor chips by NVIDIA and Intel, AMD has also entered the fray by unveiling the Ryzen Pro 8040 and Ryzen Pro 8000 series chips. According to a report from Commercial Times, they will be manufactured by TSMC and are expected to be released in the second half of the year.
On April 16th, AMD announced the Ryzen Pro 8040 series chips, designed for laptops, and the Ryzen Pro 8000 series chips, designed for desktops, billing them as the most advanced commercial PC chips it has ever created. The chips will be manufactured using 4-nanometer technology, and new AI PCs from HP and Lenovo incorporating the two chip series are expected starting in the second half of this year.
AI PCs are laptops and desktops capable of running AI applications, such as real-time language translation, locally, as opposed to most PCs on the market, which rely on cloud platforms for AI computation.
Intel announced its Core Ultra series chips designed specifically for AI PCs last year, claiming that the first batch of over 230 AI PCs globally would feature the series; collaborating partners include major PC manufacturers such as Acer, ASUS, Dell, HP, and Lenovo. Intel CEO Pat Gelsinger said at the New York launch event in 2023 that he expected the AI PC to be the standout performer in the coming year.
NVIDIA also unveiled a new generation of AI chips in January of this year, claiming they can execute AI applications directly on PCs. Collaborating partners include Acer, Dell, and Lenovo.
Similarly targeting the AI PC market, AMD announced the Ryzen 8000G series chips designed specifically for desktops in January. In an earlier interview with Chinese media outlet Sina, AMD CEO Lisa Su indicated that AI-powered PCs will play a crucial role in driving the growth of the PC market this year.
As NVIDIA, Intel, and AMD vie for opportunities in the AI market, TSMC, which possesses the world’s most advanced chip manufacturing technology, emerges as the primary beneficiary. TSMC currently manufactures AI chips for NVIDIA using a 3-nanometer process and is expected to begin mass production of the next-generation 2-nanometer process starting from next year.
(Photo credit: AMD)
News
Amid escalating tensions in the US-China tech war, reports from The Wall Street Journal and CNBC suggest that China has instructed major local telecom companies to gradually replace foreign chips by 2027, with Intel and AMD as the primary targets.
Sources cited in the same reports reveal that China’s Ministry of Industry and Information Technology (MIIT) has instructed several major local telecom operators to phase out foreign chips used in core telecommunications infrastructure by 2027, a move expected to impact both Intel and AMD. CNBC reports that Intel declined to comment, while AMD did not respond to a request for comment.
It has been reported that Chinese authorities have ordered state-owned telecom operators to inspect their networks for extensive use of non-Chinese manufactured chips and to replace them before the deadline.
In the past, China has attempted to reduce its reliance on foreign chips but has faced obstacles due to a lack of high-quality locally produced chips. However, telecom operators now have more local alternatives for procurement, suggesting that the quality of Chinese-made chips may have become more stable and reliable.
Sources cited in the same reports indicate that this move will have the most significant impact on Intel and AMD, as most of the core processors used in Chinese and global networking equipment come from these two tech giants. However, the exact extent of the impact is still unknown.
On the other hand, a previous report from the Financial Times also indicated that China implemented new regulations in December of last year requiring government agencies at the county level and above to refrain from using PCs and servers equipped with microprocessors from Intel and AMD.
(Photo credit: iStock)
News
The global laptop and PC market is gradually recovering, driven by the growing trend of AI-powered PCs (AI PCs). Consequently, as per a report from TechNews, the competition to enhance AI chip computing power has emerged as a key global focus.
One of the competitors, Intel, during its Vision 2024 event, showcased its next-generation laptop chip, Lunar Lake. Intel CEO Pat Gelsinger stated that this chip will deliver over 100 TOPS (trillion operations per second) of AI performance, with the NPU alone contributing 45 TOPS. This marks a threefold increase in AI performance compared to Intel’s current generation of chips and meets the 45 TOPS NPU performance threshold previously set by Intel for the next generation of AI PCs.
Currently, Intel’s Meteor Lake processor NPU can only deliver 10 TOPS, which falls short of the standard required for the next generation of AI PCs. However, the NPU performance of Lunar Lake precisely meets the 45 TOPS standard.
Gelsinger did not provide a detailed breakdown of the remaining 55+ TOPS between the CPU and GPU, but it can reasonably be speculated that the GPU contributes around 50 TOPS, while the CPU cores contribute 5-10 TOPS.
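To make that back-of-envelope split explicit, the short sketch below allocates the claimed platform total across the NPU, GPU, and CPU. Only the 45 TOPS NPU figure has been confirmed by Intel; the GPU and CPU shares are the illustrative assumptions described above.

```python
# Rough allocation of Intel's "over 100 TOPS" Lunar Lake claim.
# Only the 45 TOPS NPU figure is confirmed; the GPU/CPU split is speculative.

total_claimed_tops = 100     # platform-level claim ("over 100 TOPS")
npu_tops = 45                # confirmed NPU contribution

remaining_tops = total_claimed_tops - npu_tops         # 55 TOPS for GPU + CPU
assumed_gpu_tops = 50                                   # speculated GPU share
residual_cpu_tops = remaining_tops - assumed_gpu_tops   # ~5 TOPS left for CPU

print(f"NPU: {npu_tops}, GPU (assumed): {assumed_gpu_tops}, "
      f"CPU (residual): {residual_cpu_tops}")           # NPU: 45, GPU: 50, CPU: 5
```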
As for Intel’s competitors, AMD’s current-generation Ryzen Hawk Point platform offers NPU performance of 16 TOPS, which is also below Intel’s envisioned standard for the next generation of AI PCs.
However, AMD has recently indicated that its next-generation products will make significant breakthroughs to meet the demands of AI computing, incorporating a robust architecture with powerful CPU, GPU, and NPU components, a design philosophy AMD has maintained from the Ryzen 7040 series to the current 8040 series.
At an AI event in December last year, AMD unveiled the next-generation Ryzen Strix Point mobile processor featuring the XDNA 2 architecture, boasting a threefold increase in AI performance compared to the previous generation.
Yet AMD has not provided a detailed performance allocation for each component. Nonetheless, a simple calculation suggests that if NPU performance triples, the Ryzen Strix Point NPU would reach 48 TOPS.
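That calculation is spelled out below; note that the threefold figure is AMD’s overall AI-performance claim for XDNA 2, so applying it directly to the NPU alone is an assumption rather than a disclosed specification.

```python
# The "simple calculation" referenced above: if XDNA 2 triples NPU throughput
# relative to Hawk Point's 16 TOPS NPU, Strix Point would land at 48 TOPS.
# Treating the 3x overall AI-performance claim as an NPU-only multiplier is
# an assumption, not a figure AMD has disclosed.

hawk_point_npu_tops = 16
claimed_uplift = 3

strix_point_npu_tops = hawk_point_npu_tops * claimed_uplift
print(strix_point_npu_tops)        # 48
print(strix_point_npu_tops >= 45)  # True: would clear the 45 TOPS threshold
```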
Qualcomm’s Snapdragon X Elite platform represents another contender in the escalating competition, with its ARM-based chips scheduled to launch in mid-2024. Qualcomm has stated that its NPU performance will reach 45 TOPS, further heightening the competition among Intel, AMD, and Qualcomm for dominance in the next generation of AI computing.
(Photo credit: Intel)