News
On September 9, Indian tech blog PiunikaWeb cited a report from Tech & Leaks Zone stating that rumors point to Google preparing to move away from Samsung Electronics’ foundry arm, Samsung Foundry, and switch to TSMC in 2025. The next two generations of Google’s custom Tensor processors are reportedly expected to use TSMC’s 3nm and 2nm processes, respectively.
As per the same report, Google’s Tensor G4 processor is being manufactured by Samsung Foundry on its 4nm process. However, the G4 offers only a slight upgrade over the Tensor G3 in the Pixel 8 smartphone, as it continues to use Samsung’s older FO-PLP packaging technology instead of the newer FO-WLP packaging, which is more effective at preventing overheating.
On the other hand, Google’s Tensor G5, which will be used in the Pixel 10, is reportedly set to be manufactured by TSMC on the latest 3nm process with TSMC’s advanced InFO-PoP packaging technology. The Tensor G6, which will power the Pixel 11 series, will also be produced by TSMC, using its 2nm process.
Notably, Apple published an AI technical document in June disclosing that two AI models supporting “Apple Intelligence” were trained in the cloud using Google’s custom-designed Tensor Processing Units (TPUs).
Per Google’s official website, the cost of using its most advanced TPU can be less than USD 2 per hour if reserved three years in advance. Google first introduced the TPU in 2015 for internal use, and it became available to the public in 2017.
Additionally, per a report from Wccftech, Google’s Arm-based “Axion” processor, designed specifically for data centers, is also rumored to be manufactured using TSMC’s enhanced 3nm process, N3E.
(Photo credit: Google)
Insights
According to TrendForce’s latest memory spot price trend report, neither DRAM nor NAND Flash spot prices show much momentum. DRAM spot prices are still falling as Samsung increases the volume of reball DDR5 (D1Y) chips sourced from decommissioned modules, while the NAND Flash spot market remains sluggish this week, with spot traders continuing to lower their quotations. Details are as follows:
DRAM Spot Price:
In the spot market, prices are still falling as Samsung increases the volume of reball DDR5 (D1Y) chips sourced from decommissioned modules. Since these reball chips are second-hand, low-cost products, Samsung can still sell them profitably, but the practice is driving a continuous decline in spot prices and weighing on confidence and buyer sentiment across the entire DRAM market. Most buyers are now cautious and unwilling to actively stock up. The average spot price of mainstream chips (i.e., DDR4 1Gx8 2666MT/s) decreased by 0.10% from US$1.972 last week to US$1.970 this week.
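For reference, the quoted weekly change follows directly from the two averages (a quick sanity check rather than additional data from the report):

\[
\frac{1.970 - 1.972}{1.972} \approx -0.10\%
\]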
NAND Flash Spot Price:
The NAND Flash spot market remains sluggish this week, with spot traders continuing to lower their quotations to relieve inventory pressure. Overall transactions remain lackluster, showing no apparent signs of recovery amid weak end demand. The spot price of 512Gb TLC wafers dropped by 3.45% this week, arriving at US$3.075.
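Working backward from the reported 3.45% decline and the US$3.075 closing figure, the implied prior-week price would be roughly US$3.18, though the report does not state that figure explicitly:

\[
\frac{3.075}{1 - 0.0345} \approx 3.185
\]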
News
After its 8-Hi HBM3e entered mass production in February, Micron officially introduced its 12-Hi HBM3e memory stacks on Monday, which feature a 36 GB capacity, according to a report by Tom’s Hardware. The new products are designed for cutting-edge processors used in AI and high-performance computing (HPC) workloads, including NVIDIA’s H200 and B100/B200 GPUs.
It is worth noting that the achievement puts the US memory chip giant almost on par with the current HBM leader, SK hynix. Citing remarks by Justin Kim, president and head of the company’s AI Infra division, at SEMICON Taiwan last week, another report by Reuters notes that SK hynix is set to begin mass production of its 12-Hi HBM3e chips by the end of this month.
Samsung, on the other hand, is said to have passed NVIDIA’s quality tests for shipments of its 8-Hi HBM3e memory, while the company is still working on the verification of its 12-Hi HBM3e.
Micron’s 12-Hi HBM3e memory stacks, according to Tom’s Hardware, feature a 36GB capacity, a 50% increase over the previous 8-Hi models, which had 24GB. This expanded capacity enables data centers to handle larger AI models, such as Meta AI’s Llama 2, with up to 70 billion parameters on a single processor. In addition, this capability reduces the need for frequent CPU offloading and minimizes communication delays between GPUs, resulting in faster data processing.
According to Tom’s Hardware, in terms of performance, Micron’s 12-Hi HBM3e stacks deliver over 1.2 TB/s of memory bandwidth. Despite offering 50% more memory capacity, Micron’s 12-Hi HBM3e reportedly consumes less power than competing 8-Hi HBM3e stacks.
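As a rough cross-check, the quoted per-stack bandwidth is consistent with the standard 1,024-bit HBM interface running at a per-pin data rate of a little over 9.2 Gb/s; both figures are assumptions here rather than numbers given in the report:

\[
\frac{1024 \times 9.2\ \text{Gb/s}}{8} \approx 1{,}178\ \text{GB/s} \approx 1.2\ \text{TB/s}
\]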
Regarding the future roadmap of HBM, Micron is said to be working on its next-generation memory solutions, including HBM4 and HBM4e. These upcoming memory technologies are set to further enhance performance, solidifying Micron’s position as a leader in addressing the increasing demand for advanced memory in AI processors, such as NVIDIA’s GPUs built on the Blackwell and Rubin architectures, the report states.
(Photo credit: Micron)
News
According to a report from Korean media ZDNet Korea, Chinese memory manufacturers like CXMT (Changxin Memory Technologies) are aggressively expanding production, which could negatively affect profitability in the traditional DRAM market. Both Samsung and SK hynix are said to be closely monitoring these developments.
Established in 2016, CXMT has become China’s largest DRAM producer with government backing, focusing on traditional DRAM and preparing to enter the HBM market.
Reportedly, CXMT has rapidly increased its DRAM production capacity, from 70,000 wafers per month in 2022 to 120,000 in 2023, and is projected to reach 200,000 wafers per month this year.
CXMT’s main products include 17nm and 18nm DDR4 and LPDDR4, and the company is also developing 12nm DDR5 and LPDDR5X as its latest offerings. Its aggressive DRAM expansion could negatively impact sales and profits for Korean memory manufacturers.
According to TrendForce’s data, the spot price of 16Gb DDR4 increased from $3 in the second half of 2023 to $3.50 in the first half of this year, before falling back to $3.30 in the second half of 2024.
For DDR5, prices have increased from $4.20 in October 2023 to over $4.50 in the first half of this year, approaching $5 in the second half.
By the end of August, the price premium of DDR5 over DDR4 had surged to 53.9%, up significantly from 36.9% six months earlier.
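The premium is simply the DDR5 spot price expressed relative to the DDR4 spot price. Plugging in the approximate figures cited above (DDR5 approaching US$5, DDR4 around US$3.30) lands in the same ballpark as the reported 53.9%; the exact end-of-August quotes behind that figure are not given in the report:

\[
\frac{5.00 - 3.30}{3.30} \approx 51.5\%
\]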
Per a recent report from Nomura Securities cited by ZDNet Korea, the rapid expansion of Chinese companies is expected to negatively impact the memory industry’s profitability, necessitating preparations for potential disruptions. CXMT’s production now accounts for about 5% of the market, potentially influencing prices.
(Photo credit: CXMT)
News
No eternal allies, no perpetual enemies. The old adage rings especially true in the semiconductor industry, as the world’s top foundry, TSMC, has announced a collaboration with its rival Samsung, the second-largest foundry globally, on the development of HBM4, according to reports by Korean media outlets the Korea Economic Daily and Business Korea. According to analysts cited by the Korea Economic Daily, it would mark the two companies’ first partnership in the AI chip sector.
Citing remarks by Lee Jung-bae, head of Samsung Electronics’ Memory Business Division, at SEMICON Taiwan, the reports note that Samsung is preparing more than 20 customized HBM solutions in collaboration with various foundry partners. However, Lee declined to say which specific foundry the company was partnering with.
The answer was revealed on September 5, when Dan Kochpatcharin, Head of Ecosystem and Alliance Management at TSMC, confirmed that the two companies are working together on developing a buffer-less HBM4 chip.
According to Business Korea, buffer-less HBM eliminates the buffer used to prevent electrical issues and manage voltage distribution, a design Samsung aims to introduce with HBM4. The innovation is expected to improve power efficiency by 40% and reduce latency by 10% compared with existing models.
The reports note that Samsung’s main reason for teaming up with TSMC is to incorporate customized features requested by major clients such as NVIDIA and Google.
Although Samsung can offer a full range of HBM4 services, including memory production, foundry, and advanced packaging, the company aims to utilize TSMC’s technology to attract more clients, according to sources cited by the reports.
The manufacturing process for HBM4 differs from previous generations, as the logic die, the component that functions as the brain of an HBM chip, may now be produced by foundry companies rather than memory manufacturers.
Earlier in April, SK hynix, the current HBM leader and Samsung’s biggest memory rival, announced a partnership with TSMC on HBM4 development and next-generation packaging technology.
Though months behind SK hynix and Micron, Samsung has reportedly begun shipping its 8-layer HBM3e to NVIDIA. The company aims to gain a competitive edge over its rivals with HBM4, which it targets to bring into mass production by late 2025.
(Photo credit: Samsung)