News
This year, increasing demand for ChatGPT, along with ongoing innovations in PC and server technologies, has driven a growing market preference for high-value DRAM chips such as HBM and DDR5. Memory giants are actively positioning themselves to produce these products.
DDR5: Micron Unveils New Products, Samsung Plans Line Expansion
The current DDR5 process has advanced to the 1β node. In October, Micron announced DDR5 memory based on its 1β technology, boasting speeds of up to 7200 MT/s. The product is now shipping to customers across the data center and PC markets.
Micron also recently introduced a 128GB DDR5 RDIMM built on 32Gb dies. With speeds of up to 8000 MT/s, it is suitable for servers and workstations. The module likewise employs Micron’s 1β technology and achieves a 24% improvement in energy efficiency and a 16% reduction in latency. Micron plans to launch 4800 MT/s, 5600 MT/s, and 6400 MT/s models in 2024, with an 8000 MT/s model to follow.
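As a rough illustration of how the stated die density maps to the module capacity, the back-of-the-envelope sketch below uses only the figures quoted above; it counts raw data capacity and ignores ECC dies and rank organization, which are not detailed here.

```python
# Back-of-the-envelope check: how many 32Gb DRAM dies a 128GB module implies.
# Counts raw data capacity only; ECC dies and rank layout are ignored.
module_capacity_gbytes = 128   # 128GB DDR5 RDIMM
die_density_gbits = 32         # 32Gb per DRAM die

module_capacity_gbits = module_capacity_gbytes * 8        # 1 byte = 8 bits
dies_needed = module_capacity_gbits / die_density_gbits

print(f"{module_capacity_gbits} Gb total -> {dies_needed:.0f} x {die_density_gbits}Gb dies")
# Prints: 1024 Gb total -> 32 x 32Gb dies
```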
On the other hand, memory giant Samsung is committed to increasing DDR5 production capacity. Reports suggest that Samsung is planning to expand the production of high-value DRAM, investing in the infrastructure for advanced DRAM and increasing R&D spending to solidify its long-term market dominance.
According to a report from KED Global News, Samsung is internally considering expanding its DDR5 production lines. Given the high value of DDR5 and its adoption in the PC and server markets, this year is widely regarded as the “year of widespread DDR5 adoption.”
HBM: Expansion Trend Begins, Significant Revenue Growth Expected
Amid the AI boom, HBM continues to gain popularity, with demand outpacing supply. To meet this demand, memory giants are actively expanding production.
Recent reports indicate that companies like Samsung are planning to increase HBM production by 2.5 times. Additionally, in early November, it was reported that Samsung, to expand HBM capacity, acquired certain buildings and equipment within the Samsung Display Cheonan Factory. Samsung plans to establish a new packaging line at Cheonan for large-scale HBM production, having spent 10.5 billion Korean won on the acquisition and planning additional investments ranging from 700 billion to 1 trillion Korean won.
Micron, for its part, announced the official opening of its Taichung fab in Taiwan on November 6th. The facility will integrate advanced probe and 3D packaging test capabilities to produce HBM3E and other products, meeting growing demand across applications such as AI, data centers, edge computing, and the cloud.
TrendForce indicates that HBM, a memory embedded in high-end AI chips, is primarily supplied by three major vendors: Samsung, SK Hynix, and Micron. With the AI trend driving demand for AI chips, demand for HBM is also expected to increase in 2023 and 2024, prompting manufacturers to ramp up HBM production.
Looking ahead to 2024, the supply of HBM is expected to improve significantly. In terms of specifications, as AI chips demand higher performance, the mainstream for HBM in 2024 is expected to shift to HBM3 and HBM3e. Overall, with increased demand and higher average selling prices for HBM3 and HBM3e compared to the previous generation, HBM revenue is expected to see significant growth in 2024.
(Image: Samsung)
Insights
While the memory market faced oversupply and falling prices amid weak demand in 2023, the Q4 guidance from major suppliers offers a glimmer of hope. Memory prices are gradually rising, suggesting the market may be escaping its low point. The most recent financial reports from the world’s top five memory companies substantiate this positive outlook.
The recent financial reports of Samsung, SK Hynix, Micron, Kioxia, and Western Digital reveal a slowdown in the pace of revenue decline, even though some companies are still reporting losses. Several express optimism, noting a gradual recovery in certain downstream demand.
Samsung: Anticipating Q4 Demand Recovery
Samsung Electronics’ Q3 financial report shows revenue of 67.4 trillion Korean won, a year-on-year decrease, with net profit exceeding expectations at 5.5 trillion won.
During their earnings call on October 31, Samsung highlighted the uncertainty in the recovery of the storage chip market. However, they remain optimistic about increased demand in Q4, driven by year-end promotions, new product releases from major clients, and growing demand for generative AI.
SK Hynix: Positive Signs in Market Conditions
SK Hynix’s Q3 2023 results indicate improving market conditions, driven by increased demand for high-performance memory, especially in AI-related products. DRAM and NAND Flash sales both grew, with DRAM shipments up a significant 20% QoQ, and rising average selling prices also lifted the results. In the second half of the year, customers that had drawn down inventory progressively increased their procurement, helping product prices develop steadily.
The company predicts continued improvement in the DRAM market and positive trends in NAND.
Micron: Storage Market Expected to Recover Next Year
Micron’s results for fiscal Q4 2023 show revenue of $4.01 billion, a 40% year-on-year decrease but better than market expectations. The DRAM business accounted for 69% of revenue at $2.8 billion, with bit shipments up but average selling prices down; NAND Flash revenue was $1.2 billion, likewise with higher bit shipments and a lower ASP.
Micron expects fiscal Q1 2024 revenue of $4.2 billion to $4.6 billion, anticipating a recovery in the storage market in 2024 and further improvement in 2025.
Kioxia: Rebound in NAND Prices
Kioxia released its financial report for July to September 2023, with revenue of 241.4 billion yen, down 3.9% QoQ and 38.3% YoY. Due to declining demand for smartphone and PC memory chips, the operating loss for the quarter was 100.8 billion yen. However, benefiting from an improved storage supply-demand balance, an optimized product portfolio, and favorable yen exchange rates, the operating loss has narrowed.
Although NAND shipments decreased, the situation improved thanks to the rebound in NAND prices: NAND bit shipments fell by approximately 13%, while NAND ASP rose by about 8%. Looking ahead to 2024, Kioxia expects NAND prices to keep rising as suppliers maintain production cuts and customer inventories normalize. The company is confident the NAND market will recover after the first half of 2024, particularly in data center and enterprise SSD demand.
Western Digital: Cloud Market Continues to Grow
Western Digital announced fiscal Q1 2024 revenue of $2.75 billion, up 3% QoQ and down 26% YoY. In end markets, growth in flash memory shipments offset the decline in flash memory prices, driving some QoQ business growth.
CEO David Goeckeler stated that Q1 performance exceeded expectations, with profit margins for the flash memory and HDD businesses continuously improving. He pointed out that the consumer and end-user markets performed well and the cloud market is expected to keep growing. As the market improves, the company’s improved cost structure positions it to increase profitability.
Storage companies are adapting to the market by reducing capital expenditures and adjusting inventory, leading to a more normalized market inventory. Simultaneously, increased demand in AI servers, high-performance computing, and automotive intelligence instills confidence in the market.
In the second half of the year, there are clear signs of improvement in the supply-demand dynamics of storage chips. Demand for smartphones and laptops, together with new product releases, is driving positive trends. Some companies are seeing strengthened customer demand, with customers even accepting price increases.
In the server sector, AI servers are boosting demand for high-bandwidth memory (HBM), and DDR5 adoption is accelerating. In automotive storage, vehicle electrification, intelligence, and connectivity are propelling in-vehicle storage demand, pointing to promising developments in that market. Other applications with requirements for high-speed storage, reliability, and data security, such as big data, cloud computing, and wearable devices, also present growth potential, benefiting storage companies.
According to TrendForce, the global NAND Flash market has experienced a comprehensive price increase in Q4, driven by suppliers’ active production reduction strategies in 2023. Q4 NAND Flash contract prices are generally up by about 8-13%.
TrendForce estimates annual supply growth of -2.8% for 2023, the first negative figure in several years. This has pushed the overall sufficiency ratio to -3.7%, laying the groundwork for stabilizing NAND Flash prices in the second half. However, the sustainability of the current upward trend remains unclear given the lack of substantial end demand.
If demand recovers as expected in the second half of 2024, particularly on the momentum of AI-related orders for server SSDs, and suppliers remain cautious in restoring capacity utilization, the overall sufficiency ratio is expected to be held at -9.4%, accelerating the return to supply-demand balance and allowing NAND Flash prices to trend upward throughout the year.
For DRAM, TrendForce predicts a seasonal increase of about 3-8% in contract prices in Q4. Whether this upward trend continues depends on suppliers maintaining their production reduction strategy and on the actual recovery of demand, particularly for general servers.
During the MTS 2024 Storage Industry Trends Seminar, TrendForce highlighted three concerns for the memory market in 2024:
(1) Despite the reduction in inventory levels, it is essential to observe whether this reduction can be sustained and effectively transferred to buyers.
(2) Anticipating a rise in production capacity, an early recovery in operational rates due to market improvements may lead to another imbalance in supply and demand.
(3) Whether demand from various end markets will recover as expected, particularly the sustainability of AI-related orders.
(Image: Samsung)
News
On November 13, NVIDIA unveiled the HGX H200 AI computing platform. Built on the Hopper architecture, it is equipped with the H200 Tensor Core GPU and high-end memory to handle the vast amounts of data generated by AI and high-performance computing.
This marks an upgrade from the previous generation H100, with a 1.4x increase in memory bandwidth and a 1.8x increase in capacity, enhancing its capabilities for processing intensive generative AI tasks.
The internal memory changes in the H200 represent a significant upgrade, as it adopts HBM3e for the first time. This results in a notable increase in GPU memory bandwidth, soaring from 3.35TB per second on the H100 to 4.8TB per second.
The total memory capacity also sees a substantial boost, rising from 80GB in H100 to 141GB. When compared to H100, these enhancements nearly double the inference speed for the Llama 2 model.
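Taking the quoted figures at face value, the headline multipliers can be reproduced directly; the short sketch below uses only the numbers cited in this article.

```python
# Reproduce the quoted H100 -> H200 memory upgrade multipliers.
h100_bandwidth_tbs, h200_bandwidth_tbs = 3.35, 4.8   # GPU memory bandwidth, TB/s
h100_capacity_gb, h200_capacity_gb = 80, 141         # GPU memory capacity, GB

print(f"Bandwidth gain: {h200_bandwidth_tbs / h100_bandwidth_tbs:.2f}x")  # ~1.43x, i.e. the quoted 1.4x
print(f"Capacity gain:  {h200_capacity_gb / h100_capacity_gb:.2f}x")      # ~1.76x, i.e. the quoted 1.8x
```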
H200 is designed to be compatible with systems that already support H100, according to NVIDIA. The company states that cloud service providers can seamlessly integrate H200 into their product portfolios without the need for any modifications.
This implies that NVIDIA’s server manufacturing partners, including ASRock, ASUS, Dell, Eviden, GIGABYTE, HPE, Ingrasys, Lenovo, Quanta Cloud, Supermicro, Wistron, and Wiwynn, have the flexibility to replace existing processors with H200.
The initial shipments of H200 are expected in the second quarter of 2024, with cloud service giants such as Amazon, Google, Microsoft, and Oracle anticipated to be among the first to adopt H200.
What is HBM?
“The integration of faster and more extensive HBM memory serves to accelerate performance across computationally demanding tasks including generative AI models and [high-performance computing] applications while optimizing GPU utilization and efficiency,” said Ian Buck, the Vice President of High-Performance Computing Products at NVIDIA.
HBM refers to stacking DRAM dies like building blocks and encapsulating them through advanced packaging. This approach increases density while maintaining or even reducing the overall volume, leading to improved storage efficiency.
TrendForce reported that the HBM market’s dominant product for 2023 is HBM2e, employed by the NVIDIA A100/A800, AMD MI200, and most CSPs’ (Cloud Service Providers) self-developed accelerator chips.
As demand for AI accelerator chips evolves, mainstream demand in 2023 is projected to shift from HBM2e toward HBM3, with the two accounting for an estimated 50% and 39% of the market, respectively.
As production of accelerator chips using HBM3 gradually ramps up, market demand in 2024 is expected to shift significantly to HBM3, directly surpassing HBM2e, with HBM3’s share estimated at around 60%.
Since manufacturers plan to introduce new HBM3e products in 2024, HBM3 and HBM3e are expected to become mainstream in the market next year.
TrendForce clarifies that the so-called HBM3 in the current market should be subdivided into two categories based on speed. One category includes HBM3 running at speeds between 5.6 to 6.4 Gbps, while the other features the 8 Gbps HBM3e, which also goes by several names including HBM3P, HBM3A, HBM3+, and HBM3 Gen2.
HBM3e will be stacked with 24Gb mono dies, and under the 8-layer (8Hi) foundation, the capacity of a single HBM3e will jump to 24GB.
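The capacity figure follows directly from the stack arithmetic, as the short sketch below shows. The per-stack bandwidth line additionally assumes the standard 1024-bit HBM stack interface, a parameter not stated in this article.

```python
# HBM3e per-stack capacity and bandwidth from the figures above.
die_density_gbits = 24    # 24Gb mono die (from the article)
stack_height = 8          # 8-high (8Hi) stack (from the article)
pin_speed_gbps = 8        # 8 Gbps HBM3e pin speed (from the article)
bus_width_bits = 1024     # standard HBM stack interface width (assumed, not stated in the article)

capacity_gbytes = die_density_gbits * stack_height / 8    # 24Gb x 8 layers = 192Gb = 24GB
bandwidth_gbytes_s = pin_speed_gbps * bus_width_bits / 8  # 8Gbps x 1024 bits = ~1TB/s per stack

print(f"Per-stack capacity:  {capacity_gbytes:.0f} GB")
print(f"Per-stack bandwidth: {bandwidth_gbytes_s:.0f} GB/s")
```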
According to TrendForce’s previous news release, the three major manufacturers currently leading the HBM competition – SK hynix, Samsung, and Micron – have made the following progress.
SK hynix and Samsung began their efforts with HBM3, which is used in NVIDIA’s H100/H800 and AMD’s MI300 series products. The two manufacturers were previously expected to sample HBM3e in Q1 2024. Micron, meanwhile, chose to skip HBM3 and develop HBM3e directly.
However, according to the latest TrendForce survey, as of the end of July this year, Micron has already provided NVIDIA with HBM3e verification, while SK hynix did so in mid-August, and Samsung in early October.
(Image: Nvidia)
News
Rumors swirl around AMD’s upcoming chip architecture, codenamed “Prometheus,” featuring the Zen 5C core. As reported by TechNews, the chip is poised to leverage both TSMC’s 3nm and Samsung’s 4nm processes simultaneously, marking a shift in the competitive landscape from process nodes, yield, and cost toward factors such as capacity, ecosystem, and geopolitics, all of which weigh on customers’ foundry decisions.
Examining yields, TSMC claims an estimated 80% yield for its 4nm process, while Samsung has surged from 50% to an impressive 75%, aligning with TSMC’s standards and raising the likelihood of chip customers returning. Speculation abounds that major players such as Qualcomm and Nvidia may reconsider their suppliers, with industry sources suggesting Samsung’s 4nm capacity is roughly half of TSMC’s.
Revegnus, a reputable source on X (formerly Twitter), unveiled information from high-level Apple meetings indicating a 63% yield for TSMC’s 3nm process, but at double the price of the 4nm process. In the 4nm realm, Samsung’s yield now mirrors TSMC’s, with Samsung recovering its yield faster than expected.
Consequently, with Samsung’s significant improvements in yield and capacity, coupled with TSMC’s decision to raise prices, major clients may explore secondary suppliers to diversify outsourcing orders, factoring in considerations such as cost and geopolitics. Recent reports suggest Samsung is in final negotiations for a 4nm collaboration with AMD, planning to shift some 4nm processor orders from TSMC to Samsung.
Beyond AMD, the Tensor G3 processor in Google’s Pixel 8 series this year adopts Samsung’s 4nm process. Samsung’s new fab in Taylor, Texas, will see its inaugural customer in Samsung’s own Galaxy smartphones, producing Exynos processors.
Furthermore, Samsung announced that U.S.-based AI solution provider Groq will entrust the company to manufacture next-generation AI chips using the 4nm process, slated to commence production in 2025, marking the first order for the new Texas plant.
Regarding TSMC’s 4nm clients, alongside longstanding partners like Apple, Nvidia, Qualcomm, MediaTek, AMD, and Intel, indications suggest Google’s Tensor G4 may transition to TSMC’s 4nm process, while Tensor G5 will be produced on TSMC’s 3nm process, ending the current collaboration with Samsung; however, TSMC’s debut as Google’s chip manufacturer is not anticipated until 2025.
Last year, rumors circulated about Tesla, the electric vehicle giant, shifting orders for the 5th generation self-driving chip, Hardware 5 (HW 5.0), to TSMC. This decision was prompted by Samsung’s lagging 4nm process yield at that time. However, with Samsung’s improved yield, industry inclination leans towards splitting orders between the two companies.
News
According to MoneyDJ’s report, Samsung Electronics, the South Korean smartphone giant, unveiled its latest foldable phones, the Galaxy Z Fold5 and Galaxy Z Flip5, in August. With a year until the next generation hits the market, speculation is arising that Samsung plans to incorporate foldable features into mid-range models. This move aims to lower the entry barrier, attract a broader customer base, and strengthen Samsung’s leading position in the foldable phone market.
TrendForce recently reported that Android smartphone brands are actively entering the foldable phone market, aiming to break through the plateau in smartphone market growth with the unique design of foldable phones. However, the widespread adoption of foldable phones faces a significant obstacle in their high pricing.
According to supply chain sources, Samsung is set to launch a mid-range foldable phone in 2024, targeting a relatively budget-friendly price range of $400 to $500 USD.
In August, Samsung launched its latest generation of foldable phones, the Galaxy Z Fold5 and Galaxy Z Flip5, maintaining a premium pricing strategy. The suggested retail prices are $1,799 USD for the Galaxy Z Fold5 and $999 USD for the Galaxy Z Flip5.
The market is eagerly anticipating Samsung’s introduction of a mid-range foldable phone. However, as of now, this remains in the speculative phase, and there’s no information available regarding its design, specifications, or other details.
Previous market rumors suggested that Samsung’s Z series of foldable phones might follow the flagship S series by introducing a “Lite Flagship” FE version. This version is expected to feature hardware downgrades to offer a more budget-friendly price, aiming to attract consumers.
According to TrendForce’s forecast, as foldable phones gain increased acceptance in the consumer market, global shipments of foldable smartphones are estimated to reach 18.3 million units in 2023. This represents a substantial 43% growth compared to 2022, although it accounts for only 1.6% of total smartphone market sales. Looking ahead to 2024, shipments are expected to grow by another 38% to 25.2 million units, with market share projected to increase to 2.2%.
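The forecast figures above are internally consistent, which the few lines below verify using only the quoted numbers (growth rates are rounded, so small discrepancies are expected).

```python
# Cross-check TrendForce's foldable-phone shipment figures.
shipments_2023_m = 18.3   # million units, 2023 estimate
growth_2023 = 0.43        # 43% YoY growth vs. 2022
growth_2024 = 0.38        # 38% growth expected in 2024

implied_2022 = shipments_2023_m / (1 + growth_2023)
implied_2024 = shipments_2023_m * (1 + growth_2024)

print(f"Implied 2022 shipments: {implied_2022:.1f}M units")  # ~12.8M
print(f"Implied 2024 shipments: {implied_2024:.1f}M units")  # ~25.3M, vs. the quoted 25.2M
```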
(Photo credit: Samsung)