News
In its latest financial report and guidance, released on the 20th, U.S. memory chip giant Micron outperformed analysts’ expectations for both the most recent quarter and the current quarter. CEO Sanjay Mehrotra believes product pricing will rebound next year, with the upward trend continuing into 2025, as Micron aims to return to operational growth and reach new record levels by 2025, according to The Economic Daily.
Mehrotra anticipates that memory prices will recover next year and rise further in 2025. He reiterated in a statement that 2024 will be a year of recovery for the memory industry, setting the stage for record results in 2025.
Micron expects the supply of chips for PCs, mobile devices, and other applications to approach normal levels in the first half of next year. Despite two consecutive years of declining PC shipments, Micron forecasts low to mid-single-digit percentage growth in PC shipments in 2024, along with signs of recovery in smartphone demand.
TrendForce also anticipates that the upward momentum in DRAM prices will continue into 2025.
The reason is that the DRAM market continues to benefit from the rising penetration of premium products such as HBM, DDR5, and LPDDR5, which is expected to lift overall memory prices.
At the same time, TrendForce believes that 2025 will see the emergence of more edge AI applications, such as AI on smartphones or PCs. This is expected to increase DRAM content per device, becoming the driving force behind the next wave of growth in DRAM demand.
(Image: Micron)
News
Micron Technology, the U.S. memory giant, has surpassed Wall Street expectations in its projected revenue for the current quarter (December-February). The main factor contributing to this success is the robust demand from data centers, offsetting the sluggish recovery in the PC and smartphone markets.
According to Micron’s fiscal first-quarter report (September to November 2023), released on December 20th, revenue rose from USD 4.01 billion in the same period last year to USD 4.73 billion.
Looking ahead to the current quarter (Q2), Micron anticipates revenue of USD 5.3 billion ± USD 200 million and a diluted loss per share of USD 0.28 ± USD 0.07, both better than the market consensus.
Micron CEO Sanjay Mehrotra noted that strong execution and pricing strategies contributed to Q1 financial results surpassing expectations. He further stated, ‘Demand for AI servers has been strong as data center infrastructure operators shift budgets from traditional servers to more content-rich AI servers.’
Mehrotra indicated that Micron is in the final stages of qualifying HBM3e to be used in NVIDIA’s next generation Grace Hopper GH200 and H200 platforms.
Micron now expects PC sales to grow by a low to mid-single-digit percentage in calendar 2024, following two years of double-digit declines in PC unit volumes. Smartphone unit shipments are also expected to grow modestly in 2024.
For the HBM market, TrendForce’s latest research indicates that NVIDIA plans to diversify its HBM suppliers for more robust and efficient supply chain management. Samsung’s HBM3 (24GB) is anticipated to complete verification with NVIDIA by December this year.
Regarding HBM3e progress, Micron provided its 8hi (24GB) samples to NVIDIA by the end of July, SK hynix in mid-August, and Samsung in early October.
Given the intricacy of the HBM verification process—estimated to take two quarters—TrendForce expects that some manufacturers might learn preliminary HBM3e results by the end of 2023. However, it’s generally anticipated that major manufacturers including Samsung, SK Hynix and Micron will have definite results by 1Q24. Notably, the outcomes will influence NVIDIA’s procurement decisions for 2024, as final evaluations are still underway.
(Photo credit: Micron)
News
Micron, the American memory giant, is gearing up to initiate the production of state-of-the-art “1γ” DRAM at its Hiroshima fab in Japan, starting in 2025. Concurrently, there are plans to manufacture High-Bandwidth Memory (HBM) at the same fab, tailored for the rising demand for generative AI applications.
According to a report from Nikkei Asia on December 13th, Joshua Lee, VP at Micron Memory Japan, made the announcement at SEMICON Japan 2023. Lee highlighted that the Hiroshima fab is slated to manufacture DRAM with the most advanced “1γ” process by 2025, and noted that Micron will be the first semiconductor company to introduce Extreme Ultraviolet (EUV) lithography equipment to Japan.
Lee also shared Micron’s plans to produce HBM, which is widely used in generative AI applications, at the Hiroshima fab. He stated that Japan’s strong semiconductor ecosystem will be a key driving force behind Micron’s progress, and emphasized that collaboration is pivotal for Japan to establish itself as a global leader in the semiconductor supply chain.
Earlier, in October, Japan’s Ministry of Economy, Trade and Industry (METI) announced a substantial subsidy of JPY 192 billion for Micron’s Hiroshima fab. Micron recently announced a comprehensive investment plan of JPY 500 billion in Japan over the next few years, encompassing the Hiroshima fab.
Micron has been actively developing its DRAM manufacturing operations in Japan and Taiwan. Donghui Lu, Corporate VP of Micron Taiwan, revealed in a September interview with UDN News that approximately 65% of Micron’s DRAM output originates from Taiwan. Regarding the migration to the 1β process, mass production began at Micron Japan last year, and Micron Taiwan has also commenced mass production this year. As for the more advanced 1γ process, production is expected to take place in both Taiwan and Japan by 2025.
TrendForce’s analysis has also revealed that Micron is leveraging its 1β nm technology to produce HBM3e in a bid to gain a competitive edge over Korean suppliers. Its front-end manufacturing is strategically positioned in Japan, aligning with expansion plans for 1β nm capacity.
Additionally, Micron has established a backend factory in Taiwan to meet surging HBM demands driven by the AI era. TrendForce anticipates that HBM products will substantially boost the revenue of DRAM suppliers in 2024.
(Image: Micron)
News
According to TechNews’ report, in the midst of production cutbacks by Samsung, SK Hynix, and Micron, NAND Flash wafer prices are surging.
As the traditional peak season for end-user stockpiling winds down, memory module manufacturers had hoped to secure favorable positions during the dip in demand. However, the reduced supply resulting from production cutbacks has instead intensified buying of NAND Flash, strengthening the momentum of the price rebound. Memory module manufacturers have been left with no choice but to accept the price increases imposed by memory manufacturers.
Fueled by the expectation that memory manufacturers will continue to raise prices, memory module manufacturers have kept purchasing aggressively, contributing to an upward price trend in December.
Major memory manufacturers Samsung, SK Hynix, and Micron had previously announced significant production reduction plans. Samsung began cutting NAND Flash production in the second quarter and expanded the reduction to 50% of total capacity in September, focusing mainly on products with fewer than 128 layers. This move gave industry peers confidence to raise prices.
Due to the unexpectedly substantial reduction in production by major memory manufacturers, coupled with generally low inventory levels on the client side, NAND Flash prices continue to rise.
In the latter half of this year, demand for Mobile DRAM and NAND Flash (eMMC, UFS) has been driven not only by the traditional peak season but also by the Huawei Mate 60 series and the production expansion goals of other Chinese smartphone brands. This sudden influx of demand is contributing to the price hikes in fourth-quarter contracts.
The most significant price surge in this wave is undoubtedly in NAND Flash wafer prices. According to the latest research from TrendForce, the current industry situation indicates that module manufacturers’ inventories have rapidly depleted due to increased orders from customers. This has prompted module manufacturers to turn to memory manufacturers, requesting expanded supply.
However, with memory manufacturers persisting in their production reduction strategies, the imbalance between supply and demand has led to a robust rebound in NAND Flash wafer prices in the fourth quarter. According to TrendForce’s data, the month of November alone witnessed a price increase of over 25% for NAND Flash wafers.
Industry sources reveal that, with supply limited and demand significantly increased, module manufacturers have no choice but to accept the forceful price hikes imposed by memory manufacturers. With the industry anticipating that memory manufacturers will continue to raise prices, the result is a situation where “everyone just keeps scrambling for inventory.”
Based on the current market conditions, TrendForce believes that in December, with tight supply, NAND Flash prices will continue to rise. However, whether prices will continue to surge significantly in the first quarter next year depends on the production reduction strategies of NAND manufacturers and the state of demand.
Industry rumors suggest that some memory manufacturers are considering increasing production capacity in response to strong downstream demand. If memory manufacturers ramp up capacity early while improvement in demand remains unclear, the extent of price increases may be noticeably limited.
(Photo credit: Samsung)
News
Market reports suggest Nvidia’s new product release cycle has shortened from two years to one, sparking intense competition among major memory companies in the realm of next-gen High Bandwidth Memory (HBM) technology. Samsung, SK Hynix, and Micron are competing fiercely, with SK Hynix currently holding the dominant position in the HBM market. However, Micron and Samsung are strategically positioned for a potential overtake, as reported by TechNews.
Current Status of the HBM Industry
SK Hynix made a breakthrough in 2013 by successfully developing and mass-producing HBM using the Through Silicon Via (TSV) architecture. In 2019, it followed up with HBM2E, maintaining an overwhelming advantage in the HBM market. According to the latest research from TrendForce, Nvidia plans to partner with more HBM suppliers. Samsung, as one of these suppliers, is anticipated to complete verification of its HBM3 (24GB) with NVIDIA by December this year.
Regarding HBM3e progress, Micron, SK Hynix, and Samsung provided 8-layer (24GB) samples to Nvidia in July, August, and October, respectively, with the fastest verification expected by year-end. All three major players anticipate completing verification in the first quarter of 2024.
As for HBM4, the earliest launch is expected in 2026, with stack height increasing from the existing 12 layers to 16 layers. The stack will likely adopt a 2048-bit connection interface, driving demand for the new “Hybrid Bonding” stacking method. The 12-layer HBM4 product is set to launch in 2026, followed by the 16-layer product in 2027.
Navigating HBM4: New Technologies and Roadmaps of Memory Industry Leaders
SK Hynix
According to reports from Business Korea, SK Hynix is preparing to adopt “2.5D Fan-Out” packaging for the next-generation HBM technology. This move aims to enhance performance and reduce packaging costs. This technology, not previously used in the memory industry but common in advanced semiconductor manufacturing, is seen as having the potential to “completely change the semiconductor and foundry industry.” SK Hynix plans to unveil research results using this packaging method as early as next year.
The 2.5D Fan-Out packaging technique involves arranging two DRAM dies horizontally and assembling them like regular chips. The absence of a substrate beneath the chips allows for thinner chips, significantly reducing thickness when installed in IT equipment. At the same time, the technique bypasses the Through Silicon Via (TSV) process, providing more Input/Output (I/O) options and lowering costs.
According to their previous plan, SK Hynix aims to mass-produce the sixth-generation HBM (HBM4) as early as 2026. The company is also actively researching “Hybrid Bonding” technology, likely to be applied to HBM4 products.
Currently, HBM stacks are placed on an interposer next to CPUs or GPUs and are connected to them through the interposer. SK Hynix’s new goal is to eliminate the interposer entirely, placing HBM4 directly on GPUs from companies like Nvidia and AMD, with TSMC as the preferred foundry.
Samsung
Samsung is researching the application of photonics in HBM technology’s interposer layer, aiming to address challenges related to heat and transistor density. Yan Li, Principal Engineer in Samsung’s advanced packaging team, shared insights at the OCP Global Summit in October 2023.
(Image: Samsung)
According to Samsung, the industry has made significant strides in integrating photonics with HBM through two main approaches. One involves placing a photonics interposer between the bottom packaging layer and the top layer containing the GPU and HBM, acting as a communication layer. However, this method is costly, as it requires an interposer and photonic I/O for both the logic chips and the HBM.
(Image: Samsung)
The alternative approach separates the HBM memory module from the package entirely and connects it directly to the logic IC using photonics. By avoiding the complexity of packaging, this approach not only lowers manufacturing and packaging costs for both HBM and logic ICs but also eliminates the need for internal digital-to-optical conversions in the circuitry. However, careful attention is required to address heat dissipation.
Micron
As reported by Tom’s Hardware, Micron’s 8-layer HBM3e (24GB) is expected to launch in early 2024, contributing to improved AI training and inference performance. The 12-layer HBM3e (36GB) chip is expected to debut in 2025.
Micron is working on HBM4 and HBM4E along with other companies. HBM4 bandwidth is expected to exceed 1.5 TB/s, and Micron anticipates launching 12-layer and 16-layer HBM4 stacks with capacities of 36GB to 48GB between 2026 and 2027. After 2028, HBM4E will be introduced, pushing maximum bandwidth beyond 2 TB/s and increasing stack capacity to 48GB to 64GB.
Micron is taking a different approach from Samsung and SK Hynix by not integrating HBM and logic chips into a single die, as suggested by Chinese media outlet Semiconductor Industry Observation. This difference in strategy may lead to distinct technical paths, and Micron might advise Nvidia, Intel, and AMD that relying solely on chips from the same company carries greater risks.
(Image: Micron)
TSMC Aids Memory Stacking
Currently, the TSMC 3DFabric Alliance works closely with major memory partners, including Micron, Samsung, and SK Hynix. This collaboration supports the rapid growth of HBM3 and HBM3e, as well as the packaging of 12-layer HBM3/HBM3e, providing greater memory capacity to advance the development of generative AI.
(Image: TSMC)
(Image: SK Hynix)