2024-06-27

[News] Memory Giant Micron’s 3Q24 Earnings Report Released, Outlook Slightly Better Than Expected

On June 26, American memory manufacturer Micron announced its financial results for the third quarter of the 2024 fiscal year (ending May 30, 2024) after the market closed: revenue increased by 82% year-over-year (17% quarter-over-quarter) to $6.811 billion; Non-GAAP diluted earnings per share (EPS) were reported at $0.62, better than the $0.42 of the second quarter of the 2024 fiscal year and the diluted loss per share of $1.43 in the third quarter of the 2023 fiscal year.

Micron further estimates that for the fourth quarter of the 2024 fiscal year, revenue and Non-GAAP diluted EPS will be $7.6 billion (plus or minus $200 million) and $1.08 (plus or minus $0.08), respectively.

Per a Bloomberg report on June 26th, some sources expect Micron’s fourth-quarter revenue to exceed USD 8 billion.

Micron CEO Sanjay Mehrotra stated in a press release that the improving market conditions and strong price and cost execution drove the financial outperformance. Reportedly, Micron’s total fiscal Q3 revenue was USD 6.8 billion, up 17% sequentially and up 82% year over year.

Mehrotra also noted that Micron’s market share in high-margin, AI-related product categories such as HBM (high-bandwidth memory), high-capacity DIMMs, and data center SSDs continues to rise. In data center SSDs in particular, Micron has reached new revenue and market share records in this important product category.

Mehrotra stated during the earnings call that strong AI-driven demand for data center products has led to tight capacity for advanced processes. Therefore, despite steady recent demand for personal computers (PCs) and smartphones, Micron expects prices to continue rising throughout 2024 (January to December).

Micron CEO Sanjay Mehrotra further addressed, “In the data center, rapidly growing AI demand enabled us to grow our revenue by over 50% on a sequential basis.” He then pointed out, “…we can deliver a substantial revenue record in fiscal 2025, with significantly improved profitability underpinned by our ongoing portfolio shift to higher-margin products.”

Looking ahead to 2025, the growing demand for AI PCs, AI smartphones, and data center AI creates a favorable environment, giving Micron confidence in achieving substantial revenue records in the 2025 fiscal year. This is expected to significantly boost profitability as the product mix continues to shift towards higher-margin products.

(Photo credit: Micron)

Please note that this article cites information from Micron and Bloomberg.

2024-06-27

[News] GDDR7 Emerging as a New Driver for Memory Industry

AI applications are driving the memory market forward, with HBM (High Bandwidth Memory) undoubtedly the most sought-after product of the industry, attracting increased capital expenditure and production expansion from memory manufacturers. Meanwhile, a new force in the memory market has quietly emerged: GDDR7 is expected to drive the memory market steadily forward alongside HBM amid the AI wave.

  • The Differences between GDDR7 and HBM

GDDR7 and HBM both belong to the category of graphics DRAM with high bandwidth and high-speed data transmission capabilities, providing strong support for AI computing. However, GDDR7 and HBM differ slightly in terms of technology, application scenarios, and performance.

GDDR7 is the latest technology in the GDDR family, primarily used to enhance the available bandwidth and memory capacity of GPUs. In March 2024, JEDEC, the Solid State Technology Association, officially released the JESD239 GDDR7 standard, which significantly increases bandwidth, eventually reaching 192GB/s per device.

This works out to a per-pin data rate of 48Gbps, double that of GDDR6X. The number of independent channels also doubles, from 2 in GDDR6 to 4 in GDDR7, and the standard supports densities ranging from 16 to 32 Gbit, including support for 2-channel mode to double system capacity.
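The 192GB/s device figure follows directly from the per-pin rate. A quick sketch, assuming a standard x32 device (the 32-data-pin width is our assumption for illustration, not a figure quoted above):

```python
# Back-of-the-envelope check of the JESD239 numbers quoted above.
# The x32 (32-data-pin) device width is an assumption here; GDDR
# devices are commonly organized as x32.

def device_bandwidth_gbs(pin_rate_gbps: float, data_pins: int) -> float:
    """Peak device bandwidth in GB/s from per-pin rate and pin count."""
    return pin_rate_gbps * data_pins / 8  # 8 bits per byte

print(device_bandwidth_gbs(48, 32))  # 192.0 GB/s, the figure in the standard
print(device_bandwidth_gbs(24, 32))  # 96.0 GB/s at GDDR6X's ~24 Gbps pin rate
```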

Additionally, JESD239 GDDR7 is the first JEDEC-standard DRAM to use a Pulse Amplitude Modulation (PAM) interface for high-frequency operation. Its PAM3 interface improves the signal-to-noise ratio (SNR) in high-frequency operations while improving energy efficiency.

GDDR7 is mainly applied in graphics processing, gaming, computing, networking, and AI, particularly in gaming, where its high bandwidth and high-speed data transmission capabilities can significantly improve frame smoothness and loading speed, enabling a better experience for game players. In the field of AI, GDDR7 boasts great potential, capable of supporting rapid data processing and computation for large AI models, thus speeding up model training and inference.

Michael Litt, chairman of the JEDEC GDDR Task Group, has stated that GDDR7 is the first generation to focus not only on bandwidth but also on integrating the latest data integrity features to meet market demands for RAS (Reliability, Availability, and Serviceability). These features allow GDDR devices to better serve existing markets like cloud gaming and computing, and to expand their presence into the AI sector.

Based on memory stacking technology, HBM connects layers through Through-Silicon Via (TSV), and features high capacity, high bandwidth, low latency, and low power consumption. Its strength lies in breaking the memory bandwidth and power consumption bottleneck. Currently, HBM is mainly used in AI server and supercomputer applications.

Since the introduction of the first generation in 2013, HBM has developed the second generation (HBM2), third generation (HBM2E), fourth generation (HBM3), and fifth generation (HBM3E).

This year, HBM3e will be the mainstream product in the market, with shipments concentrated in 2H24. In addition, the sixth-generation HBM4 is anticipated to debut as early as 2025. Reportedly, HBM4 will bring revolutionary changes, adopting a 2048-bit memory interface, which could theoretically double the transmission speed again.
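The claim that a 2048-bit interface could "double the transmission speed again" is simple arithmetic: per-stack bandwidth scales linearly with interface width at a fixed pin rate. A minimal sketch, where the 9.2 Gbps HBM3e pin rate is an illustrative assumption rather than a figure from the article:

```python
# Per-stack bandwidth scales linearly with interface width at a fixed
# pin rate; the 9.2 Gbps HBM3e pin rate is an illustrative assumption.

def stack_bandwidth_tbs(interface_bits: int, pin_rate_gbps: float) -> float:
    """Peak per-stack bandwidth in TB/s."""
    return interface_bits * pin_rate_gbps / 8 / 1000  # bits->bytes, GB->TB

hbm3e = stack_bandwidth_tbs(1024, 9.2)  # ~1.18 TB/s for a 1024-bit stack
hbm4 = stack_bandwidth_tbs(2048, 9.2)   # exactly double at the same pin rate
print(hbm3e, hbm4)
```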

  • Three Memory Giants Scramble for the Initiative in GDDR7 Market

Due to high technical barriers, the HBM market is firmly dominated by the three major memory players: SK Hynix, Samsung, and Micron. With the ongoing influence of AI, their competition has been expanding from HBM into the GDDR field.

Since the beginning of this year, the three manufacturers have successively announced the availability of GDDR7 memory samples. It’s expected that some of them will start mass production of GDDR7 between 4Q24 and 1Q25.

Photo credit: Samsung Electronics

In March, Samsung and SK Hynix announced their respective GDDR7 specifications. Samsung’s GDDR7 chip, using PAM3 signaling for the first time, can achieve a speed of 32Gbps at a DRAM voltage of only 1.1V, lower than the 1.2V called for in the JEDEC GDDR7 specification.

SK Hynix’s latest GDDR7 product offers a maximum bandwidth of 160GB/s, double that of its predecessor GDDR6, with a 40% improvement in power efficiency and a 1.5-fold increase in memory density.

In June, Micron announced it had already begun sampling its next generation of GDDR7, achieving a speed of 32Gbps and a system memory bandwidth of 1.5TB/s, a 60% improvement over GDDR6, and boasting the industry’s highest bit density. Micron’s GDDR7 utilizes 1β DRAM technology and an innovative architecture with four independent channels to optimize workloads, offering faster response times, a smoother gaming experience, and shorter processing times.
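The 1.5TB/s system figure can be reproduced from the 32Gbps pin speed if one assumes a 384-bit GPU memory bus (twelve x32 devices); note the bus width is our assumption for illustration, not a configuration Micron states here:

```python
# Reproducing the quoted ~1.5 TB/s system bandwidth from the 32 Gbps
# pin rate. The 384-bit bus (twelve x32 GDDR7 devices) is an assumed,
# typical high-end GPU configuration, not a figure from the article.

def system_bandwidth_tbs(pin_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak system bandwidth in TB/s across the whole memory bus."""
    return pin_rate_gbps * bus_width_bits / 8 / 1000  # bits->bytes, GB->TB

print(system_bandwidth_tbs(32, 384))  # 1.536 TB/s, in line with "1.5TB/s"
```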

Additionally, Micron’s GDDR7 improves energy efficiency by 50% relative to GDDR6, which enhances thermal performance for portable devices (such as laptops) and extends battery life. A new sleep mode can reduce standby power consumption by 70%. Micron claims its next-generation GDDR7 can deliver high performance, increasing throughput by 33% and reducing response times for generative AI workloads (including text and image generation) by 20%.

Photo credit: Micron

Recently, rumor has it that NVIDIA RTX 50 series will fully adopt the latest GDDR7, with a maximum capacity of 16GB, including models GN22-X11 (16 GB GDDR7), GN22-X9 (16 GB GDDR7), GN22-X7 (12 GB GDDR7), GN22-X6 (8 GB GDDR7), GN22-X4 (8 GB GDDR7), and GN22-X2 (8 GB GDDR7). The industry believes that GDDR7 will become a new arena in the memory market following HBM, in which manufacturers will continue to battle for NVIDIA GPU orders.

(Photo credit: Samsung Electronics)

Please note that this article cites information from WeChat account DRAMeXchange.

2024-06-17

[News] Venturing into AI, Innolux Reportedly Partners with Memory Giants, While Its 4th Plant in Tainan to Focus on Packaging Applications

Taiwanese panel maker Innolux is reportedly collaborating with leading global memory manufacturers. According to a report from the Economic Daily News, plans are underway to repurpose its 4th Plant in Tainan (a 5.5-generation LCD panel plant) for AI-related semiconductor applications, specifically targeting back-end packaging.

Sources cited in the report indicate that, based on the strategies of the top three global memory manufacturers, the partner in this collaboration is likely a memory manufacturer that already has a presence in Taiwan and seeks to expand its capacity there. Innolux’s advantage lies in its advanced panel-level fan-out packaging (FOPLP), which is poised to make a substantial impact in the AI field. However, these reports have not been confirmed by Innolux or any global memory giants.

Regarding the 4th Plant developments at Tainan, Innolux stated on June 16 that, based on flexible strategic planning principles, the company continues to optimize production configurations and enhance overall operational efficiency. Some production lines and products are being adjusted to streamline and strengthen the group’s layout and development.

The surge in AI demand has driven the need for advanced chip heterogeneous integration and high-end packaging technologies to meet the high-performance application requirements of AI devices. Targeting these opportunities, Innolux has reportedly repurposed its Tainan 3.5-generation and 4-generation LCD panel production lines for semiconductor-related uses, including FOPLP and X-ray sensors.

Sources cited in the report also revealed that Innolux’s transformation efforts are making progress. After closing the 5.5-generation LCD panel production at the 4th Plant last year, the company has gradually reassigned staff to other facilities. To revitalize capacity and assets, Innolux has been in close contact with leading global memory manufacturers, aiming to develop AI-related applications.

Currently, the three major global memory manufacturers are actively developing high-bandwidth memory (HBM) for AI servers. South Korea’s SK Hynix is the most proactive in collaborating with Taiwanese companies, having partnered with TSMC to aggressively target the AI market. As per a report from Korean media outlet The Korea Herald, SK Group Chairman Chey Tae-won recently visited TSMC Chairman C.C. Wei to ensure continued close cooperation on next-generation HBM.

On the other hand, Micron has established memory production in Taiwan but does not yet have HBM capacity for AI servers in the region. Meanwhile, Samsung does not have direct AI cooperation with Taiwanese companies in the memory sector.

Sources cited in the report from Economic Daily News indicate that Innolux is engaging with one of these three major international memory manufacturers, focusing on new semiconductor applications. As Innolux is advancing into the promising glass substrate packaging business through panel-level fan-out packaging, this technology is expected to be combined with memory applications for AI development. Therefore, the developments at its 4th Plant in Tainan are receiving considerable attention.

(Photo credit: Innolux)

Please note that this article cites information from Economic Daily News and The Korea Herald.

2024-06-17

[News] With Memory Market Recovering, Kioxia Has Reportedly Ceased Production Cuts and Secured Bank Lending

According to a report from Nikkei, Japanese memory manufacturer Kioxia has ended production cuts amidst a recovery in the memory market and has secured new bank credit support. The company’s plants in Yokkaichi, Mie Prefecture, and Kitakami, Iwate Prefecture, have restored their production lines to 100% capacity, focusing mainly on NAND flash production.

With improved business conditions, creditor banks have reportedly agreed to refinance a maturing loan of JPY 540 billion (roughly USD 3.43 billion) and have established a new credit line totaling JPY 210 billion (roughly USD 1.33 billion).

Kioxia had previously implemented production cuts in October 2022 due to sluggish demand for smartphone products, reducing output by over 30%. The planned launch of new production lines at the Kitakami plant, originally scheduled for 2023, has been postponed to 2025.

The improved market environment is reflected in Kioxia’s financial report for January to March 2024, where the company achieved a net profit of JPY 10.3 billion, ending six consecutive quarters of losses. Demand for smartphone and personal computer chips has bottomed out and is starting to recover, while orders related to data centers have increased.

As per a previous TrendForce report, Kioxia’s Q1 output was still affected by production cuts from the previous quarter, resulting in a modest 7% QoQ increase in shipments. However, rising NAND Flash prices led to a 26.3% QoQ rise in revenue to $1.82 billion. Kioxia expects to grow Q2 revenue by approximately 20%, supported by increased supply bits and more flexible pricing, which will further expand enterprise SSD shipments.

Per the same report from Nikkei, a banking consortium led by Sumitomo Mitsui Banking, Mitsubishi UFJ Financial Group, and Mizuho Bank has relaxed loan terms and agreed to the refinancing along with new credit limits in light of Kioxia’s improved performance. Additionally, the banks will assist in funding equipment upgrades.

(Photo credit: Kioxia)

Please note that this article cites information from Nikkei.

2024-06-07

[News] The HBM4 Battle Begins! Memory Stacking Challenges Remain, Hybrid Bonding as the Key Breakthrough

According to a report from TechNews, South Korean memory giant SK Hynix is participating in COMPUTEX 2024 for the first time, showcasing the latest HBM3e memory and MR-MUF technology (Mass Re-flow Molded Underfill), and revealing that hybrid bonding will play a crucial role in chip stacking.

MR-MUF technology attaches semiconductor chips to circuits, using EMC (liquid epoxy molding compound) to fill gaps between chips or between chips and bumps during stacking. Currently, MR-MUF technology enables tighter chip stacking, improving heat dissipation performance by 10%, energy efficiency by 10%, achieving a product capacity of 36GB, and allowing for the stacking of up to 12 layers.

In contrast, competitors like Samsung and Micron use TC-NCF technology (thermal compression with non-conductive film), which requires high temperature and pressure to melt and then solidify the film material, followed by a cleaning step. This process involves several steps, whereas MR-MUF completes the process in one step with no cleaning required. As per SK Hynix, MR-MUF has approximately twice the thermal conductivity of NCF, significantly impacting process speed and yield.

The HBM package thickness is limited to 775 micrometers (μm), so as the number of stacked layers increases, memory manufacturers must work out how to fit more layers within this fixed height, which poses a significant challenge to current packaging technology. Hybrid bonding is likely to become one of the solutions.

The current technology uses micro bump materials to connect DRAM modules, but hybrid bonding can eliminate the need for micro bumps, significantly reducing chip thickness.

SK Hynix has revealed that in future chip stacking, bumps will be eliminated and special materials will be used to fill and connect the chips. This material, similar to a liquid or glue, will provide both heat dissipation and chip protection, resulting in a thinner overall chip stack.

SK Hynix plans to begin mass production of 16-layer HBM4 memory in 2026, using hybrid bonding to stack more DRAM layers. Kim Gwi-wook, head of SK Hynix’s advanced HBM technology team, noted that they are currently researching hybrid bonding and MR-MUF for HBM4, but yield rates are not yet high. If customers require products with more than 20 layers, due to thickness limitations, new processes might be necessary. However, at COMPUTEX, SK Hynix expressed optimism that hybrid bonding technology could potentially allow stacking of more than 20 layers without exceeding 775 micrometers.

Per a report from Korean media Maeil Business Newspaper, HBM4E is expected to be a 16-20 layer product, potentially debuting in 2028. SK Hynix plans to apply 10nm-class 1c DRAM in HBM4E for the first time, significantly increasing memory capacity.

(Photo credit: SK Hynix)

Please note that this article cites information from TechNews and the Financial Times.
