NAND Flash


2024-06-27

[News] NAND Flash Giant Kioxia Reportedly Plans IPO by Late October amid Market Recovery

According to a June 26th Reuters report citing sources, with semiconductor market conditions rebounding and its financial performance rapidly improving, NAND flash leader Kioxia is reportedly preparing to file a preliminary application soon and aims to debut on the Tokyo Stock Exchange (TSE) through an initial public offering (IPO) by late October.

As per the same report citing sources, Kioxia plans to formally submit its IPO application by the end of August, aiming for a listing by late October. To meet the deadline, preparations are proceeding at a faster pace than usual for an IPO, although the timing depends on how preparations progress and could potentially slip to December. The sources further indicated that Bain Capital, a major shareholder of Kioxia, plans to sell part of its stake through the IPO to raise funds.

Kioxia previously obtained approval for listing on the Tokyo Stock Exchange in 2020 but postponed its IPO plans due to the US-China trade tensions and adverse market conditions. The source cited in the report mentioned that the funds raised through this IPO might be lower than its initial valuation in 2020.

Toshiba spun off its semiconductor business, which focused on NAND flash, in April 2017. The spun-off company was initially named “Toshiba Memory” and was renamed “Kioxia” on October 1, 2019. Toshiba currently holds approximately 40% of Kioxia’s shares.

The improved market environment was also reflected in Kioxia’s financial report for January to March 2024, released on May 15th, in which the company achieved a net profit of JPY 10.3 billion, ending six consecutive quarters of losses.

This turnaround was driven by improved pricing due to production cuts across various NAND Flash manufacturers, which balanced supply and demand. The consolidated operating profit improved from a loss of JPY 171.4 billion in the same period last year to a profit of JPY 43.9 billion, marking the first quarterly profit in six quarters. Notably, the demand for smartphone and personal computer chips has bottomed out and is starting to recover, while orders related to data centers have increased.

Looking ahead to market trends and future prospects, Kioxia pointed out the normalization of customer inventory levels, which is expected to drive recovery in demand for PC and smartphone applications. They anticipate future growth driven by the introduction of On-Device AI, increasing memory capacities, and potential upgrades in PC operating systems stimulating replacement demand.


(Photo credit: Kioxia)

Please note that this article cites information from Reuters.

2024-06-27

[News] Outpacing Samsung or the End of the Race? Kioxia Aims for 1,000-Layer NAND by 2027

After ending production cuts amid a recovery in the memory industry, Kioxia disclosed its 3D NAND roadmap plans last week. According to reports from PC Watch and Blocks & Files, Kioxia stated that achieving a 1,000-layer level by 2027 would be possible.

According to the reports, the number of 3D NAND layers has generally increased from 24 in 2014 to 238 in 2022, roughly a tenfold rise over eight years, or an increase of about 1.33 times per year. Kioxia stated that continuing at that rate would make the 1,000-layer level achievable by 2027.
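
As a quick sanity check on those figures (a minimal sketch using only the layer counts cited above), the implied growth rate and the 2027 extrapolation can be reproduced as follows:

```python
# Back-of-the-envelope check of the layer-count growth rate cited above.
# Figures from the reports: 24 layers (2014) -> 238 layers (2022), ~1,000-layer target by 2027.

layers_2014, layers_2022 = 24, 238

# Roughly tenfold over eight years implies ~1.33x per year.
annual_growth = (layers_2022 / layers_2014) ** (1 / 8)
print(f"Implied annual growth: {annual_growth:.2f}x")        # ~1.33x

# Extrapolating five more years from 2022 at the same rate:
layers_2027 = layers_2022 * annual_growth ** 5
print(f"Projected layer count in 2027: {layers_2027:.0f}")   # just under 1,000 layers
```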

The Japanese memory chipmaker seems to be more ambitious than Samsung in the battle of layers. In May, Samsung revealed its target to release advanced NAND chips with over 1,000 layers by 2030. According to Wccftech, the South Korean memory giant plans to apply new ferroelectric materials in NAND manufacturing to achieve this goal.

According to the latest analysis from TrendForce, Kioxia has benefited from the recovery of the memory industry, recently receiving subsidies from the Japanese government and additional financing from a consortium of banks. Furthermore, the company plans to launch an IPO by the end of the year. These measures have provided Kioxia with ample financial resources to pursue technological advancements and cost optimization.

TrendForce further notes that Kioxia has ambitious plans to achieve 1,000-layer technology by 2027, which is the highest layer count announced by any manufacturer so far. However, to reach this milestone, it will be necessary to transition from TLC (3 bits per cell) to QLC (4 bits per cell), and possibly even to PLC (5 bits per cell). The technical challenges involved are significant, and whether Kioxia can achieve this milestone by 2027 remains to be seen.

The Battle of Layers between Memory Giants

Kioxia and its partner Western Digital showcased their 218-layer technology in 2023, following the 162-layer milestone. The newly announced goal of reaching 1,000 layers by 2027 would be a huge leap from that.

The battle of layers between memory giants has been intensifying as other memory heavyweights have already surpassed the 200-layer milestone. Earlier in April, Samsung confirmed that it had begun mass production of its one-terabit (Tb) triple-level cell (TLC) 9th-generation vertical NAND (V-NAND), with the layer count reaching 290, according to an earlier report by The Korea Economic Daily. For now, the company aims to stack V-NAND to over 1,000 layers by 2030.

SK Hynix unveiled the world’s highest-layer 321-layer NAND flash memory samples in August 2023, claiming to be the industry’s first company to develop NAND flash memory with over 300 layers, with mass production planned for 2025. Micron also began mass production of its 232-layer QLC NAND in 2024.

Uncertainties behind Kioxia’s Optimism

However, Kioxia still has more challenges to overcome, as technological obstacles and Western Digital’s stance add uncertainty to its ambition. According to the Blocks & Files report, increasing density in a 3D NAND die involves more than just adding layers, as each layer’s edge must be exposed for memory cell electrical connectivity. This results in a staircase-like profile, and as the number of layers grows, the die area needed for the staircase expands as well.

Therefore, to increase density, it is necessary to shrink the cell size both vertically and laterally, and to raise the number of bits stored per cell. All of these scaling factors, including layer count, vertical cell size reduction, lateral cell size reduction, and higher bits per cell, present their own technological challenges.
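
The interplay of these factors can be illustrated with a rough, hypothetical density model; the parameter values below are placeholders chosen for illustration only, not Kioxia or Western Digital figures:

```python
# Hypothetical 3D NAND bit-density model illustrating the scaling levers discussed above.
# All parameter values are illustrative placeholders, not vendor data.

def bits_per_mm2(layers, cell_pitch_nm, bits_per_cell, staircase_overhead):
    """Approximate stored bits per mm^2 of die area.

    layers             -- number of stacked word-line layers
    cell_pitch_nm      -- lateral cell pitch; shrinking it packs more cells into each layer
    bits_per_cell      -- 3 for TLC, 4 for QLC, 5 for PLC
    staircase_overhead -- fraction of die area consumed by the word-line staircase,
                          which tends to grow with the layer count
    """
    cells_per_mm2_per_layer = (1e6 / cell_pitch_nm) ** 2
    usable_fraction = 1.0 - staircase_overhead
    return cells_per_mm2_per_layer * layers * bits_per_cell * usable_fraction

# A 238-layer TLC baseline vs. a 1,000-layer QLC design with a modest lateral shrink
# but a larger staircase penalty:
baseline = bits_per_mm2(layers=238, cell_pitch_nm=150, bits_per_cell=3, staircase_overhead=0.10)
future   = bits_per_mm2(layers=1000, cell_pitch_nm=130, bits_per_cell=4, staircase_overhead=0.15)
print(f"Illustrative density gain: {future / baseline:.1f}x")
```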

Moreover, according to Blocks & Files, WD has concerns regarding the manufacturing capital costs and the return on investment from selling chips and SSDs made with the fabricated NAND dies.

Citing Western Digital EVP Robert Soderbery in June, the report noted that in the 3D era, NAND manufacturing requires higher capital intensity but offers a lower cost reduction as bit density increases. The company even described the situation as the “end of the layers race,” indicating that there would be a slowdown in the rate of NAND layer count increases to optimize capital deployment.

How long will the battle of layers continue, and how far will it go? Technological breakthroughs, along with the willingness to endure higher capital intensity while cost reductions remain relatively limited, may be key.


(Photo credit: Kioxia)

Please note that this article cites information from Blocks & Files and PC Watch.

2024-06-27

[News] GDDR7 Emerging as a New Driver for Memory Industry

AI applications are driving the memory market forward, with HBM (High Bandwidth Memory) undoubtedly being a sought-after product in the industry, attracting increased capital expenditure and production expansion from memory manufacturers. Meanwhile, a new force in the memory market has quietly emerged: GDDR7 is expected to join HBM in steadily driving the memory market forward amid the AI wave.

  • The Differences between GDDR7 and HBM

GDDR7 and HBM both belong to the category of graphics DRAM with high bandwidth and high-speed data transmission capabilities, providing strong support for AI computing. However, GDDR7 and HBM differ slightly in terms of technology, application scenarios, and performance.

GDDR7 is the latest technology in the GDDR family, primarily used to enhance the available bandwidth and memory capacity of GPUs. In March 2024, the JEDEC Solid State Technology Association officially released the JESD239 GDDR7 standard, which significantly increases bandwidth, eventually reaching 192 GB/s per device.

This works out to a memory speed of 48 Gbps, double that of GDDR6X. The number of independent channels doubles from two in GDDR6 to four in GDDR7, and the standard supports densities ranging from 16 to 32 Gbit, including a 2-channel mode to double system capacity.
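
The 192 GB/s figure follows directly from the 48 Gbps pin speed, assuming the usual 32-bit-wide GDDR device interface (a rough sketch, not part of the JEDEC text cited above):

```python
# Relating the 48 Gbps per-pin speed to the 192 GB/s per-device bandwidth,
# assuming a 32-bit-wide GDDR device (organized as four independent channels in GDDR7).

pin_rate_gbps = 48
pins_per_device = 32
device_bw_GBps = pin_rate_gbps * pins_per_device / 8        # convert bits to bytes
print(f"Per-device bandwidth: {device_bw_GBps:.0f} GB/s")   # 48 * 32 / 8 = 192 GB/s
```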

Additionally, JESD239 GDDR7 is the first JEDEC-standard DRAM to use a Pulse Amplitude Modulation (PAM) interface for high-frequency operation. Its PAM3 interface improves the signal-to-noise ratio (SNR) in high-frequency operation while also enhancing energy efficiency.

GDDR7 is mainly applied in graphics processing, gaming, computing, networking, and AI. In gaming in particular, its high bandwidth and high-speed data transmission can significantly improve frame smoothness and loading speed, enabling a better experience for gamers. In the field of AI, GDDR7 has great potential, capable of supporting rapid data processing and computation for large AI models, thus speeding up model training and inference.

Michael Litt, chairman of the JEDEC GDDR Task Group, has stated that GDDR7 is the first generation to focus not only on bandwidth but also on integrating the latest data integrity features to meet market demands for RAS (Reliability, Availability, and Serviceability). These features allow GDDR devices to better serve existing markets like cloud gaming and computing, and to expand their presence into the AI sector.

Based on memory stacking technology, HBM connects layers through Through-Silicon Vias (TSVs), and features high capacity, high bandwidth, low latency, and low power consumption. Its strength lies in breaking through memory bandwidth and power consumption bottlenecks. Currently, HBM is mainly used in AI server and supercomputer applications.

Since the introduction of the first generation in 2013, HBM has developed the second generation (HBM2), third generation (HBM2E), fourth generation (HBM3), and fifth generation (HBM3E).

HBM3e will be the market mainstream this year, with shipments concentrated in 2H24. In addition, the sixth-generation HBM4 is anticipated to debut as early as 2025. Reportedly, HBM4 will bring revolutionary changes, adopting a 2048-bit memory interface, which could theoretically double the transmission speed again.
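
The claimed doubling follows from the wider interface alone, assuming the per-pin data rate stays at an HBM3E-like level; the ~9.6 Gbps figure below is illustrative rather than a specified value:

```python
# Per-stack bandwidth scaling from a 1024-bit (HBM3E) to a 2048-bit (HBM4) interface,
# assuming an illustrative ~9.6 Gbps per-pin data rate held constant across generations.

pin_rate_gbps = 9.6
hbm3e_bw_GBps = 1024 * pin_rate_gbps / 8   # ~1,229 GB/s, in line with advertised ~1.2 TB/s stacks
hbm4_bw_GBps  = 2048 * pin_rate_gbps / 8   # ~2,458 GB/s: roughly double at the same pin rate
print(f"1024-bit stack: {hbm3e_bw_GBps:.0f} GB/s, 2048-bit stack: {hbm4_bw_GBps:.0f} GB/s")
```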

  • Three Memory Giants Scramble for the Initiative in GDDR7 Market

Due to high technical barriers, the HBM market is firmly dominated by the three major memory players: SK Hynix, Samsung, and Micron. With the ongoing influence of AI, their competition has been expanding from HBM into the GDDR field.

Since the beginning of this year, the three manufacturers have successively announced the availability of GDDR7 memory samples. It’s expected that some of them will start mass production of GDDR7 between 4Q24 and 1Q25.

(Photo credit: Samsung Electronics)

In March, Samsung and SK Hynix announced their respective GDDR7 specifications. Samsung’s GDDR7 chip, using PAM3 signaling for the first time, can achieve a speed of 32 Gbps at a DRAM voltage of only 1.1V, below the 1.2V specified in the JEDEC GDDR7 standard.

SK Hynix’s latest GDDR7 product, compared to its predecessor GDDR6, offers a maximum bandwidth of 160GB/s, double that of the previous generation, with a 40% improvement in power efficiency and a 1.5 times increase in memory density.

In June, Micron announced it had begun sampling its new generation of GDDR7, achieving a speed of 32 Gbps and a memory bandwidth of 1.5 TB/s, a 60% improvement over GDDR6, and boasting the industry’s highest bit density. Micron’s GDDR7 utilizes 1β DRAM technology and an innovative architecture with four independent channels to optimize workloads, offering faster response times, a smoother gaming experience, and shorter processing times.
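
The 1.5 TB/s figure is consistent with the 32 Gbps pin speed if it refers to aggregate bandwidth across a 384-bit GPU memory bus (an assumption made here for illustration, since the bus width is not stated above):

```python
# Reconciling the 32 Gbps pin speed with the ~1.5 TB/s figure, assuming it is the
# aggregate bandwidth of a 384-bit GPU memory bus (twelve 32-bit GDDR7 devices).

pin_rate_gbps = 32
bus_width_bits = 384
aggregate_bw_GBps = pin_rate_gbps * bus_width_bits / 8
print(f"Aggregate bandwidth: {aggregate_bw_GBps / 1000:.2f} TB/s")  # ~1.54 TB/s
```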

Additionally, Micron’s GDDR7 improves energy efficiency by 50% relative to GDDR6, which enhances thermal performance for portable devices (such as laptops) and extends battery life. The new sleep mode can reduce standby power consumption by 70%. Micron claims its next-generation GDDR7 can deliver high performance, increasing throughput by 33% and reducing response times for generative AI workloads (including text and image creation) by 20%.

(Photo credit: Micron)

Recently, rumors have suggested that the NVIDIA RTX 50 series will fully adopt the latest GDDR7, with a maximum capacity of 16GB, including the models GN22-X11 (16 GB GDDR7), GN22-X9 (16 GB GDDR7), GN22-X7 (12 GB GDDR7), GN22-X6 (8 GB GDDR7), GN22-X4 (8 GB GDDR7), and GN22-X2 (8 GB GDDR7). The industry believes that GDDR7 will become a new arena in the memory market following HBM, in which manufacturers will continue to battle for NVIDIA GPU orders.


(Photo credit: Samsung Electronics)

Please note that this article cites information from the WeChat account DRAMeXchange.

2024-06-26

[Insights] Memory Spot Price Update: DRAM Remains Weak Despite Samsung’s Capacity Shift to HBM

According to TrendForce’s latest memory spot price trend report, the spot price of DRAM remains weak, as Samsung’s reallocation of its D1A process to the manufacturing of HBM products has done little to help. As for NAND flash, overall transactions also remain sluggish due to weakening market demand. Details are as follows:

DRAM Spot Price:

A fire-related incident occurred at Micron’s fab in Taichung on June 20th, but no actual losses (in bit terms) have been reported. In response to this event, module houses did temporarily suspend quoting, but they soon resumed trading activities. Overall, the event has had no positive effect on the spot price trend, which remains relatively weak. Similar to last week, spot trading has been fairly tepid, and prices of DDR4 products have fallen more significantly compared to DDR5 products. With Samsung reallocating its D1A process to the manufacturing of HBM products, spot prices of DDR5 products have actually experienced sporadic hikes for a while. Mainstream die DDR4 1Gx8 2666 MT/s saw a price increase of 1.36% this week (US$1.835 to US$1.860).

NAND Flash Spot Price:

Module houses have started adopting even more aggressive pricing strategies to keep their inventory under control, though overall transactions remain sluggish due to weakening market demand. TrendForce believes that inventory pressure will continue to drag down spot prices; the price of 512Gb TLC wafers dropped 0.21% this week to US$3.302.
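
As a quick check of the weekly moves cited above (a minimal sketch using only the reported spot prices; the prior-week NAND price is back-calculated from the stated percentage rather than quoted):

```python
# Verifying the weekly spot-price changes quoted in this update.

# DRAM: DDR4 1Gx8 2666 MT/s rose from US$1.835 to US$1.860.
ddr4_change = (1.860 - 1.835) / 1.835
print(f"DDR4 weekly change: {ddr4_change:+.2%}")             # ~+1.36%

# NAND: 512Gb TLC wafers fell 0.21% to US$3.302, implying a prior-week price of roughly:
nand_prev = 3.302 / (1 - 0.0021)
print(f"Implied prior-week NAND price: US${nand_prev:.3f}")  # ~US$3.309
```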

2024-06-26

[News] Memory Giants Samsung and Micron Rumored to Expand Production

Recently, it was reported that to meet the increasing demand for memory chips driven by the artificial intelligence (AI) boom, both Samsung Electronics and Micron have set about ramping up their memory chip production capacity. Samsung plans to restart construction of the new Pyeongtaek plant (P5) infrastructure as early as 3Q24, while Micron is building HBM testing and mass-production lines at its headquarters in Boise, Idaho, and is considering producing HBM in Malaysia for the first time to meet the growing demand brought by the AI surge.

Samsung Restarts the Construction of P5 Plant

As per foreign media reports, Samsung has decided to restart the construction of the P5 infrastructure, which is expected to resume as early as 3Q24 and be completed in April 2027, though the actual date of starting production could be earlier.

Previously, P5 construction was reportedly suspended at the end of January, which Samsung described at the time as a temporary measure to coordinate progress, with investment not yet finalized. Industry analysts interpret Samsung’s decision to resume P5 construction as a response to the AI-driven surge in demand for memory chips.

It is reported that the Samsung P5 plant is a large wafer fab with eight cleanrooms, whereas P1 through P4 each have only four, which would enable Samsung to achieve the large-scale production needed to meet market demand. However, no official announcement regarding the specific use of P5 has been made so far.

According to Korean media reports, industry sources stated that Samsung held a meeting of the board’s internal management committee on May 30, during which the agenda item for P5 infrastructure construction was submitted and passed. The management committee was chaired by CEO and head of the DX division Jong-hee Han, and included other members such as MX business head Noh Tae-moon, management support director Park Hak-gyu, and head of the memory business division Lee Jeong-bae.

Hwang Sang-joong, vice president and head of DRAM Product and Technology at Samsung, stated in March this year that HBM output for this year was expected to be 2.9 times that of last year. The company also announced its HBM roadmap, projecting that HBM shipment in 2026 would be 13.8 times the 2023 output, and by 2028, the annual HBM output would further increase to 23.1 times the 2023 level.

Micron Builds HBM Testing and Mass-Production Lines in the U.S.

On June 19, multiple media outlets reported that Micron is building HBM testing and mass-production lines at its headquarters in Boise, Idaho, and is considering producing HBM in Malaysia for the first time to meet the increased demand driven by the AI boom. Micron’s Boise wafer fab is reportedly set to begin operation in 2025 and start DRAM production in 2026.

Previously, Micron announced plans to increase its HBM market share from the current “mid-single digits” to around 20% within a year. As of now, Micron has been expanding its memory capacity in various locations.

At the end of April, Micron officially announced that it had received USD 6.1 billion in government subsidies under the U.S. CHIPS and Science Act. These funds, along with additional state and local incentives, will support Micron in building a leading DRAM manufacturing plant in Idaho and two advanced DRAM manufacturing plants in Clay, New York.

The Idaho plant commenced construction in October 2023. Micron revealed that the plant is expected to come online in 2025 and start DRAM production in 2026, with DRAM output increasing in line with industry demand. The New York project is in the phase of initial design, field study, and license application (including the NEPA application). Construction of that wafer fab is expected to begin in 2025 and production in 2028, with output ramping depending on market demand over the next decade. The press release noted that the U.S. government’s subsidies will support Micron’s plan to invest around USD 50 billion in total capital expenditures to lead domestic memory manufacturing by 2030.

In May, the Japanese media outlet Nikkan Kogyo Shimbun reported that Micron will invest JPY 600 to 800 billion to build an advanced DRAM chip plant using EUV lithography in Hiroshima, Japan. Construction is expected to start in early 2026 and be completed in late 2027 at the earliest. Japan had previously approved up to JPY 192 billion in subsidies to support Micron’s plant construction and next-generation chip production in Hiroshima.

The new Micron plant in Hiroshima will be located near the existing Fab 15, focusing on DRAM production, excluding back-end packaging and testing, with priority given to the fabrication of HBM products.

In October 2023, Micron inaugurated its second smart (advanced assembly and test) plant in Penang, Malaysia, with an initial investment of USD 1 billion. Following the completion of the first plant, Micron allocated an additional USD 1 billion to the second smart plant, bringing its building area to 1.5 million square feet.

(Photo credit: Samsung)

 
