2024-03-06

[News] HBM Manufacturers Encounter Challenges in NVIDIA Quality Tests, Raising Concerns over Yield and Production

The surge in demand for NVIDIA’s AI processors has made High Bandwidth Memory (HBM) a key product that memory giants are eager to develop. However, according to South Korean media DealSite cited by Wccftech on March 4th, the complex architecture of HBM has resulted in low yields, making it difficult to meet NVIDIA’s testing standards and raising concerns about limited production capacity.

The report further pointed out that HBM manufacturers like Micron and SK Hynix are grappling with low yields as they compete fiercely to pass NVIDIA’s quality tests for its next-generation AI GPUs.

The yield of HBM is closely tied to the complexity of its stacking architecture, which involves multiple memory layers and Through-Silicon Via (TSV) technology for inter-layer connections. These intricate techniques increase the probability of process defects, potentially leading to lower yields compared to simpler memory designs.

Furthermore, if any single die in an HBM stack is defective, the entire stack is discarded, resulting in inherently low production yields. The source cited by Wccftech indicated that the overall yield of HBM currently stands at around 65%, and that attempts to improve yield may come at the cost of production volume.
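The compounding effect described above can be illustrated with a short calculation: if a single defective die scraps the whole stack, the stack yield is roughly the per-die yield raised to the number of stacked dies. The 95%-per-die figure below is hypothetical and chosen only for illustration; the article reports only the ~65% overall figure.

```python
# Illustrative model: if one defective die scraps the whole stack,
# stack yield ~= (per-die yield) ** (number of stacked dies).
# The 95% per-die yield below is a hypothetical number, not from the article.

def stack_yield(per_die_yield: float, layers: int) -> float:
    """Probability that every die in an HBM stack is good."""
    return per_die_yield ** layers

# An 8-high stack where each die (including its TSV bonding step) is 95% good:
print(round(stack_yield(0.95, 8), 3))  # 0.663, close to the ~65% cited
```

The same model also shows why taller stacks (12hi, 16hi) make yield progressively harder: each added layer multiplies in another chance of losing the whole stack.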

Micron announced on February 26th the commencement of mass production of High Bandwidth Memory “HBM3e,” to be used in NVIDIA’s latest AI chip “H200” Tensor Core GPU. The H200 is scheduled for shipment in the second quarter of 2024, replacing the current most powerful H100.

On the other hand, Kim Ki-tae, Vice President of SK Hynix, stated on February 21st in an official blog post that while external uncertainties persist, the memory market is expected to gradually heat up this year. Reasons include the recovery in product demand from global tech giants. Additionally, the application of AI in devices such as PCs or smartphones is expected to increase demand not only for HBM3e but also for products like DDR5 and LPDDR5T.

Kim Ki-tae pointed out that all of their HBM inventory has been sold out this year. Although it’s just the beginning of 2024, the company has already begun preparations for 2025 to maintain its market-leading position.

Per a previous TrendForce press release, the three major original HBM manufacturers held market shares as follows in 2023: SK Hynix and Samsung were both around 46-49%, while Micron stood at roughly 4-6%.


(Photo credit: SK Hynix)

Please note that this article cites information from MoneyDJ, DealSite and Wccftech.

2024-02-27

[News] Micron Begins Mass Production of HBM3e for NVIDIA’s H200

The U.S. memory giant Micron Technology has started the mass production of high-bandwidth memory “HBM3e,” which will be utilized in NVIDIA’s latest AI chips.

Micron stated on February 26th that its HBM3e consumes 30% less power than competing products, meeting the demands of generative AI applications. Micron’s 24GB 8H HBM3e will be used in NVIDIA’s “H200” Tensor Core GPUs, ending SK Hynix’s previous run as NVIDIA’s sole HBM supplier for the H100.

TrendForce’s earlier research into the HBM market indicated that NVIDIA plans to diversify its HBM suppliers for more robust and efficient supply chain management. According to that research, Micron provided its 8hi (24GB) HBM3e samples to NVIDIA by the end of July 2023, SK hynix in mid-August, and Samsung in early October.

As per a previous announcement from NVIDIA last year, the H200 is scheduled to ship in the second quarter of 2024, replacing the H100 as its most powerful chip in terms of computing power. Micron’s press release on February 26th further confirmed that it will begin shipping its 24GB 8H HBM3e in the second calendar quarter of 2024.

In the same press release, Micron’s Chief Business Officer, Sumit Sadana, also stated: “AI workloads are heavily reliant on memory bandwidth and capacity, and Micron is very well-positioned to support the significant AI growth ahead through our industry-leading HBM3e and HBM4 roadmap, as well as our full portfolio of DRAM and NAND solutions for AI applications.”

HBM is one of Micron’s most profitable products, partly because of its complex construction process. Micron previously predicted that HBM revenue could reach hundreds of millions of dollars in 2024, with further growth expected in 2025.

Micron has further announced that it will share more about its industry-leading AI memory portfolio and roadmaps at the “GPU Technology Conference” (also known as the GTC conference) hosted by NVIDIA on March 18th.

Previously, Micron indicated in a December 2023 conference call with investors that generative AI could usher in a multi-year growth period for the company, with projected memory industry revenue reaching historic highs in 2025. They also mentioned at the time that HBM3e, developed for NVIDIA’s H200, had entered its final quality control phase.


(Photo credit: Micron)

Please note that this article cites information from Micron and NVIDIA.

2024-02-27

[News] Overview of Expansion Plans by HBM Giants

Currently, the top three leaders—Samsung, SK Hynix, and Micron—in the HBM sector are undergoing unprecedented expansion. Below is an overview of the progress made by each of these giants in the realm of HBM:

  • Samsung: HBM Production to Increase 2.5 Times in 2024, Another 2 Times in 2025

Samsung Electronics began expanding its HBM3 supply in the fourth quarter of 2023. During that quarter, internal messages within Samsung indicated that samples of the next-generation 8-layer HBM3e had been provided to customers, with mass production planned to commence in the first half of this year.

Han Jin-man, Executive Vice President in charge of Samsung’s semiconductor business in the United States, stated at CES 2024 this year that Samsung’s HBM chip production volume will increase 2.5 times compared to last year and is projected to double again next year.

Samsung officials also revealed that the company plans to increase the maximum production of HBM to 150,000 to 170,000 units per month before the fourth quarter of this year in a bid to compete for the HBM market in 2024.

Previously, Samsung Electronics spent KRW 10.5 billion to acquire the plant and equipment of Samsung Display in Cheonan, South Korea, to expand HBM capacity. It also plans to invest KRW 700 billion to 1 trillion in building new packaging lines.

  • SK Hynix: To Commence Mass Production of World’s First Fifth-Generation High-Bandwidth Memory HBM3e in March

According to the latest report from Korean media Moneytoday on February 20th, SK Hynix will commence mass production of the world’s first fifth-generation high-bandwidth memory, HBM3e, in March this year. The company plans to supply the first batch of products to NVIDIA within the next month.

However, SK hynix noted that it “cannot confirm any details related to its partner.”

In its financial report, SK Hynix indicated plans to increase capital expenditure in 2024, with a focus on high-end storage products such as HBM. The HBM production capacity is expected to more than double compared to last year.

Previously, SK Hynix forecasted that by 2030, its HBM shipments would reach 100 million units annually. As a result, the company has decided to allocate approximately KRW 10 trillion (approximately USD 7.6 billion) in CAPEX for 2024. This represents a significant increase compared to the projected CAPEX of KRW 6 to 7 trillion in 2023, with an increase ranging from 43% to 67%.

The focus of the expansion is on constructing and expanding factories. In June of last year, Korean media reported that SK Hynix was preparing to invest in backend process equipment to expand its HBM3 packaging capabilities at its Icheon plant. By the end of this year, it is expected that the scale of backend process equipment at this plant will nearly double.

Furthermore, SK Hynix is also set to construct a state-of-the-art manufacturing facility in Indiana, USA. According to the Financial Times, this South Korean chip manufacturer will produce HBM stacks at this facility, which will be used for NVIDIA GPUs produced by TSMC.

  • Micron: Continuing the Pursuit, Betting on HBM4

Micron holds a relatively low share in the global HBM market. In order to narrow this gap, Micron has placed a significant bet on its next-generation product, HBM3e.

Sanjay Mehrotra, CEO of Micron, stated: “Micron is in the final stages of qualifying our industry-leading HBM3e to be used in NVIDIA’s next-generation Grace Hopper GH200 and H200 platforms.”

Micron plans to begin mass shipments of HBM3e memory in early 2024. Mehrotra emphasized that their new product has garnered significant interest across the industry, implying that NVIDIA may not be the sole customer ultimately utilizing Micron’s HBM3e.

In a competition where it holds no first-mover advantage, Micron appears to be betting on the yet-to-be-finalized standard of next-generation HBM4. Official announcements reveal that Micron has disclosed its next-generation HBM memory, tentatively named HBM Next, which is expected to offer capacities of 36GB and 64GB in various configurations.

Unlike Samsung and SK Hynix, Micron does not intend to integrate HBM and logic chips into a single die; in next-generation HBM development, the Korean and American memory manufacturers are pursuing distinct strategies.

Micron may argue to AMD, Intel, and NVIDIA that while combined chips like HBM-GPU can achieve faster memory access speeds, relying on a single integrated chip also carries greater risk.

As per TrendForce, HBM4 is planned for launch in 2026, with specifications and performance expected to be further optimized for NVIDIA’s and other CSPs’ (Cloud Service Providers) future product applications.

As specifications evolve toward higher speeds, HBM4 will mark the first time the HBM base die, also known as the logic die, adopts a 12nm-class process wafer. This die will be supplied by foundries, necessitating collaboration between foundries and memory manufacturers on single-package HBM integration.

Furthermore, as customer demands for computational efficiency increase, HBM4 is expected to evolve beyond the existing 12hi (12-layer) stack to 16hi (16-layer) configurations. The anticipation of higher layer counts is also expected to drive demand for new stacking methods such as hybrid bonding. HBM4 12hi products are slated for release in 2026, while 16hi products are expected to debut in 2027.


(Photo credit: Samsung)

Please note that this article cites information from WeChat account DRAMeXchange, Financial Times, and Moneytoday.

2024-01-31

[News] Flash Memory May Enter the Era of 280 Layers, and There’s More to Come

Another breakthrough has emerged in flash memory layer technology. A recent report cited by Tom’s Hardware suggests that at the upcoming International Solid-State Circuits Conference (ISSCC) in February of this year, Samsung Electronics will unveil its next-generation V9 QLC NAND solution, pushing flash memory stacking to 280 layers.

The Battle of Layers is Far from Over

Reportedly, Samsung’s V9 QLC boasts a storage density of 28.5Gb per square millimeter, achieving a maximum transfer rate of 3.2 Gbps. This surpasses the current leading QLC products (2.4 Gbps) and is poised to meet the requirements of future PCIe 6.0 solutions.

Additionally, the report further highlights that Samsung’s V9 QLC is considered the highest-density flash memory solution to date.

Before Samsung, major storage giants such as Micron and SK Hynix had already surpassed the 200-layer milestone. Micron reached 232 layers with a storage density of 19.5Gb per square millimeter, while SK Hynix achieved 238 layers with a storage density of 14.4Gb per square millimeter.

Still, 280 layers are not the end of the storage giants’ layer count competition; there will be breakthroughs with even higher layer counts in the future.

In August 2023, SK Hynix unveiled samples of the world’s highest-layer 321-layer NAND flash memory, claiming to be the industry’s first company to develop NAND flash with over 300 layers, with mass production planned for 2025.

Reportedly, SK Hynix’s 321-layer 1Tb TLC NAND achieves a 59% improvement in productivity compared to the previous-generation 238-layer 512Gb part. Stacking more data-storage cells to higher levels yields greater capacity on the same chip footprint, thereby increasing the bit output per wafer.
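The capacity gain can be sanity-checked with back-of-the-envelope arithmetic on the two figures above: dividing die capacity by layer count gives capacity per layer, showing that the 321-layer part stores more bits per layer as well as more layers per die. This is illustrative arithmetic only and does not reproduce the 59% figure, which also reflects wafer-level productivity factors not captured here.

```python
# Back-of-the-envelope: capacity per layer for the two SK Hynix parts cited.
# Illustrative only; the reported 59% productivity gain also depends on
# wafer-level factors this simple ratio does not capture.

old_gb, old_layers = 512, 238    # previous generation: 238-layer 512Gb
new_gb, new_layers = 1024, 321   # new sample: 321-layer 1Tb TLC

print(round(old_gb / old_layers, 2))       # ~2.15 Gb per layer
print(round(new_gb / new_layers, 2))       # ~3.19 Gb per layer
print(round(new_gb / old_gb, 2))           # 2.0x capacity...
print(round(new_layers / old_layers, 2))   # ...from only ~1.35x the layers
```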

On the other hand, Micron plans to introduce higher-layer products beyond the 232-layer milestone. Samsung, with ambitious plans, aims to stack V-NAND to over 1000 layers by 2030.

Kioxia and Western Digital, after showcasing their 218-layer technology in 2023 following the 162-layer milestone, also intend to develop 3D NAND products with over 300 layers in the future.

Amid Memory Market Rebound, What’s the Trend in NAND Flash Prices?

Amid economic headwinds and subdued demand in the consumer electronics market, the memory industry experienced a prolonged period of adjustment. It wasn’t until the fourth quarter of 2023 that the memory market began to rebound, leading to improved performances for related storage giants.

According to research conducted by TrendForce, a global market research firm, NAND Flash contract prices declined for four consecutive quarters starting from the third quarter of 2022, until they began to rise in the third quarter of 2023.

With a cautious outlook for market demand in 2024, the trend in NAND Flash prices will depend on the capacity utilization rates of suppliers.

TrendForce has projected a hike of 18-23% for NAND Flash contract prices in 1Q24, moderating to a QoQ increase of 3-8% in 2Q24. As the third quarter enters the traditional peak season, the quarterly price increase could widen to 8-13%.

In 4Q24, the general price rally is anticipated to continue if suppliers maintain an effective strategy for controlling output. For NAND Flash products, their contract prices are forecasted to increase by 0-5% QoQ for 4Q24.
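Chained together, the quarterly ranges above imply a wide band for the full-year price change, since QoQ increases compound multiplicatively. The sketch below assumes the 18-23% hike applies to 1Q24, which the article does not state explicitly.

```python
# Compounding the projected QoQ ranges into a full-year band.
# Assumes the 18-23% hike refers to 1Q24 (not stated explicitly above).

low_quarters  = [0.18, 0.03, 0.08, 0.00]   # lower bound per quarter
high_quarters = [0.23, 0.08, 0.13, 0.05]   # upper bound per quarter

def compound(qoq):
    """Multiply quarterly (1 + rate) factors into a cumulative change."""
    total = 1.0
    for q in qoq:
        total *= 1.0 + q
    return total - 1.0

print(f"{compound(low_quarters):.1%}")   # ~31.3% full-year rise, lower bound
print(f"{compound(high_quarters):.1%}")  # ~57.6% full-year rise, upper bound
```

The spread between the bounds shows why TrendForce hedges the 4Q24 outlook on supplier output discipline: a few points per quarter compound into a large full-year difference.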

(Photo credit: Samsung)

Please note that this article cites information from Tom’s Hardware and DRAMeXchange.

2024-01-30

[News] Latest Updates on HBM from the Leading Three Global Memory Manufacturers

Amid the AI trend, the significance of high-value-added DRAM represented by HBM continues to grow.

HBM (High Bandwidth Memory) is a type of stacked DRAM that offers high bandwidth, high capacity, low latency, and low power consumption compared to traditional DRAM chips. It accelerates AI data processing and is particularly suited to high-performance computing workloads like ChatGPT, making it highly valued by memory giants in recent years.

Memory also represents one of Korea’s pillar industries, and to seize the AI opportunity and drive the industry’s development, Korea has recently designated HBM as a national strategic technology.

The country will provide tax incentives to companies like Samsung Electronics: small and medium-sized enterprises can enjoy a tax reduction of 40% to 50%, while large enterprises like Samsung Electronics can benefit from a reduction of 30% to 40%.

Overview of HBM Development Progress Among Top Manufacturers

The HBM market is currently dominated by three major storage giants: Samsung, SK Hynix, and Micron. Since the first silicon-interposer-based HBM product was introduced in 2014, HBM technology has progressed from HBM, HBM2, and HBM2E to HBM3 and HBM3e through iterative innovation.

According to research by TrendForce, the mainstream HBM in the market in 2023 is HBM2e. This includes specifications used in NVIDIA A100/A800, AMD MI200, and most CSPs’ self-developed acceleration chips. To meet the evolving demands of AI accelerator chips, various manufacturers are planning to launch new products like HBM3e in 2024, expecting HBM3 and HBM3e to become the market norm.

On HBM3e progress, Micron provided its 8hi (24GB) samples to NVIDIA by the end of July 2023, SK hynix in mid-August, and Samsung in early October.

As for the higher-spec HBM4, TrendForce expects its potential launch in 2026. With the push for higher computational performance, HBM4 is set to expand from the current 12-layer (12hi) to 16-layer (16hi) stacks, spurring demand for new hybrid bonding techniques. HBM4 12hi products are set for a 2026 launch, with 16hi models following in 2027.

Meeting Demand, Manufacturers Actively Expand HBM Production

As companies like NVIDIA and AMD continue to introduce high-performance GPU products, the three major manufacturers are actively planning the mass production of HBM with corresponding specifications.

Previously, media reports highlighted Samsung’s efforts to expand HBM production capacity by acquiring certain buildings and equipment within Samsung Display’s Cheonan facility.

Samsung plans to establish a new packaging line at the Cheonan plant dedicated to large-scale HBM production. The company has already invested KRW 10.5 billion to acquire the aforementioned assets and equipment, with an additional KRW 700 billion to KRW 1 trillion planned for new packaging lines.

Micron Technology’s Taichung Fab 4 in Taiwan was officially inaugurated in early November 2023. Micron stated that Taichung Fab 4 would integrate advanced probing and packaging testing functions to mass-produce HBM3e and other products, thereby meeting the increasing demand for various applications such as artificial intelligence, data centers, edge computing, and the cloud. The company plans to start shipping HBM3e in early 2024.

In its latest financial report, SK Hynix stated that in the DRAM sector in 2023, its main products DDR5 DRAM and HBM3 experienced revenue growth of over fourfold and fivefold, respectively, compared to the previous year.

At the same time, in response to growing demand for high-performance DRAM, SK Hynix will proceed with mass production of HBM3e for AI applications and with research and development of HBM4.


(Photo credit: SK Hynix)

