HBM3e


2024-06-06

[News] Micron Reportedly Targets 25% HBM Market Share by 2025

Driven by the rapid growth in demand for high-bandwidth memory (HBM) fueled by artificial intelligence (AI), memory manufacturers are vying for market opportunities. According to a report from CNA, Micron has announced its target to achieve a 20% to 25% market share in HBM by 2025.

Targeting the swiftly growing demand for HBM, along with its better product pricing and profitability, the three major memory manufacturers—SK Hynix, Micron, and Samsung—are all aggressively advancing in this area. Currently, SK Hynix holds the leading position, but Micron is also making significant progress.

Micron stated that its progress on HBM3e can be attributed to the company's advanced packaging and design capabilities, along with the integration of its own processes. The company is also developing next-generation HBM4 products.

Regarding Micron's global capacity expansion plan, the memory heavyweight has been considering Hiroshima, Japan, as one of the potential sites. The company's HBM capacity for fiscal year 2024 has already sold out and is expected to contribute hundreds of millions of dollars in revenue.

As for its ambition regarding HBM, Micron stated that it aims to capture 20-25% market share by 2025.

Notably, per a previous report from the South Korean newspaper "Korea Joongang Daily," following Micron's initiation of mass production of the latest high-bandwidth memory HBM3e in February 2024, the company has recently secured an order from NVIDIA for the H200 AI GPU. It is understood that NVIDIA's upcoming H200 processor will utilize the latest HBM3e, which is more powerful than the HBM3 used in the H100 processor.

During the press conference on 5 June, Micron announced the launch of its GDDR7 graphics memory, which is currently being sampled. Utilizing Micron’s 1-beta technology, GDDR7 offers more than a 50% improvement in energy efficiency compared to the previous generation GDDR6, effectively addressing thermal issues and extending battery life.

Micron highlighted that the GDDR7 system bandwidth is increased to 1.5TB per second, 60% higher than GDDR6. Its applications range from AI and gaming to high-performance computing. The product is expected to start shipping in the second half of this year.
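As a quick sanity check on the reported figures, the claimed 60% uplift over GDDR6 implies a baseline of roughly 0.94 TB/s of system bandwidth. This is an illustrative back-of-envelope calculation, not a number quoted by Micron:

```python
# Back-of-envelope check of the reported GDDR7 figures.
# The 1.5 TB/s and 60% numbers come from the article; the implied
# GDDR6 baseline below is derived, not quoted by Micron.
gddr7_bw_tb_s = 1.5          # reported GDDR7 system bandwidth (TB/s)
uplift = 0.60                # reported improvement over GDDR6
implied_gddr6_bw = gddr7_bw_tb_s / (1 + uplift)
print(f"Implied GDDR6 baseline: {implied_gddr6_bw:.2f} TB/s")
```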


(Photo credit: Micron)

Please note that this article cites information from CNA and Korea Joongang Daily.

2024-06-05

[News] Jensen Huang Confirms NVIDIA Approaches Certification of Samsung’s HBM Chips

NVIDIA CEO Jensen Huang revealed that Samsung's High Bandwidth Memory (HBM) is still undergoing the certification process but is one step away from beginning supply.

According to a report from Bloomberg on June 4th, Huang told reporters during a briefing at COMPUTEX that NVIDIA is evaluating HBM provided by both Samsung and Micron Technology. Huang mentioned that some engineering work still needs to be completed, expressing the desire for it to have been finished already.

As per Huang, though Samsung hasn't failed any qualification tests, its HBM product requires additional engineering work. When asked about Reuters' previous report concerning overheating and power consumption issues with Samsung's HBM, Huang simply remarked, "there's no story there."

Previously, Reuters cited sources on May 24th, reporting that overheating and power consumption issues would affect Samsung’s fourth-generation HBM chip, “HBM3,” as well as the fifth-generation “HBM3e” planned for release by Samsung and its competitors this year.

Per the same report from Reuters, Samsung has been attempting to pass NVIDIA's tests for HBM3 and HBM3e since last year, while a test of Samsung's 8-layer and 12-layer HBM3e chips was said to have failed in April.

In an official statement, Samsung noted that it is in the process of optimizing products through close collaboration with customers, with testing proceeding smoothly and as planned. The company said that HBM is a customized memory product, which requires optimization processes in tandem with customers’ needs.

Currently, SK Hynix is the primary supplier of NVIDIA's HBM3 and HBM3e. According to an earlier TrendForce analysis, NVIDIA's upcoming B100 or H200 models will incorporate advanced HBM3e, while the current HBM3 supply for NVIDIA's H100 solution is primarily met by SK Hynix.

According to a report from the Financial Times in May, SK hynix has successfully reduced the time needed for mass production of HBM3e chips by 50% and is close to achieving its target yield of 80%.


(Photo credit: Samsung)

Please note that this article cites information from Bloomberg, Reuters and Financial Times.

2024-05-29

[News] Rumors Hint at Samsung Losing HBM Edge Due to Talent Shift to SK Hynix; SK Hynix Denies the Claims

Samsung’s HBM, according to a report from TechNews, has yet to pass certification by GPU giant NVIDIA, causing it to fall behind its competitor SK Hynix. As a result, the head of Samsung’s semiconductor division was replaced. Although Samsung denies any issues with their HBM and emphasizes close collaboration with partners, TechNews, citing market sources, indicates that Samsung has indeed suffered a setback.

Samsung invested early in HBM development and collaborated with NVIDIA on HBM and HBM2, but sales were modest. Eventually, the HBM team, according to TechNews’ report, moved to SK Hynix to develop HBM products. Unexpectedly, the surge in generative AI led to a sharp increase in HBM demand, and SK Hynix, benefitting from the trend, seized the opportunity with the help of the team.

Yet, in response to the rumors about changes in the HBM team, SK Hynix has denied the claims, stating that it did not develop HBM with the help of a Samsung team and that no Samsung HBM team transferred to the company. SK Hynix further emphasized that its HBM was developed solely by its own engineers.

Samsung's misfortune is evident; despite years of effort, it faced setbacks just as the market took off. Samsung must now find alternative ways to catch up. The market still needs Samsung, as noted by Wallace C. Kou, President of memory IC design giant Silicon Motion.

Kou reportedly stated that Samsung remains the largest memory producer, and as NVIDIA faces a supply shortage for AI chips, the GPU giant is keen to cooperate with more suppliers. Therefore, it’s only a matter of time before Samsung supplies HBM to NVIDIA.

Furthermore, Samsung also indicated in a recent statement that it is conducting HBM tests with multiple partners to ensure quality and reliability.

In the statement, Samsung indicates that it is in the process of optimizing their products through close collaboration with its customers, with testing proceeding smoothly and as planned. As HBM is a customized memory product, it requires optimization processes in line with customers’ needs.

Samsung also states that it is currently partnering closely with various companies to continuously test technology and performance, and to thoroughly verify the quality and performance of its HBM.

On the other hand, NVIDIA has various GPUs adopting HBM3e, including H200, B200, B100, and GB200. Although all of them require HBM3e stacking, their power consumption and heat dissipation requirements differ. Samsung’s HBM3e may be more suitable for H200, B200, and AMD Instinct MI350X.


(Photo credit: SK Hynix)

Please note that this article cites information from TechNews.

2024-05-24

[News] Reasons for Samsung’s HBM Chips Failing Nvidia Tests Revealed, Reportedly Due to Heat and Power Consumption Issues

Samsung's latest high bandwidth memory (HBM) chips have reportedly failed Nvidia's tests, and the reasons have now been revealed for the first time. According to the latest report by Reuters, the failure was said to be due to issues with heat and power consumption.

Citing sources familiar with the matter, Reuters noted that the issues may affect Samsung's HBM3 chips as well as its next-generation HBM3e chips, which the company and its competitors, SK hynix and Micron, plan to launch later this year.

In response to concerns raised over heat and power consumption in its HBM chips, Samsung stated that its HBM testing is proceeding as planned.

In an official statement, Samsung noted that it is in the process of optimizing products through close collaboration with customers, with testing proceeding smoothly and as planned. The company said that HBM is a customized memory product, which requires optimization processes in tandem with customers' needs.

According to Samsung, the tech giant is currently partnering closely with various companies to continuously test technology and performance, and to thoroughly verify the quality and performance of HBM.

Nvidia, on the other hand, declined to comment.

As Nvidia currently dominates the global GPU market for AI applications with an 80% share, meeting Nvidia's standards would doubtlessly be critical for HBM manufacturers.

Reuters reported that Samsung has been attempting to pass Nvidia's tests for HBM3 and HBM3e since last year, while a test of Samsung's 8-layer and 12-layer HBM3e chips was said to have failed in April.

According to TrendForce’s analysis earlier, NVIDIA’s upcoming B100 or H200 models will incorporate advanced HBM3e, while the current HBM3 supply for NVIDIA’s H100 solution is primarily met by SK hynix. SK hynix has been providing HBM3 chips to Nvidia since 2022, Reuters noted.

According to a report from the Financial Times in May, SK hynix has successfully reduced the time needed for mass production of HBM3e chips by 50% and is close to achieving its target yield of 80%.

Another US memory giant, Micron, stated in February that its HBM3e consumes 30% less power than its competitors, meeting the demands of generative AI applications. Moreover, the company’s 24GB 8H HBM3e will be part of NVIDIA’s H200 Tensor Core GPUs, breaking the previous exclusivity of SK hynix as the sole supplier for the H100.

Considering major competitors' progress on HBM3e, if Samsung fails to meet Nvidia's requirements, the industry and investors may grow more concerned about whether the Korean tech heavyweight will fall further behind its rivals in the HBM market.

Please note that this article cites information from Reuters and Financial Times.

(Photo credit: Samsung)

2024-05-24

[News] SK Hynix Reveals Progress on HBM3e, Achieving Nearly 80% Yield

SK hynix has disclosed yield details regarding the company's 5th generation High Bandwidth Memory (HBM), HBM3e, for the first time. According to a report from the Financial Times, citing Kwon Jae-soon, the head of yield at SK hynix, the memory giant has successfully reduced the time needed for mass production of HBM3e chips by 50% and is close to achieving its target yield of 80%.

This is better than the industry’s previous speculation, which estimated the yield of SK Hynix’s HBM3e to be between 60% and 70%, according to a report by Business Korea.

According to TrendForce’s analysis earlier, NVIDIA’s upcoming B100 or H200 models will incorporate advanced HBM3e, while the current HBM3 supply for NVIDIA’s H100 solution is primarily met by SK hynix, leading to a supply shortfall in meeting burgeoning AI market demands.

The challenge, however, is the supply bottleneck caused by both CoWoS packaging constraints and the inherently long production cycle of HBM—extending the timeline from wafer initiation to the final product beyond two quarters.

The report by Business Korea noted that HBM manufacturing involves stacking multiple DRAMs vertically, which presents greater process complexity compared to standard DRAM. Specifically, the yield of the through-silicon via (TSV) process, critical to HBM3e, has been low, ranging from 40% to 60%, posing a significant challenge for improvement.
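The yield pressure from stacking can be sketched with a simple compounding model. This is our illustration, not a figure from Business Korea: it assumes each bonded layer succeeds independently, and the 95% per-layer rate is hypothetical, chosen only to show the effect.

```python
# Illustrative model of why stacking compresses yield: if each layer's
# TSV bonding succeeds independently with probability p, an n-high stack
# survives with probability p**n. Both the independence assumption and
# the 0.95 per-layer figure are hypothetical, used only for illustration.
def stack_yield(per_layer_yield: float, layers: int) -> float:
    return per_layer_yield ** layers

for layers in (8, 12, 16):
    print(f"{layers}-high stack at 95% per layer: "
          f"{stack_yield(0.95, layers):.0%}")
```

Even a high per-layer success rate compounds quickly at 12- and 16-high stacks, which illustrates how an otherwise mature process can end up in the low yield range the report describes.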

In terms of SK hynix’s future roadmap for HBM, CEO Kwak Noh-Jung announced on May 2nd that the company’s HBM capacity for 2024 and 2025 has almost been fully sold out. According to Business Korea, SK hynix commenced delivery of 8-layer HBM3e products in March and plans to supply 12-layer HBM3e products in the third quarter of this year. The 12-layer HBM4 (sixth-generation) is scheduled for next year, with the 16-layer version expected to enter production by 2026.



Please note that this article cites information from Financial Times and Business Korea.

(Photo credit: SK hynix)
