HBM3e


2024-05-20

[News] CoWoS Production Capacity Reportedly Falls Short of GPU Demand

The world’s four major CSPs (Cloud Service Providers) – Microsoft, Google, Amazon, and META – are continuously expanding their AI infrastructure, with their combined capital expenditures projected to reach USD 170 billion this year. According to industry sources cited in a report from Commercial Times, the surge in demand for AI chips, combined with the growing area of silicon interposers, is reducing the number of chips that can be produced from a single 12-inch wafer. As a result, TSMC’s CoWoS (Chip on Wafer on Substrate) production capacity is expected to remain in short supply.

Regarding CoWoS, according to TrendForce, the introduction of NVIDIA’s B series, including the GB200, B100, and B200, is expected to consume more CoWoS production capacity. TSMC has also raised its CoWoS capacity target for 2024, with estimated monthly capacity approaching 40,000 wafers by year-end, an increase of over 150% from total 2023 capacity. Total capacity could nearly double again in 2025.

However, with NVIDIA releasing the B100 and B200, the interposer area used by a single chip will be larger than before, meaning fewer interposers can be obtained from each 12-inch wafer, leaving CoWoS production capacity unable to meet GPU demand. Meanwhile, the number of HBM units installed per GPU is also multiplying.
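The effect of larger interposers on per-wafer output can be sketched with the standard die-per-wafer approximation (a gross-area term minus an edge-loss correction). The interposer areas below are illustrative assumptions, not figures from the report:

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """Standard die-per-wafer approximation: usable wafer area divided by
    die area, minus an edge-loss term proportional to the circumference."""
    radius = wafer_diameter_mm / 2
    return int(math.pi * radius**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

# Illustrative interposer areas (assumed for demonstration only):
for area in (1700, 2500, 3300):  # mm^2
    print(f"{area} mm^2 interposer -> ~{dies_per_wafer(area)} per 300 mm wafer")
```

Roughly doubling the interposer area cuts the per-wafer yield by well over half, which is why larger B-series interposers tighten CoWoS supply even as total wafer capacity grows.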

Moreover, in CoWoS packaging, multiple HBM stacks are placed around the GPU, and HBM itself is considered another bottleneck. Industry sources indicate that HBM is a significant challenge, as the number of EUV (Extreme Ultraviolet Lithography) layers is gradually increasing. For example, SK Hynix, which holds the leading market share in HBM, applied a single EUV layer during its 1α production phase. Starting this year, the company is transitioning to 1β, potentially increasing its use of EUV three- to fourfold.

In addition to the increased technical difficulty, the number of DRAM dies stacked within HBM has also grown with each generation. HBM2 stacks 4 to 8 DRAM dies, HBM3/3e increases this to 8 to 12, and HBM4 will further raise the count to 16.

Given these dual bottlenecks, overcoming these challenges in the short term remains difficult. Competitors are also proposing solutions; for instance, Intel is using rectangular glass substrates to replace 12-inch wafer interposers. However, this approach requires significant preparation, time, and research and development investment, and breakthroughs from industry players are still awaited.


(Photo credit: NVIDIA)

Please note that this article cites information from Commercial Times.

2024-05-06

[News] SK Hynix Reportedly Raises Prices Again, with DRAM Products to Increase by 15-20%

The surge in memory product prices continues, driven by the AI wave revitalizing the memory market. According to a report from Liberty Times Net, prices of high-performance DRAM are also on the rise. Industry sources cited in the same report indicate that SK Hynix’s LPDDR5, LPDDR4, DDR5, and other DRAM products will see a comprehensive price hike of 15-20%.

A report from Chinese media outlet Wallstreetcn, citing industry sources, notes that SK Hynix’s DRAM product prices have been rising steadily month by month since the fourth quarter of last year, with cumulative increases ranging from approximately 60% to 100%. This upward trend in memory prices is expected to continue into the second half of the year.

On April 25th, SK Hynix announced its first-quarter financial results, with revenue soaring to KRW 12.42 trillion, a 144.3% increase compared to the same period last year. Operating profit reached KRW 2.88 trillion, far exceeding market expectations of KRW 1.8 trillion and marking the second-highest figure ever recorded for the period.

Contrasting with the loss of KRW 3.4 trillion in the same period last year, this performance represents a significant turnaround for SK Hynix, signaling a shift from a prolonged period of stagnation to comprehensive recovery.

Looking ahead, SK Hynix expressed optimism, stating that the growing demand for memory driven by AI and the recovery of demand for general DRAM products starting from the second half of this year will contribute to a stable growth trend in the memory market for the rest of the year.

Industry sources cited by the report predict that as demand for high-end products like HBM increases, and since such products consume more wafer capacity than general DRAM, growing output of high-end products will lead to a relative decrease in the supply of general DRAM. Consequently, both suppliers and clients are expected to deplete their inventories.

In line with the trend of growing memory demand for AI applications, SK Hynix has decided to ramp up the production of its HBM3e products, which began global production in March this year, and expand its customer base. Additionally, the company plans to launch its fifth-generation 10-nanometer class (1b) 32Gb DDR5 DRAM products within this year, aiming to strengthen its market leadership in high-capacity DRAM products for servers.


(Photo credit: SK Hynix)

Please note that this article cites information from Liberty Times Net and Wallstreetcn.

2024-05-02

[News] HBM Craze Continues! SK Hynix Reports Sold Out for this Year, Next Year’s HBM Capacity Nearly Fully Booked

SK Hynix CEO Kwak Noh-Jung announced on May 2nd that the company’s HBM capacity for this year has already been fully sold out, and next year’s capacity is also nearly sold out. From a technological perspective, SK Hynix plans to provide samples of the world’s highest-performance 12-layer stacked HBM3e products in May this year and is preparing for mass production starting in the third quarter.

SK Hynix just held a press conference in South Korea, where they disclosed information regarding their AI memory technology capabilities, market status, and investment plans for future major production sites in Cheongju and Yongin, South Korea, as well as in the United States.

Kwak Noh-Jung pointed out that although AI is currently primarily centered around data centers, it is expected to rapidly expand to on-device AI applications in smartphones, PCs, cars, and other end devices in the future. Consequently, the demand for memory specialized for AI, characterized by “ultra-fast, high-capacity and low-power,” is expected to skyrocket.

Kwak Noh-Jung stated that SK Hynix possesses industry-leading technological capabilities in various product areas such as HBM, TSV-based high-capacity DRAM, and high-performance eSSD. In the future, SK Hynix looks to provide globally top-tier memory solutions tailored to customers’ needs through strategic partnerships with global collaborators.

Looking ahead to AI memory, SK Hynix President Justin Kim pointed out that as we enter the era of AI, the global volume of data generated is expected to grow from 15 Zettabytes (ZB) in 2014 to 660 ZB by 2030. Simultaneously, the proportion of revenue from AI memory is also expected to increase significantly. Memory technologies oriented towards AI, such as HBM and high-capacity DRAM modules, are projected to account for about 5% of the entire memory market in 2023 (in terms of revenue), with expectations to reach 61% by 2028.

Additionally, the company will advance collaboration with top-tier partners in the system semiconductor and foundry fields globally, aiming to timely develop and provide the best products.

Regarding SK Hynix’s packaging capabilities, the company highlighted MR-MUF as one of its core packaging technologies. While there are concerns that MR-MUF could become a bottleneck in high-layer stacking, SK Hynix emphasized that this is not the case in practice: the company has already begun mass production of 12-layer stacked HBM3 products using advanced MR-MUF technology.

Reducing chip-stacking pressure to 6% of previous levels has not only shortened process time but also increased production efficiency by up to 4 times, while enhancing heat dissipation by 45%. Moreover, SK Hynix’s latest MR-MUF technology utilizes new protective materials, delivering a further 10% improvement in heat dissipation. The advanced MR-MUF process also employs high-temperature, low-pressure methods for warpage control, which the company describes as the most suitable solution for high-layer stacking.

Furthermore, SK Hynix plans to adopt advanced MR-MUF technology in HBM4 to achieve 16-layer stacking and is actively researching hybrid bonding technology. Lastly, in terms of investments in the United States, SK Hynix has confirmed the construction of an advanced packaging production facility for AI memory in Indiana. This facility is scheduled to commence mass production of the next-generation HBM products in the second half of 2028.


(Photo credit: SK Hynix)

2024-04-26

[News] Samsung Reportedly Signs USD 3 Billion HBM3e Deal with AMD

According to a report from Korean media outlet viva100, Samsung has signed a new USD 3 billion agreement with processor giant AMD to supply HBM3e 12-layer DRAM for use in the Instinct MI350 series AI chips. Reportedly, Samsung has also agreed to purchase AMD GPUs in exchange for HBM products, although details regarding the specific products and quantities involved remain unclear.

Earlier market reports indicated that AMD plans to launch the Instinct MI350 series in the second half of the year as an upgraded version of the Instinct MI300 series. The MI350 series is reportedly expected to adopt TSMC’s 4-nanometer process, delivering improved computational performance with lower power consumption. The inclusion of 12-layer stacked HBM3e memory will enhance both bandwidth and capacity.

In October 2023, at Samsung Memory Tech Day 2023, Samsung announced the launch of a new HBM3e codenamed “Shinebolt.” In February of this year, Samsung unveiled the industry’s first HBM3e 12H DRAM, featuring 12 layers and a capacity of 36GB, marking the highest bandwidth and capacity HBM product to date. Samsung has provided samples and plans to commence mass production in the second half of the year.

Samsung’s HBM3e 12H DRAM offers up to 1280GB/s bandwidth and 36GB capacity, representing a 50% increase compared to the previous generation of eight-layer stacked memory. Advanced Thermal Compression Non-Conductive Film (TC NCF) technology enables the 12-layer stack to meet HBM packaging requirements while maintaining chip height consistency with eight-layer chips.
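The quoted 1280GB/s figure is consistent with basic HBM bandwidth arithmetic: per-stack bandwidth equals the interface width (1024 bits per HBM stack) times the per-pin data rate, divided by 8 bits per byte. A quick sanity check (the 10 Gb/s per-pin rate is inferred from the quoted bandwidth, not stated in the report):

```python
# HBM bandwidth arithmetic: bandwidth (GB/s) = bus width (bits)
# x per-pin rate (Gb/s) / 8 bits per byte.
BUS_WIDTH_BITS = 1024  # standard HBM interface width per stack

def stack_bandwidth_gbps(pin_rate_gbps: float) -> float:
    """Per-stack bandwidth in GB/s for a given per-pin data rate."""
    return BUS_WIDTH_BITS * pin_rate_gbps / 8

print(stack_bandwidth_gbps(10.0))  # 1280.0 GB/s, matching the figure above
print(stack_bandwidth_gbps(6.4))   # 819.2 GB/s, a typical HBM3 rate
```

Because the interface width is fixed per stack, generational bandwidth gains come from faster per-pin signaling, while capacity gains come from stacking more DRAM dies.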

Additionally, optimizing the size of chip bumps improves HBM thermal performance, with smaller bumps located in signal transmission areas and larger bumps in heat dissipation areas, contributing to higher product yields.

Adopting HBM3e 12-layer DRAM instead of HBM3e 8-layer DRAM reportedly yields an average speed improvement of 34% in AI applications and allows the number of concurrent users of inference services to increase by more than 11.5 times.

Regarding this matter, industry sources cited by a report from TechNews have indicated that the deal is separate from negotiations between AMD and Samsung Foundry over wafer production. AMD plans to assign a portion of its new CPUs/GPUs to Samsung for manufacturing, which is unrelated to this specific transaction.


(Photo credit: Samsung)

Please note that this article cites information from viva100 and TechNews.

2024-03-28

[News] Memory Manufacturers Vie for HBM3e Market

Recently, South Korean media Alphabiz reported that Samsung may exclusively supply 12-layer HBM3e to NVIDIA.

The report indicates that NVIDIA is set to commence large-scale purchases of Samsung Electronics’ 12-layer HBM3e as early as September, with Samsung serving as the exclusive supplier of the 12-layer product.

According to the Alphabiz report, NVIDIA CEO Jensen Huang left his signature “Jensen Approved” on a physical 12-layer HBM3e product from Samsung Electronics at GTC 2024, which seems to suggest NVIDIA’s recognition of Samsung’s HBM3e product.

HBM is characterized by high bandwidth, high capacity, low latency, and low power consumption. With the surge of the artificial intelligence (AI) industry, the acceleration of large-scale AI model applications has driven continuous growth in demand in the high-performance memory market.

According to TrendForce’s data, HBM market value accounted for approximately 8.4% of the overall DRAM industry in 2023, and this percentage is projected to expand to 20.1% by the end of 2024.

Senior Vice President Avril Wu notes that by the end of 2024, the DRAM industry is expected to allocate approximately 250K wafers per month (14% of total capacity) to producing HBM TSV, with estimated annual supply bit growth of around 260%.

HBM3e: Three Major Original Manufacturers Kick off Fierce Rivalry

Following the debut of the world’s first TSV HBM product in 2014, HBM memory technology has now iterated to HBM3e after nearly 10 years of development.

From the perspective of original manufacturers, competition in the HBM3e market primarily revolves around Micron, SK Hynix, and Samsung. These three manufacturers reportedly provided 8-hi (24GB) samples in late July, mid-August, and early October 2023, respectively. This year, they have kicked off fierce competition in the HBM3e market by introducing their latest products.

On February 27th, Samsung announced the launch of its first 12-layer stacked HBM3e DRAM–HBM3e 12H, which marks Samsung’s largest-capacity HBM product to date, boasting a capacity of up to 36GB. Samsung stated that it has begun offering samples of the HBM3e 12H to customers and anticipates starting mass production in the second half of this year.

In early March, Micron announced that it had commenced mass production of its HBM3e solution. The company stated that the NVIDIA H200 Tensor Core GPU will adopt Micron’s 8-layer stacked HBM3e memory with 24GB capacity and shipments are set to begin in the second quarter of 2024.

On March 19th, SK Hynix announced the successful large-scale production of its new ultra-high-performance memory product, HBM3e, designed for AI applications. This marks the world’s first delivery to customers of HBM3e, the highest-performance DRAM currently in existence.

A previous report from TrendForce has indicated that, starting in 2024, the market’s attention will shift from HBM3 to HBM3e, with expectations for a gradual ramp-up in production through the second half of the year, positioning HBM3e as the new mainstream in the HBM market.

TrendForce reports that SK Hynix led the way with its HBM3e validation in the first quarter, closely followed by Micron, which plans to start distributing its HBM3e products toward the end of the first quarter, in alignment with NVIDIA’s planned H200 deployment by the end of the second quarter.

Samsung, slightly behind in sample submissions, is expected to complete its HBM3e validation by the end of the first quarter, with shipments rolling out in the second quarter. With Samsung having already made significant strides in HBM3 and its HBM3e validation expected to be completed soon, the company is poised to significantly narrow the market share gap with SK Hynix by the end of the year, reshaping the competitive dynamics in the HBM market.


(Photo credit: SK Hynix)

Please note that this article cites information from DRAMeXchange.
