HBM4


2024-08-28

[News] SK hynix Reportedly to Tape-out HBM4 in October, Paving the Way for NVIDIA’s Rubin

In mid-August, Samsung was said to be accelerating its progress on next-gen HBM, aiming to tape out HBM4 by the end of this year. Now it seems SK hynix has maintained its competitive edge, as the company aims to tape out HBM4 in October, with the product set to power NVIDIA’s Rubin AI chips, according to reports by Wccftech and ZDNet.

In addition, the reports note that SK hynix also plans to tape out HBM4 for AMD’s AI chips, which is expected to take place a few months later.

To further prepare for the strong demand from the AI chip giants’ upcoming product launches, SK hynix is assembling development teams to supply HBM4 to NVIDIA and AMD, according to Wccftech and ZDNet.

Per SK hynix’s product roadmap, the company plans to launch 12-layer stacked HBM4 in the second half of 2025 and a 16-layer version in 2026. NVIDIA’s Rubin series, planned for 2026, is expected to adopt HBM4 12Hi with 8 clusters per GPU.

SK hynix is the major HBM3e supplier for NVIDIA’s AI chips, having taken the lead by starting shipments a few months ago, followed by Micron. Samsung’s HBM3, on the other hand, was cleared by NVIDIA in July, while its HBM3e is still working to pass NVIDIA’s qualification.

According to the reports, the introduction of HBM4 represents another major milestone for SK hynix, as the product will offer the fastest DRAM yet, with exceptional power efficiency and higher bandwidth.

HBM4 will feature double the channel width of HBM3e, offering 2048 bits versus 1024 bits. Moreover, it supports stacking 16 DRAM dies, up from 12 in HBM3e, with options for 24Gb and 32Gb layers. This advancement will enable a capacity of 64GB per stack, compared to 32GB with HBM3e, the reports suggest.
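
As a quick sanity check on the figures cited above, the short sketch below works out how the reported 64GB per-stack capacity follows from the die count and die density, and what the doubled interface width means in bytes per transfer. It is only a back-of-the-envelope illustration using the numbers in this report (a 2048-bit versus 1024-bit interface, 16-high stacks, 32Gb dies); the helper function is our own naming, and final HBM4 configurations will depend on the JEDEC specification and vendor roadmaps.

```python
# Back-of-the-envelope check of the HBM4 figures cited above.
# Assumptions (from the report): 2048-bit interface, 16-high stacks, 32 Gb dies.

def stack_capacity_gb(die_density_gbit: int, dies_per_stack: int) -> float:
    """Per-stack capacity in GB: die density (Gbit) x dies, divided by 8 bits/byte."""
    return die_density_gbit * dies_per_stack / 8

print(stack_capacity_gb(32, 16))   # 64.0 -> 64GB per HBM4 stack, as reported

# The interface doubles from 1024 bits (HBM3e) to 2048 bits per stack,
# i.e. 256 bytes moved per transfer instead of 128 at the same data rate.
print(1024 // 8, 2048 // 8)        # 128 256
```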

On August 19, SK hynix showcased its ambition to secure leadership in HBM, claiming that the company is developing a product with performance up to 30 times higher than current HBM.

(Photo credit: SK hynix)

Please note that this article cites information from ZDNet and Wccftech.

2024-08-23

[News] SK Hynix President to Join Semicon Taiwan, Bolstering HBM Partnership with TSMC and NVIDIA

According to a report from the Commercial Times, SK hynix is expected to announce a plan for closer collaboration with TSMC and NVIDIA during the Semicon Taiwan exhibition in September, likely focusing on the development of next-generation HBM. This partnership is expected to further strengthen their leadership in the supply of critical components for AI servers.

Semicon Taiwan will be held from September 4 to 6, and sources cited by the same report indicate that SK hynix President Justin Kim will attend the event and deliver a keynote speech for the first time.

Upon arriving in Taiwan, Justin Kim is expected to meet with TSMC executives. The report, citing rumors, suggests that NVIDIA CEO Jensen Huang might also join the meeting, further strengthening the alliance among the tech giants.

The core of this collaboration will revolve around HBM technology. In the past, SK hynix used its own processes to manufacture base dies up to HBM3e (the fifth-generation HBM).

However, industry sources cited by the report reveal that SK hynix will adopt TSMC’s logic process to manufacture the base die starting from HBM4, which would allow the memory giant to customize products for its clients in terms of performance and efficiency.

Industry sources cited by the report also indicate that SK hynix and TSMC have agreed to collaborate on the development and production of HBM4, scheduled for mass production in 2026.

This collaboration will reportedly involve manufacturing HBM4 interface chips using 12FFC+ (12nm class) and 5nm processes to achieve smaller interconnect spacing and enhance memory performance for AI and high-performance computing (HPC) processors.

Per SK hynix’s product roadmap, the company plans to launch a 12-layer stacked HBM4 in the second half of 2025 and 16-layer in 2026. TSMC, on the other hand, is also working to strengthen and expand its CoWoS-L and CoWoS-R packaging capacity to support the large-scale production of HBM4.

SK hynix has been the major supplier of HBM for NVIDIA’s AI GPUs, and the upcoming Rubin series, planned for 2026, is expected to adopt HBM4 12Hi with 8 clusters per GPU. The partnership between SK hynix, TSMC and NVIDIA is therefore expected to expand their influence and widen the gap with Samsung.

(Photo credit: SK hynix)

Please note that this article cites information from Commercial Times.

2024-08-23

[News] GUC’s HBM4 IP Ready, But CSP Adoption Timing Unclear

HBM4, the sixth generation of HBM, is poised to become the key to breakthroughs in computing power for next-generation CSPs (Cloud Service Providers). According to a report from Commercial Times citing Global Unichip Corp. (GUC), the company’s semiconductor IP (Intellectual Property) to support HBM4 development is already prepared and awaiting CSPs to advance their manufacturing processes.

GUC pointed out that if future clients need to integrate general-purpose HBM4 into ASICs (Application-Specific Integrated Circuits), GUC can provide assistance.

GUC further emphasized that its IP is ready for HBM4 development, waiting for CSPs to advance their manufacturing processes. Currently, the ASICs being mass-produced by CSPs still use HBM2 or HBM2e, while HBM3 is in the R&D stage.

The company candidly acknowledged that it has no role to play at the moment and needs to wait for CSPs, with cost considerations in mind, to adopt HBM4 on a large scale. When that time comes, GUC expects to assist CSPs in designing their solutions.

Currently, SK hynix has the technological capability for the general-purpose base die used in HBM4. However, when moving to more advanced processes like 5nm or beyond, external design service providers will be required.

Industry sources cited by Commercial Times believe that the pace of advancements in computing power is accelerating.

For instance, Google’s sixth-generation TPU, expected to be launched by the end of this year, is already based on TSMC’s 4nm process and designed on the Arm architecture.

Similarly, Meta’s upcoming MTIAv2 is built on TSMC’s 5nm process. The trend toward developing in-house chips is characterized by lower power consumption and larger memory capacities.

(Photo credit: GUC)

Please note that this article cites information from Commercial Times.

2024-08-23

[News] HBM Technological Battle to be on the Next Level

Per a report by BusinessKorea, SK hynix Vice President Ryu Seong-su announced the company’s strategic plan for the HBM field during the SK Group Icheon Forum 2024 held on August 19. SK hynix plans to develop a product that boasts dozens of times the performance of existing HBM technologies.

The report indicates that SK hynix aims to develop a product with performance 20 to 30 times higher than current HBM offerings, achieving product differentiation.

During the forum, Ryu Seong-su emphasized that SK hynix will concentrate on leveraging advanced execution capabilities to provide memory solutions tailored for the AI (Artificial Intelligence) sector to meet the demands of the mass market.

  • Seven Giants Proposed Customized Demands for HBM

Amid AI advancements, the demand for high-performance HBM has been on the rise, making it a hotspot among global high-tech companies.

According to Ryu Seong-su, seven of the world’s tech giants, namely Apple, Microsoft, Google Alphabet, Amazon, NVIDIA, Meta, and Tesla, have all engaged with SK hynix, seeking customized HBM solutions tailored to their specific needs.

Compared to existing HBM products, customized HBM offers clients more options in terms of PPA (Performance, Power, Area), thereby delivering more substantial value.

For example, Samsung believes that the power consumption and area of semiconductors can be largely reduced by stacking HBM memory with custom logic chips in a 3D configuration.

On this trend towards customization in the HBM sector, TrendForce predicted that the HBM industry will become more customization-oriented in the future. Unlike other DRAM products, HBM will increasingly break away from the standard DRAM framework in terms of pricing and design, turning to more specialized production.

SK hynix CEO Kwak Noh-Jung also believes that as HBM4 continues to advance, the demand for customization will grow, likely becoming a global trend and shifting the business toward a more contract-based model. This shift is also expected to mitigate the risk of oversupply in the memory market.

In fact, the HBM market is gradually evolving from a “general-purpose” to a “customization-oriented” market with the rise of AI. Going forward, as breakthroughs are made in speed, capacity, power consumption, and cost, HBM is poised to play an even more critical role in the AI sector.

  • Customized HBM to Become a Reality in the HBM4 Generation

Currently, buyers have already begun making customized requests for HBM4, and both SK hynix and Samsung Electronics have developed strategies to address these demands.

SK hynix has been in collaboration with TSMC to develop the sixth generation of HBM products, known as HBM4, which is expected to enter production in 2026.

Unlike previous generations, including the fifth-generation HBM3e, which were based on SK hynix’s own process technology, HBM4 will leverage TSMC’s advanced logic process, which is anticipated to significantly enhance the performance of HBM products.

Additionally, adopting ultra-fine processing technology for the base die could enable the addition of more features.

SK hynix has stated that with these two major technological upgrades, the company plans to produce HBM products that excel in performance and efficiency, thereby meeting the demand for customized HBM solutions.

Ryu Seong-su believes that as customized products enjoy burgeoning growth, the memory industry is approaching a critical point of paradigm shift, and SK hynix will continue to take advantage of the opportunities presented by these changes to advance its memory business.

Meanwhile, Samsung Electronics, as a leading IDM semiconductor company with capabilities in wafer foundry, memory, and packaging, is also actively promoting customized HBM AI solutions.

In July 2024, Choi Jang-seok, head of the new business planning group at Samsung Electronics’ memory division, stated at the “Samsung Foundry Forum” that the company intends to develop a variety of customized HBM memory products for the HBM4 generation and announced collaborations with major clients like AMD and Apple.

Choi Jang-seok pointed out that the HBM architecture is undergoing profound changes, with many customers shifting from traditional general-purpose HBM to customized products. Samsung Electronics believes that customized HBM will become a reality in the HBM4 generation.

(Photo credit: Micron)

Please note that this article cites information from BusinessKorea and WeChat account DRAMeXchange.

2024-08-19

[News] Samsung Reportedly to Tape out HBM4 with 1c DRAM by Year-end

After forming a new HBM development team within its Device Solutions (DS) Division around July, memory giant Samsung is now said to have made progress on HBM4, aiming to tape out the product by the end of this year, a report by TheElec notes. The move is also regarded as laying the foundation for the mass production of its 12-layer HBM4 product by the end of 2025, according to the report.

The report suggests that as the span between tape-out and finalized test products might take three to four months, Samsung’s HBM4 test products are expected to be released next year at the earliest. Afterwards, Samsung would continue to make improvements until sending samples to key customers.

Samsung, however, declined to comment on its roadmap, according to TheElec.

The report by TheElec further notes that starting from HBM4, Samsung plans to mass-produce the logic die of HBM on its 4nm foundry process. As for the memory chip, Samsung is said to adopt its 10nm-class sixth-generation (1c) DRAM.

Samsung’s major HBM competitor, SK hynix, is reported to enter mass production of its 12-layer HBM4 in the second half of 2025, the report indicates. The company plans to mass-produce the logic die of HBM with TSMC’s 5nm and 12nm processes. As for the memory chip, it is still weighing 1b DRAM against 1c DRAM.

As Samsung plans to use 1c DRAM in HBM4 core chips, related investments are expected to follow. TrendForce reports that Samsung’s P4L facility will be the key site for expanding memory capacity from 2025 onward, beginning with NAND production. Equipment installation for DRAM is expected to begin in mid-2025, with mass production of 1c DRAM slated to commence in 2026.

Samsung’s fifth-generation HBM, HBM3e, is still working through the certification process with NVIDIA. TrendForce notes that as the company is eager to gain a higher HBM market share from SK hynix, its 1alpha (1α) capacity has been reserved for HBM3e. TrendForce believes that Samsung is going to be a very important supplier in the HBM category.

(Photo credit: Samsung)

Please note that this article cites information from TheElec.