HBM4


2024-11-12

[News] Samsung Reportedly Starts Developing Custom HBM4 for Big CSPs, Eyeing Mass Production by End of 2025

As Samsung implied earlier that it plans to collaborate with TSMC on HBM4, the memory giant now appears to be taking a key step forward. A report by South Korean media outlet Maeil Business Newspaper discloses that the company has already begun developing “Custom HBM4,” a next-gen high-bandwidth memory tailored specifically for Microsoft and Meta.

Samsung aims to begin mass production of HBM4 promptly upon completing development by the end of 2025, the report suggests.

Industry sources cited by the report state that Samsung is establishing a dedicated production line for HBM4, which is now in the “pilot production” phase, where small-scale trial manufacturing takes place ahead of mass production.

Citing sources familiar with the situation, the report further notes that Samsung is actively working on HBM4 designed for Microsoft and Meta. Both tech heavyweights have their own AI chips, namely, Microsoft’s Maia 100 and Meta’s Artemis.

As major tech companies make strides in scaling up their own AI data centers, there is strong demand to cut costs associated with chip purchases. Therefore, many of them design and deploy their own AI accelerators while also buying AI chips from NVIDIA and AMD, according to Maeil.

Samsung, with its memory division and an LSI division capable of designing computational chips, is ideally positioned as a top partner for these major tech companies, according to Maeil.

Though the specifics of the HBM4 product that Samsung will supply to these companies remain undisclosed, Samsung did reveal the general specifications in February, according to Maeil.

Its data transmission speed, or bandwidth, reportedly reaches 2 terabytes per second (TB/s), marking a 66% increase over HBM3E, while its capacity has risen 33% to 48 gigabytes (GB), up from 36GB.
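As a rough sanity check on those percentages (a back-of-the-envelope calculation, assuming a commonly cited HBM3E per-stack bandwidth of roughly 1.2 TB/s as the baseline, a figure the report does not state):

\[
\frac{2\ \text{TB/s} - 1.2\ \text{TB/s}}{1.2\ \text{TB/s}} \approx 0.67 \quad (\text{bandwidth}), \qquad
\frac{48\ \text{GB} - 36\ \text{GB}}{36\ \text{GB}} \approx 0.33 \quad (\text{capacity}).
\]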

The report explains that unlike previous memory products, HBM4 is also referred to as “Compute-in-Memory (CIM)” due to its advanced capabilities, which go beyond standard memory functions to include specialized operations aligned with customer requirements.

Up to HBM3, the focus was said to be mainly on managing heat generation and maximizing speed, according to a semiconductor industry official cited by the report. With HBM4, however, the emphasis will shift toward integrating AI computation capabilities (NPU) and specific data processing functions (IP) to meet evolving needs, the report says.


(Photo credit: Samsung)

Please note that this article cites information from Maeil Business Newspaper.
2024-11-04

[News] NVIDIA CEO Jensen Huang Reportedly Asked SK hynix to Expedite HBM4 Supply by 6 Months

While introducing the industry’s first 48GB 16-high HBM3E at SK AI Summit in Seoul today, South Korean memory giant SK hynix has reportedly seen strong demand for its next-gen HBM. According to reports by Reuters and South Korean media outlet ZDNet, NVIDIA CEO Jensen Huang requested SK hynix to accelerate the supply of HBM4 by six months.

The information was disclosed by SK Group Chairman Chey Tae-won earlier today at the SK AI Summit, according to the reports. In October, the company said that it planned to deliver the chips to customers in the second half of 2025, according to the reports.

When asked by ZDNet about HBM4’s accelerated timetable, SK hynix President Kwak Noh-Jung responded by saying “We will give it a try.”

A spokesperson for SK hynix cited by Reuters noted that this new timeline is quicker than their original target, but did not provide additional details.

According to ZDNet, NVIDIA CEO Jensen Huang also made his appearance in a video interview at the Summit, stating that by collaborating with SK hynix, NVIDIA has been able to achieve progress beyond Moore’s Law, and the company will continue to need more of SK hynix’s HBM in the future.

According to the third-quarter financial report released by SK hynix in late October, the company posted record-breaking figures, including revenues of 17.5731 trillion won, an operating profit of 7.03 trillion won (with an operating margin of 40%), and a net profit of 5.7534 trillion won (with a net margin of 33%) for the third quarter of this year.

In particular, HBM sales showed excellent growth, up more than 70% from the previous quarter and more than 330% from the same period last year.

SK hynix is indeed making strides in HBM, as it started mass production of the world’s first 12-layer, 36GB HBM3E product in September. It has also been developing 48GB 16-high HBM3E in a bid to secure technological stability and plans to provide samples to customers early next year, according to the company’s press release.

On the other hand, according to another report by Business Korea, Kim Jae-jun, Vice President of the Memory Business Division, stated in the earnings call that the company is mass-producing and selling both HBM3E 8-stack and 12-stack products and has completed key stages of the quality testing process for a major customer. Though Kim did not specify the identity of the major customer, industry analysts suggest it is likely NVIDIA.

To narrow the technology gap with SK hynix, Samsung is reportedly planning to produce its next-generation HBM4 products in the latter half of next year.


(Photo credit: NVIDIA)

Please note that this article cites information from Reuters, ZDNet, and Business Korea.
2024-11-01

[News] Samsung Advances Key HBM Supply, Hints at TSMC Partnership

Samsung Electronics released its third-quarter earnings on October 31, reporting a sharper-than-expected decline in profit from its flagship semiconductor business. Notably, Samsung’s senior management emphasized its continued focus on high-end chip production and disclosed progress in major supply deals, including a potential NVIDIA certification for its HBM3E that could boost its performance in the fourth quarter.

According to reports from Commercial Times, Samsung Executive Vice President Jaejune Kim addressed analysts about high-end memory chips used in AI chipsets, stating that while they previously mentioned a delay in HBM3E’s commercialization, they have made meaningful progress in product certification with key clients. As a result, they expect HBM3E sales to improve in the fourth quarter and plan to expand sales to multiple customers.

Though Samsung did not disclose client names, analysts believe this certification likely refers to NVIDIA, which commands 80% of the global AI chip market.

According to Economic Daily News, Samsung reported significant revenue growth in high-bandwidth memory (HBM), DDR5, and server storage products, with expectations for improved performance in its semiconductor business this quarter.

Although demand for mobile and PC memory chips may decline, the growth in AI is expected to sustain robust demand. Demand for AI and data center products, including memory for both AI and traditional servers, is projected to remain strong and stable through next year.

Additionally, Kim stated that the company would flexibly reduce production of traditional DRAM and NAND chips to align with market demand and expedite the shift to advanced process nodes.

The same report from Economic Daily News indicated that Samsung plans to develop and mass-produce HBM4 in the second half of next year. Over the same period, its memory division will focus on HBM and server SSDs, and there are hints of potential collaboration with TSMC to meet the diverse needs of HBM clients.

(Photo credit: Samsung)

Please note that this article cites information from Commercial Times and Economic Daily News.

2024-10-24

[News] SK hynix and Samsung Reportedly Step up Focus on HBM4 and CXL amid Rising Chinese Competition

Ahead of SK hynix’s Q3 earnings announcement on October 24, the market expects the company to post a surge in quarterly operating profit driven by HBM, potentially allowing it to outperform Samsung’s semiconductor division. Therefore, there is growing interest in SK hynix’s next move.

In order to maintain its leadership in the memory sector amid heated competition from China, SK hynix is reportedly shifting its focus to high-value technologies such as HBM4 and Compute Express Link (CXL), according to Korean media outlet Pinpoint News and Tom’s Hardware.

This shift is motivated by a highly competitive memory market, where Chinese firms are ramping up their production capabilities and adopting aggressive pricing strategies to gain share, Tom’s Hardware notes.

For instance, a previous report by ZDNet mentions that Chinese memory manufacturers like CXMT (Changxin Memory Technologies) are aggressively expanding production, which could negatively affect profitability in the traditional DRAM market. Established in 2016, CXMT has become China’s largest DRAM producer with government backing.

Both Samsung and SK hynix are said to be closely monitoring these developments and are counting on high-value technologies like HBM4 and CXL to unlock a new wave of growth momentum.

It is worth noting that both memory giants have teamed up with TSMC on HBM4, as they attempt to incorporate customized features requested by major clients, counting on TSMC to manufacture more powerful logic dies, the component that functions as the brain of an HBM chip.

Per SK hynix’s product roadmap, the company plans to launch 12-layer stacked HBM4 in the second half of 2025 and 16-layer in 2026.

Samsung, which is struggling with 12-Hi HBM3e verification with NVIDIA, also has high hopes that HBM4 will turn the tide. A previous report by The Elec indicates that Samsung aims to tape out HBM4 by year-end, while eyeing mass production by the end of 2025.

On the other hand, CXL is a next-generation interface that efficiently connects CPUs, GPUs, and memory in high-performance computing systems. According to Pinpoint News, applying CXL to existing memory modules can expand capacities by more than ten times, making it well suited to the demands of the AI era.

SK hynix is also focusing on CXL memory, which is gaining attention as the next-generation AI memory following HBM. Citing SK hynix CEO Kwak Noh-Jung’s remarks, a report by ZDNet suggests that the memory giant plans to launch products like CXL and LPCAMM tailored to customers’ needs, with results beginning to materialize around next year.

In the meantime, Samsung reportedly aims to begin mass production of a 256GB CMM-D, compatible with the CXL 2.0 protocol, by the end of 2024, according to Tom’s Hardware. In Samsung’s own words, the CMM-D is a memory expander built with next-generation CXL technology that seamlessly connects multiple processors and devices, increasing memory capacity and thus optimizing memory management.


(Photo credit: SK hynix)

Please note that this article cites information from Pinpoint News, Tom’s Hardware, The Elec, ZDNet, and Samsung.
2024-10-14

[News] Samsung’s Next Move? Early HBM4 Mass-production and 2nm Foundry Solutions Could Be the Cure

After Samsung issued a disappointing third-quarter earnings forecast, its next move has become the center of market attention. According to a report by Business Korea, to turn the situation around, Samsung may shift its strategic focus to the early mass production of HBM4, as well as to advanced foundry solutions at 2nm and below.

A couple of days ago, Samsung warned its third-quarter profit would probably reach 9.1 trillion won, falling short of market expectations. At the same time, Jeon Young-hyun, the head of Samsung’s Device Solutions (DS) division, issued a rare public apology.

Citing industry sources, Business Korea notes that Samsung’s DS division is expected to post an operating profit of around 5 trillion won (about USD 3.8 billion) for the third quarter, which is reportedly below the market expectation of 6 trillion won. The figure is significantly lower than SK hynix’s projected quarterly operating profit, which is expected to be in the high 6 trillion won range, according to the report.

Samsung May Accelerate HBM4 Progress to Turn the Tide

The series of setbacks have prompted the struggling giant to take action. As Samsung’s lackluster performance could be attributed to its delay in supplying NVIDIA with its 12-layer HBM3e product, industry insiders cited by Business Korea suggest that accelerating the mass production of HBM4, as well as introducing 2nm foundry solutions, could just be the remedies Samsung needs.

In terms of the HBM market, in which Samsung is lagging behind SK hynix on HBM3e verification, the report indicates that Samsung is expected to prioritize the early mass production of HBM4, which is projected to become mainstream in 2025.

A source familiar with the situation told Business Korea that HBM orders from companies other than NVIDIA would rise next year. Major tech firms, including AMD, Amazon, Microsoft, Google, and Qualcomm, are also working on AI semiconductors. Therefore, it does not necessarily mean that Samsung should concentrate solely on NVIDIA, and it could accelerate supply contracts with NVIDIA’s competitors, the report notes.

TrendForce’s latest findings indicate that Samsung, SK hynix, and Micron have all submitted their first HBM3e 12-Hi samples, with submissions spanning the first half and the third quarter of 2024, and these samples are currently undergoing validation. SK hynix and Micron are making faster progress and are expected to complete validation by the end of this year.

2nm Advancements Would Be Another Focus

On the other hand, in terms of the foundry sector, the report suggests that Samsung is expected to further enhance its ‘turnkey order’ strategy. This approach addresses concerns about technology leakage while providing HBM as part of a comprehensive package.

According to the report, Samsung is set to begin mass production of its GAA 2nm process in 2025. The company also aims to complete the development of the 2nm process with Backside Power Delivery Network (BSPDN) technology by 2027. Having secured 2nm orders from Japan’s AI unicorn Preferred Networks (PFN) and U.S. AI semiconductor company Ambarella, Samsung reportedly plans to seek collaboration with major tech firms.

To attract customers, Samsung will host the “Foundry Forum 2024” online on October 24. Previously scheduled to be held in Beijing, the event will now be conducted virtually, which aligns with the company’s efforts to reduce costs. Will it make further progress in advanced nodes? The whole world is watching.


(Photo credit: Samsung)

Please note that this article cites information from Business Korea.