
[News] Samsung Reportedly Starts Developing Custom HBM4 for Big CSPs, Eyeing Mass Production by 2025


2024-11-12 Semiconductors editor

Samsung, which earlier implied that it plans to collaborate with TSMC on HBM4, appears to have taken a key step forward. A report by South Korean media outlet Maeil Business Newspaper discloses that the memory giant has already begun developing “Custom HBM4,” a next-gen high-bandwidth memory tailored specifically for Microsoft and Meta.

Samsung aims to begin mass production of HBM4 promptly upon completing development by the end of 2025, the report suggests.

Industry sources cited by the report state that Samsung is establishing a dedicated production line for HBM4, which is now in the “pilot production” phase, where small-scale trial manufacturing takes place ahead of mass production.

Citing sources familiar with the situation, the report further notes that Samsung is actively working on HBM4 designed for Microsoft and Meta. Both tech heavyweights have their own AI chips: Microsoft’s Maia 100 and Meta’s Artemis.

As major tech companies make strides in scaling up their own AI data centers, they have a strong incentive to cut the costs associated with chip purchases. Therefore, many design and deploy their own AI accelerators while also buying AI chips from NVIDIA and AMD, according to Maeil.

Samsung, with its memory division and an LSI division capable of designing computational chips, is ideally positioned as a top partner for these major tech companies, according to Maeil.

Though the specifics of the HBM4 product that Samsung will supply to these companies remain undisclosed, Samsung did reveal the general specifications in February, according to Maeil.

Its data transmission speed, or bandwidth, reportedly reaches 2 terabytes per second (TB/s), marking a 66% increase over HBM3E, while its capacity has risen 33% to 48 gigabytes (GB), up from 36 GB.
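As a quick check on those figures (the roughly 1.2 TB/s HBM3E baseline is implied by the report rather than stated outright):

2 TB/s ÷ ~1.2 TB/s ≈ 1.66, i.e. about 66% faster
48 GB ÷ 36 GB ≈ 1.33, i.e. about 33% more capacity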

The report explains that unlike previous memory products, HBM4 is also referred to as “Compute-in-Memory (CIM)” due to its advanced capabilities, which go beyond standard memory functions to include specialized operations aligned with customer requirements.

Up to HBM3, the focus was said to be mainly on managing heat generation and maximizing speed, according to a semiconductor industry official cited by the report. With HBM4, however, the emphasis will shift toward integrating AI computation capabilities (an NPU) and customer-specific data processing functions (IP) to meet evolving needs, the report says.


(Photo credit: Samsung)

Please note that this article cites information from Maeil Business Newspaper.
