HBM


2024-08-02

[News] Samsung’s Chip Head Raises the Urgency to Reform Company Culture to Avoid Vicious Cycles

According to a report from Bloomberg, Jun Young-hyun, head of Samsung’s chip business, recently sent a stern warning to employees about the need to reform the company’s culture to avoid falling into a vicious cycle.

Jun stated that the recent improvement in Samsung’s performance was due to a rebound in the memory market. To sustain this progress, Samsung must take measures to eliminate communication barriers between departments and stop concealing or avoiding problems.

Earlier this week, Samsung announced its Q2 earnings, showcasing the fastest net profit growth since 2010. However, Jun Young-hyun highlighted several issues which may undermine Samsung’s long-term competitiveness.

He emphasized the need to rebuild the semiconductor division’s culture of vigorous debate, warning that relying solely on market recovery without restoring fundamental competitiveness would lead to a vicious cycle and repeating past mistakes.

Samsung is still striving to close the gap with its competitors. The company is working to improve the maturity of its 2nm process to meet the high-performance, low-power demands of advanced nodes. Samsung’s first-generation 3nm GAA process has achieved yield maturity and is set for mass production in the second half of the year.

In memory, Samsung is beginning to narrow the gap with SK Hynix in high-bandwidth memory (HBM). According to Bloomberg, Samsung has received certification for HBM3 chips from NVIDIA and expects to gain certification for the next-generation HBM3e within two to four months.

Jun emphasized that although Samsung is in a challenging situation, he is confident that with accumulated experience and technology, the company can quickly regain its competitive edge.


(Photo credit: Samsung)

Please note that this article cites information from Samsung and Bloomberg.
2024-08-01

[News] Samsung’s 8-layer HBM3e to Start Mass Production in Q3, Driving HBM Sales to Soar 3-5 Times in 2H24

Samsung Electronics, which has been surrounded by concerns that its HBM3e products are still struggling to pass NVIDIA’s qualification, has confirmed in its second-quarter earnings call that the company’s fifth-generation 8-layer HBM3e is currently undergoing customer evaluation and is scheduled to enter mass production in the third quarter, according to a report by Business Korea.

TrendForce notes that Samsung’s recent progress on HBM3e qualification appears solid, and both the 8-hi and 12-hi stacks can be expected to pass qualification in the near future. The company is eager to win HBM market share from SK hynix and has reserved its 1alpha capacity for HBM3e. TrendForce believes Samsung will become a very important supplier in the HBM category.

Driven by this momentum, the Business Korea report, citing an official speaking at the July 31st conference call, states that HBM3e chips are anticipated to surpass the mid-10-percent range of Samsung’s HBM shipments in the third quarter, and that this share is projected to grow rapidly to 60% by the fourth quarter.

According to Samsung, its HBM sales in the second quarter already grew by around 50% from the previous quarter. Ambitious about its HBM3 and HBM3e sales, Samsung projects its HBM sales will increase three to five times in the second half of 2024, with sales roughly doubling each quarter.
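As a rough arithmetic check, the stated quarter-on-quarter growth rates do land in the projected range. This is an illustrative sketch only: Q1 sales are normalized to 1.0, an assumed baseline, and the ~2x quarterly rise is taken at face value.

```python
# Illustrative sanity check of the stated HBM growth figures.
# Q1 sales normalized to 1.0 (an assumption, not a reported figure).
q1 = 1.0
q2 = q1 * 1.5          # Q2 grew ~50% QoQ, per the earnings call
q3 = q2 * 2.0          # assumed ~2x rise each quarter in 2H24
q4 = q3 * 2.0

h1 = q1 + q2           # first-half HBM sales
h2 = q3 + q4           # second-half HBM sales

print(f"2H24 vs 1H24: {h2 / h1:.1f}x")  # prints "2H24 vs 1H24: 3.6x"
```

Under these assumptions, second-half sales come out at about 3.6 times first-half sales, consistent with the projected three-to-five-times range.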

Samsung has already taken a big leap on HBM as its HBM3 chips are said to have been cleared by NVIDIA last week. According to a previous report by Reuters, Samsung’s HBM3 will initially be used exclusively in the AI giant’s H20, which is tailored for the Chinese market.

On the other hand, the South Korean memory giant notes that it has completed the preparations for volume production of its 12-layer HBM3e chips. The company plans to expand the supply in the second half of 2024 to meet the schedules requested by multiple customers, according to Business Korea. The progress of its sixth-generation HBM4 is also on track, scheduled to begin shipping in the second half of 2025, Business Korea notes.

Samsung Electronics reported higher-than-expected financial results in the second quarter, with a six-fold year-on-year increase in net income, soaring from KRW 1.55 trillion (USD 1.12 billion) to KRW 9.64 trillion (USD 6.96 billion), as demand for its advanced memory chips, which are crucial for AI training, remained strong.

SK hynix, the current HBM market leader, has expressed optimism about defending its throne as well. The company reportedly expects its HBM3e shipments to surpass those of HBM3 in the third quarter, with HBM3e accounting for more than half of total HBM shipments in 2024. In addition, it expects to begin supplying 12-layer HBM3e products to customers in the fourth quarter.

Micron, on the other hand, reportedly started mass production of 8-layer HBM3e as early as February. The company reportedly plans to complete preparations for mass production of 12-layer HBM3e in the second half and supply it to major customers like NVIDIA in 2025.


(Photo credit: Samsung)

Please note that this article cites information from Business Korea and Reuters.
2024-07-31

[News] Samsung’s Q2 Profits Soar to USD 7.5 Billion, Seeing Strong Demand for HBM, DDR5 and Server SSD in 2H24

Samsung Electronics announced its financial results for the second quarter today (July 31st), posting KRW 74.07 trillion in consolidated revenue and operating profit of KRW 10.44 trillion (approximately USD 7.5 billion). The memory giant’s strong performance can be attributed to favorable memory market conditions, which drove a higher average selling price (ASP), while robust sales of OLED panels also contributed to the results, according to its press release.

In early July, the company estimated a 15-fold YoY increase in second-quarter operating profit, projecting a 1,452% jump to KRW 10.4 trillion in preliminary numbers for the April-June quarter, the highest since the third quarter of 2022. The actual results are in line with that earlier projection.

Samsung’s DS Division posted KRW 28.56 trillion in consolidated revenue and KRW 6.45 trillion in operating profit for the second quarter, representing QoQ growth of 23.4% and 2,377%, respectively.

Strong Demand for HBM, DDR5 and Server SSDs to Extend in Second Half on AI Applications

Regarding current market conditions, Samsung notes that driven by the strong demand for HBM as well as conventional DRAM and server SSDs, the memory market as a whole continued its recovery. This increased demand is a result of the continued AI investments by cloud service providers and growing demand for AI from businesses for their on-premise servers.

However, Samsung observes that PC demand was relatively weak, while demand for mobile products remained solid on the back of increased orders from Chinese original equipment manufacturer (OEM) customers. Demand from server applications continued to be robust.

Samsung projects that in the second half of 2024, AI servers are expected to take up a larger portion of the market as major cloud service providers and enterprises expand their AI investments. As AI servers equipped with HBM also feature high content-per-box with regards to conventional DRAM and SSDs, demand is expected to remain strong across the board from HBM and DDR5 to server SSDs.

In response to growing market demand, Samsung plans to actively expand capacity to increase the share of HBM3e sales. High-density products will be another major focus, such as server modules based on 1b-nm 32Gb DDR5 server DRAM.

Samsung has already taken a big leap on HBM as its HBM3 chips are said to have been cleared by NVIDIA last week, which will initially be used exclusively in the AI giant’s H20, a less advanced GPU tailored for the Chinese market.

For NAND, the company plans to increase sales by strengthening the supply of triple-level cell (TLC) SSDs, which still account for the majority of AI demand, and will address customer demand for quad-level cell (QLC) products, which are optimized for applications spanning server, PC, and mobile.

The ramping of HBM and server DRAM production and sales is likely to further constrain conventional bit supply in both DRAM and NAND, Samsung notes.


Please note that this article cites information from Samsung.
2024-07-29

[News] SK hynix is Reportedly Considering U.S. IPO for its NAND/SSD Subsidiary Solidigm

South Korean memory giant SK hynix, after announcing soaring financial results in Q2 and a massive investment in the Yongin Semiconductor Cluster last week, is now reportedly considering another move: a U.S. IPO for its NAND/SSD subsidiary Solidigm.

According to reports by Blocks & Files and Korean media outlet Hankyung, Solidigm has achieved its first profit after 12 consecutive quarters of losses. On July 25th, SK hynix announced second-quarter revenue of KRW 16.42 trillion, a 125% year-on-year increase and an all-time record. At the same time, profits reached their highest level since 2018, mainly due to strong demand for AI memory, including HBM, and overall price increases for DRAM and NAND products.

The reports stated that the rumor regarding the U.S. IPO seems to be plausible, as SK hynix had previously planned to spin off Solidigm, and the company’s recent rebound makes such a move more feasible. In addition, an IPO for Solidigm would allow SK hynix to obtain cash for part of its stake in the company and assist in covering the planned capital expenditures, according to the reports.

The company had just announced an ambitious plan to expand its memory manufacturing capacity with an investment of approximately KRW 9.4 trillion (USD 6.8 billion) to build an HBM fabrication plant at the Yongin Semiconductor Cluster in Korea. Construction of the fab will begin in March 2025 and is expected to be completed by May 2027. Following this, SK hynix intends to add three more plants to the cluster.

However, the reports also pointed out that SK hynix’s success in this venture will likely depend on how the new organization is structured—such as which assets are included in Solidigm versus those retained by SK hynix—and how both entities address future technology plans. This is particularly important considering that the current roadmap for the memory giant’s NAND business at Dalian, China, including the QLC components that have contributed to Solidigm’s recent success in high-capacity enterprise SSDs, appears to conclude at 196 layers.

In 2020, SK hynix acquired Intel’s NAND and SSD division through a two-phase deal. The first phase involved purchasing Intel’s SSD business and NAND fabrication plant in Dalian, China, for USD 7 billion. In the second phase, SK hynix will pay Intel an additional USD 2 billion in 2025 for intellectual property related to NAND flash wafer manufacturing and design, as well as for R&D employees and the Dalian fab workforce. SK hynix named the acquired business Solidigm in December 2021, and the unit has since developed and launched several successful products, including the D5-P5336 61.44 TB QLC (4 bits/cell) SSD, the reports noted.

Regarding the rumor, SK hynix clarified that Solidigm is exploring various growth strategies, but no decision has been made at this time.


(Photo credit: Solidigm)

Please note that this article cites information from Blocks & Files and Hankyung.
2024-07-29

[News] MRDIMM/MCRDIMM to Become the New Favorites in the Memory Field

Amid the tide of artificial intelligence (AI), new types of DRAM, represented by HBM, are seeing a new round of development opportunities. Meanwhile, driven by server demand, MRDIMM/MCRDIMM have emerged as new favorites in the memory industry, taking center stage.

According to a report from the WeChat account DRAMeXchange, the rapid development of AI and big data is driving an increase in the number of CPU cores in servers. To meet the data-throughput requirements of each core in multi-core CPUs, the bandwidth of memory systems must increase significantly. In this context, new high-bandwidth memory modules for servers, MRDIMM/MCRDIMM, have emerged.

  • JEDEC Announces Details of the DDR5 MRDIMM Standard

On July 22, JEDEC announced that it will soon release the DDR5 Multiplexed Rank Dual Inline Memory Modules (MRDIMM) and the next-generation LPDDR6 Compression-Attached Memory Module (CAMM) advanced memory module standards, and introduced key details of these two types of memory, aiming to support the development of next-generation HPC and AI. These two new technical specifications were developed by JEDEC’s JC-45 DRAM Module Committee.

As a follow-up to JEDEC’s JESD318 CAMM2 memory module standard, JC-45 is developing the next-generation CAMM module for LPDDR6, with a target maximum speed of over 14.4 GT/s. According to the plan, this module will also provide 24-bit-wide subchannels and 48-bit-wide channels, and will support a “connector array” to meet the needs of future HPC and mobile devices.

DDR5 MRDIMM supports multiplexed ranks, combining and transmitting multiple data signals over a single channel, effectively increasing bandwidth without additional physical connections. JEDEC has reportedly planned multiple generations of DDR5 MRDIMM, with the ultimate goal of raising bandwidth to 12.8 Gbps, doubling the 6.4 Gbps of current DDR5 RDIMMs while increasing pin speed.
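The headline figures can be sanity-checked with simple peak-bandwidth arithmetic. This is a sketch under assumptions not stated in the article: a 64-bit data bus per module channel, ignoring ECC bits.

```python
# Illustrative peak-bandwidth arithmetic for DDR5 RDIMM vs. the MRDIMM target.
# Assumes a 64-bit data bus (ECC bits ignored) -- an assumption for this sketch.
def module_bandwidth_gbs(transfer_rate_gtps: float, bus_bits: int = 64) -> float:
    """Peak bandwidth in GB/s = transfers per second * bytes per transfer."""
    return transfer_rate_gtps * bus_bits / 8

rdimm = module_bandwidth_gbs(6.4)    # current DDR5 RDIMM: 51.2 GB/s
mrdimm = module_bandwidth_gbs(12.8)  # JEDEC's ultimate MRDIMM goal: 102.4 GB/s
print(rdimm, mrdimm, mrdimm / rdimm)  # prints: 51.2 102.4 2.0
```

The 12.8 Gbps target thus corresponds to roughly doubling per-module peak bandwidth, consistent with the "doubling" claim above.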

In JEDEC’s vision, DDR5 MRDIMM will utilize the same pins, SPD, PMIC, and other designs as existing DDR5 DIMMs, be compatible with the RDIMM platform, and leverage the existing LRDIMM ecosystem for design and testing.

JEDEC stated that these two new technical specifications are expected to bring a new round of technological innovation to the memory market.

  • Micron’s MRDIMM DDR5 to Start Mass Shipment in 2H24

In March 2023, AMD announced at the Memcom 2023 event that it is collaborating with JEDEC to develop a new DDR5 MRDIMM standard memory, targeting a transfer rate of up to 17600 MT/s. According to a report from Tom’s Hardware at that time, the first generation of DDR5 MRDIMM aims for a rate of 8800 MT/s, which will gradually increase, with the second generation set to reach 12800 MT/s, and the third generation to 17600 MT/s.

MRDIMM, short for “Multiplexed Rank DIMM,” integrates two DDR5 DIMMs into one, thereby providing double the data transfer rate while allowing access to two ranks.

On July 16, memory giant Micron announced the launch of the new MRDIMM DDR5, which is currently sampling and will provide ultra-large capacity, ultra-high bandwidth, and ultra-low latency for AI and HPC applications. Mass shipment is set to begin in the second half of 2024.

MRDIMM offers the highest bandwidth, largest capacity, lowest latency, and better performance per watt. Micron said it outperforms current TSV RDIMMs in accelerating memory-intensive multi-tenant virtualization, HPC, and AI data-center workloads.

Compared to traditional RDIMM DDR5, MRDIMM DDR5 can achieve an effective memory bandwidth increase of up to 39%, a bus efficiency improvement of over 15%, and a latency reduction of up to 40%.

MRDIMM supports capacity options ranging from 32GB to 256GB, covering both standard and tall form factor (TFF) specifications, suitable for high-performance 1U and 2U servers. The 256GB TFF MRDIMM outperforms TSV RDIMMs of similar capacity by 35%.

This new memory product is the first generation of Micron’s MRDIMM series and will be compatible with Intel Xeon processors. Micron stated that subsequent generations of MRDIMM products will continue to offer 45% higher single-channel memory bandwidth compared to their RDIMM counterparts.

  • SK hynix to Launch MCRDIMM Products in 2H24

As one of the world’s largest memory manufacturers, SK hynix already introduced a product similar to MRDIMM, called MCRDIMM, even before AMD and JEDEC.

MCRDIMM, short for “Multiplexer Combined Ranks Dual In-line Memory Module,” is a module product that combines multiple DRAMs on a substrate and operates the module’s two ranks, its basic units of operation, simultaneously.

Source: SK hynix

In late 2022, SK hynix partnered with Intel and Renesas to develop the DDR5 MCR DIMM, which became the fastest server DRAM product in the industry at the time. As per Chinese IC design company Montage Technology’s 2023 annual report, MCRDIMM can also be considered the first generation of MRDIMM.

Traditional DRAM modules can transfer only 64 bytes of data to the CPU at a time, while SK hynix’s MCRDIMM module can transfer 128 bytes by running two memory ranks simultaneously. This larger per-access transfer boosts the data rate to over 8 Gbps, double that of a single DRAM.
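The 64-to-128-byte doubling can be sketched as a toy model. This is conceptual only: in real hardware the multiplexing is performed by the registering clock driver and data buffers on the module, not by host-side code.

```python
# Toy model of MCRDIMM-style rank combining (conceptual sketch, not hardware).
RANK_LINE = 64  # bytes a single rank returns per access

def mcr_read(rank0_line: bytes, rank1_line: bytes) -> bytes:
    """Both ranks are accessed in parallel; the module's buffers forward
    their two 64-byte lines to the host as one 128-byte transfer."""
    assert len(rank0_line) == RANK_LINE and len(rank1_line) == RANK_LINE
    return rank0_line + rank1_line

data = mcr_read(bytes(RANK_LINE), bytes(RANK_LINE))
print(len(data))  # prints 128: double a conventional DIMM's 64-byte transfer
```

Because twice the data moves per access at the same access rate, the effective data rate to the CPU doubles, which is the mechanism behind the "over 8 Gbps" figure cited above.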

At that time, SK hynix anticipated that the market for MCR DIMM would gradually open up, driven by the demand for increased memory bandwidth in HPC. According to SK hynix’s FY2024 Q2 financial report, the company will launch 32Gb DDR5 DRAM for servers and MCRDIMM products for HPC in 2H24.

  • MRDIMM Boasts a Brilliant Future

MCRDIMM/MRDIMM adopts the DDR5 LRDIMM “1+10” architecture, requiring one MRCD chip and ten MDB chips. Conceptually, MCRDIMM/MRDIMM allows parallel access to two ranks within the same DIMM, increasing the capacity and bandwidth of the DIMM module by a large margin.

Compared to RDIMM, MCRDIMM/MRDIMM can offer higher bandwidth while maintaining good compatibility with the existing mature RDIMM ecosystem. Additionally, MCRDIMM/MRDIMM is expected to enable much higher overall server performance and lower total cost of ownership (TCO) for enterprises.

MRDIMM and MCRDIMM both fall under the category of DRAM memory modules, which serve different application scenarios from HBM and have their own independent market space. As an industry-standard packaged memory, HBM can achieve higher bandwidth and energy efficiency at a given capacity with a smaller footprint. However, due to its high cost, small capacity, and lack of scalability, its application is limited to a few fields. From an industry perspective, memory modules therefore remain the mainstream solution for large-capacity, cost-effective, and scalable memory.

Montage Technology believes that, given its high-bandwidth and large-capacity advantages, MRDIMM is likely to become the preferred main-memory solution for future AI and HPC. As per JEDEC’s plan, future generations of MRDIMM will support even higher memory bandwidth, further matching the demands of HPC and AI application scenarios.


(Photo credit: SK hynix)

Please note that this article cites information from Tom’s Hardware, Micron and WeChat account DRAMeXchange.

