News
While introducing the industry’s first 48GB 16-high HBM3E at the SK AI Summit in Seoul today, South Korean memory giant SK hynix has reportedly seen strong demand for its next-gen HBM. According to reports by Reuters and South Korean media outlet ZDNet, NVIDIA CEO Jensen Huang asked SK hynix to accelerate the supply of HBM4 by six months.
The information was disclosed by SK Group Chairman Chey Tae-won earlier today at the SK AI Summit, according to the reports. In October, SK hynix had said it planned to deliver the chips to customers in the second half of 2025.
When asked by ZDNet about HBM4’s accelerated timetable, SK hynix President Kwak Noh-Jung responded by saying “We will give it a try.”
A spokesperson for SK hynix cited by Reuters noted that this new timeline is quicker than their original target, but did not provide additional details.
According to ZDNet, NVIDIA CEO Jensen Huang also appeared at the Summit in a video interview, stating that by collaborating with SK hynix, NVIDIA has been able to achieve progress beyond Moore’s Law, and that the company will continue to need more of SK hynix’s HBM in the future.
According to the third-quarter financial report released by SK hynix in late October, the company posted record-breaking figures, including revenues of 17.5731 trillion won, an operating profit of 7.03 trillion won (with an operating margin of 40%), and a net profit of 5.7534 trillion won (with a net margin of 33%) for the third quarter of this year.
In particular, HBM sales showed excellent growth, up more than 70% from the previous quarter and more than 330% from the same period last year.
SK hynix is indeed making strides in HBM, as it started mass production of the world’s first 12-layer, 36GB HBM3E product in September. It has also been developing 48GB 16-high HBM3E in a bid to secure technological stability and plans to provide samples to customers early next year, according to the company’s press release.
Meanwhile, according to another report by Business Korea, Kim Jae-jun, Vice President of Samsung’s Memory Business Division, stated in the earnings call that the company is mass-producing and selling both HBM3E 8-stack and 12-stack products and has completed key stages of the quality testing process for a major customer. Though Kim did not specify the identity of the major customer, industry analysts suggest it is likely NVIDIA.
To narrow the technology gap with SK hynix, Samsung is reportedly planning to produce its next-generation HBM4 products in the latter half of next year.
(Photo credit: NVIDIA)
News
South Korean memory giant SK hynix has introduced the industry’s first 48GB 16-high HBM3E at the SK AI Summit in Seoul today, featuring the industry’s highest layer count yet, following its 12-high product, according to its press release.
Though the market for 16-high HBM is expected to open up from the HBM4 generation, the company has been developing 48GB 16-high HBM3E in a bid to secure technological stability and plans to provide samples to customers early next year, SK hynix CEO Kwak Noh-Jung said, as cited in the press release.
In late September, SK hynix announced that it had begun mass production of the world’s first 12-layer, 36GB HBM3E product.
For the 16-high HBM3E, SK hynix is expected to apply the Advanced MR-MUF process, which enabled the mass production of 12-high products, while also developing hybrid bonding technology as a backup, Kwak explained.
According to Kwak, SK hynix’s 16-high products deliver an 18% improvement in training performance and a 32% improvement in inference performance compared with 12-high products.
Kwak Noh-Jung introduced SK hynix’s 16-high HBM3E during his keynote speech at the SK AI Summit today, titled “A New Journey in Next-Generation AI Memory: Beyond Hardware to Daily Life.” He also shared the company’s vision to become a “Full Stack AI Memory Provider,” that is, a provider with a full lineup of AI memory products across both DRAM and NAND, through close collaboration with interested parties, the press release notes.
It is worth noting that SK hynix highlighted its plan to adopt a logic process for the base die starting from the HBM4 generation, through collaboration with a top global logic foundry, in order to provide customers with the best products.
A previous press release in April noted that SK hynix has signed a memorandum of understanding with TSMC to collaborate on producing next-generation HBM and enhancing the integration of logic and HBM through advanced packaging technology. Through this initiative, the company plans to proceed with the development of HBM4, the sixth generation of the HBM family, which is slated for mass production from 2026.
To further expand its product roadmap, the memory giant is developing LPCAMM2 modules for PCs and data centers, as well as 1c nm-based LPDDR5 and LPDDR6, taking full advantage of its competitiveness in low-power, high-performance products, according to the press release.
The company is also readying PCIe Gen6 SSDs, high-capacity QLC-based eSSDs, and UFS 5.0.
As powering AI systems requires a sharp increase in the capacity of memory installed in servers, SK hynix revealed in the press release that it is preparing CXL Fabrics, which enable high capacity by connecting various memories, while developing ultra-high-capacity eSSDs to fit more data into a smaller space at low power.
SK hynix is also developing technology that adds computational functions to memory to overcome the so-called memory wall. Technologies such as Processing Near Memory (PNM), Processing in Memory (PIM), and Computational Storage, which are essential for processing the enormous amounts of data expected in the future, will transform the structure of next-generation AI systems and shape the future of the AI industry, according to the press release.
(Photo credit: SK hynix)
News
Media reports indicate that researchers at the University of Chicago and Argonne National Laboratory (ANL) have developed a new optical storage technology that could surpass the density limitations of traditional optical disks, achieving ultra-high-density storage.
A key challenge for traditional optical storage has long been the diffraction limit of light: since the size of each data unit cannot be smaller than the wavelength of the read-write laser beam, the density of current optical storage has an upper limit.
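As a rough, illustrative reference (not drawn from the report), the classical Abbe criterion ties the smallest resolvable spot size $d$ to the laser wavelength $\lambda$ and the numerical aperture $\mathrm{NA}$ of the focusing optics:

$$ d \approx \frac{\lambda}{2\,\mathrm{NA}} $$

With Blu-ray-class values of $\lambda = 405\,\mathrm{nm}$ and $\mathrm{NA} = 0.85$, this works out to roughly $238\,\mathrm{nm}$ per spot, which is why conventional disks cannot shrink data marks much further without a way around diffraction.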
The researchers proposed a method to circumvent this limitation using wavelength-division multiplexing: they embedded rare-earth emitters within a host material, such as magnesium oxide crystal, with each emitter using a slightly different wavelength, allowing more data to be stored within the same physical space.
The report states that researchers initially modeled and simulated the physical principles of this technology and designed a theoretical solid material containing rare-earth atoms. This material can absorb and re-emit photons, while nearby quantum defects can capture and store these photons. One significant discovery was that when defects absorb narrow-wavelength energy from nearby atoms, their spin state flips. Once the spin state flips, it is nearly impossible to revert, which means these defects can store data for an extended period.
The report emphasizes that, although this is a promising initial test, there are still some key issues to address before commercialization. For instance, the durability of these emitters needs to be verified. Additionally, the researchers have not provided specific capacity estimates, merely suggesting the potential for “ultra-high density.” Despite these challenges, the researchers are optimistic about the future of this technology, calling it a major advancement in storage technology.
(Photo credit: IBM)
News
Samsung Electronics released its third-quarter earnings on October 31, reporting a better-than-expected profit despite a substantial decline in profit from its flagship semiconductor business. Notably, Samsung’s senior management emphasized its continued focus on high-end chip production and disclosed progress on major supply deals, including a potential NVIDIA certification for its HBM3E, which could boost the company’s performance in the fourth quarter.
According to reports from Commercial Times, Samsung Executive Vice President Jaejune Kim told analysts, regarding high-end memory chips used in AI chipsets, that while the company had previously mentioned a delay in HBM3E’s commercialization, it has made meaningful progress in product certification with key clients. As a result, Samsung expects HBM3E sales to improve in the fourth quarter and plans to expand sales to multiple customers.
Though Samsung did not disclose client names, analysts believe this certification likely refers to NVIDIA, which commands 80% of the global AI chip market.
According to Economic Daily News, Samsung reported significant revenue growth in high-bandwidth memory (HBM), DDR5, and server storage products, with expectations for improved performance in its semiconductor business this quarter.
Although demand for mobile and PC memory chips may decline, the growth in AI is expected to sustain robust demand. Demand for AI and data center products, including memory for both AI and traditional servers, is projected to remain strong and stable through next year.
Additionally, Kim stated that the company would flexibly reduce production of traditional DRAM and NAND chips to align with market demand and expedite the shift to advanced process nodes.
The same report from Economic Daily News indicated that Samsung plans to develop and mass-produce HBM4 in the second half of next year. Next year, its memory division will focus on HBM and server SSDs, and there are hints of potential collaboration with TSMC to meet the diverse needs of HBM clients.
(Photo credit: Samsung)
News
Amid concerns over its HBM progress and yield issues on advanced nodes, Samsung has released its full Q3 2024 financial results, with quarterly revenue reaching KRW 79.1 trillion (approximately USD 57.35 billion), an all-time high. However, its semiconductor business remains lackluster, as the DS Division recorded a quarterly operating profit of KRW 3.86 trillion, a roughly 40% decline from the previous quarter.
According to a report by CNBC, while demand for memory chips driven by AI and traditional server products provided some support, Samsung noted that “inventory adjustments negatively impacted mobile demand.” The company also highlighted challenges with “the increasing supply of legacy products in China.”
Samsung continues to face challenges in its most advanced wafer foundry processes. According to TrendForce, the company has yet to solidify its reputation as a reliable partner for cutting-edge nodes, which may hinder its ability to secure orders from top IC design houses and potentially delay its efforts to expand capacity.
Losses in Foundry and System Chip Lead to Profit Drop in DS Division, While Memory Remains Strong
On October 31, Samsung Electronics reported Q3 consolidated revenue of KRW 79.1 trillion, an increase of 7% from the previous quarter, on the back of the launch effects of new smartphone models and increased sales of high-end memory products. According to Business Korea, the Q3 revenue exceeded its previous revenue record of KRW 77.78 trillion, set in Q1 2022.
However, operating profit declined to KRW 9.18 trillion, largely due to one-off costs, including the provision of incentives in the Device Solutions (DS) Division, according to its press release.
The DS Division, which encompasses the memory and foundry businesses, posted KRW 29.27 trillion in consolidated revenue and KRW 3.86 trillion in operating profit in the third quarter, marking a roughly 40% drop from the prior quarter’s KRW 6.45 trillion.
According to the Korean Economic Daily, Samsung attributed the weaker profit to higher-than-anticipated one-time expenses totaling around KRW 1.5 trillion, including employee performance bonuses, as well as escalating losses in its foundry and system chip divisions, each estimated at over KRW 1.5 trillion.
On the other hand, the company noted that its memory chip business performed better than anticipated, with an estimated profit of around KRW 7 trillion for the quarter, the Korean Economic Daily notes.
Memory business sales reached KRW 22.27 trillion, more than doubling from the previous year, driven by increased demand for high-end chips used in AI devices and servers, such as HBM, DDR5, and server SSDs, according to Samsung.
Key Takeaways for 2025 Outlook
In the fourth quarter, Samsung notes that while memory demand for mobile and PC may encounter softness, growth in AI will keep demand at robust levels.
As for the Foundry Business, Samsung claims that the unit successfully met its order targets — particularly in sub-5nm technologies — and released the 2nm GAA process design kit (PDK), enabling customers to proceed with their product designs. It notes that the Foundry Business will strive to acquire customers by improving the process maturity of its 2nm GAA technology.
Looking ahead to 2025, for DRAM, Samsung plans to expand the sales of HBM3E and the portion of high-end products such as DDR5 modules with 128GB density or higher for servers and LPDDR5X for mobile, PC, servers, and so on. For NAND, it will proactively respond to the high-density trend based on QLC products — including 64TB and 128TB SSDs — and solidify leadership in the PCIe Gen5 market by accelerating the tech migration from V6 to V8.
The Foundry Business, on the other hand, aims to expand revenue through ongoing yield improvements in advanced technology while securing major customers through successful 2nm mass production. In addition, integrating advanced nodes and packaging solutions to further develop the HBM buffer die is expected to help acquire new customers in the AI and HPC sectors, according to Samsung.
(Photo credit: Samsung)