News
Judging from the current landscape of publicly available DRAM technologies, the industry is coming to see 3D DRAM as one of the solutions to the challenges facing DRAM, marking it as a pivotal direction for the future memory market.
Is 3D DRAM similar to 3D NAND? How will the industry address technological bottlenecks such as size limitations? What are the strategies of major players in the field?
A DRAM cell consists of one transistor and one capacitor: the transistor carries the current used to write or read a bit of information, while the capacitor stores the bit as electric charge.
DRAM is widely used in modern digital electronic devices such as computers, graphics cards, portable devices, and gaming consoles thanks to its low cost and high capacity.
DRAM development has primarily focused on increasing integration by shrinking circuit line widths. However, as line widths reach the 10nm range, physical problems such as capacitor leakage and cell-to-cell interference increase significantly.
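To illustrate why leakage matters, the sketch below models a cell capacitor discharging through its leakage path and estimates how often the cell must be refreshed before its stored charge falls below the sensing threshold. The parameter values are purely illustrative assumptions, not vendor data.

```python
import math

# Illustrative cell parameters (assumed for this sketch, not vendor data).
C_CELL = 10e-15       # cell capacitance in farads (~10 fF)
V_FULL = 1.0          # voltage of a freshly written '1', in volts
V_SENSE_MIN = 0.5     # minimum voltage the sense amplifier can still read as '1'
R_LEAK = 1e13         # effective leakage resistance in ohms; shrinks at smaller nodes

def max_refresh_interval(c, r, v_full, v_min):
    """Time until an exponentially decaying cell voltage hits the sensing limit."""
    tau = r * c                        # RC time constant of the leakage path
    return tau * math.log(v_full / v_min)

base = max_refresh_interval(C_CELL, R_LEAK, V_FULL, V_SENSE_MIN)
worse = max_refresh_interval(C_CELL, R_LEAK / 2, V_FULL, V_SENSE_MIN)
print(f"refresh needed roughly every {base * 1e3:.0f} ms")
print(f"with twice the leakage: every {worse * 1e3:.0f} ms")
```

Doubling the leakage halves the allowable refresh interval, which is why leakage control (and, later in this article, alternative channel materials such as IGZO) becomes central as cells shrink.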
To address these issues, the industry has introduced new materials and equipment like high dielectric constant (high-K) deposition materials and Extreme Ultraviolet (EUV) devices.
Nevertheless, from the perspective of chip manufacturers, pushing manufacturing to 10nm-class and more advanced nodes remains a significant challenge in current research and development. In addition, competition over advanced processes, particularly at 2nm and below, has recently intensified.
In an era of continuous technological advancement, the semiconductor industry has looked to the evolution of NAND for guidance: to overcome scaling limits, NAND transistors transitioned from a planar to a 3D architecture, increasing the number of storage cells per unit area. The same concept, applied to DRAM as a 3D DRAM architecture, has now entered public discussion.
In traditional DRAM, transistors are integrated on a single plane. In 3D DRAM, transistors are stacked in multiple layers, spreading them out vertically. Adopting a 3D DRAM structure is believed to widen the gaps between transistors, reducing leakage currents and interference.
From a theoretical perspective, 3D DRAM technology breaks the conventional paradigm of memory technology. It is a novel storage method that stacks storage cells above logic units, enabling higher capacities within a unit chip area.
In terms of differentiation, traditional DRAM requires complex operational processes for reading and writing data, whereas 3D DRAM can directly access and write data through vertically stacked storage units, significantly enhancing access speeds. The advantages of 3D DRAM not only include high capacity and fast data access but also low power consumption and high reliability, meeting various application needs.
In terms of application areas, the high speed and large capacity of 3D DRAM will help improve the efficiency and performance of high-performance computing. The compact size and large capacity of 3D DRAM make it an ideal memory solution for mobile devices. The large capacity and low power consumption characteristics of 3D DRAM can meet the real-time data processing and transmission requirements of the Internet of Things (IoT) field.
Furthermore, since the advent of the AI era with ChatGPT, AI applications have surged, and AI servers are expected to become a strong driving force for the long-term growth in storage demand.
Micron’s chief business officer previously stated in an interview with Reuters that a typical AI server has up to eight times the amount of DRAM and three times the amount of NAND that a normal server has.
The DRAM market remains highly concentrated, currently dominated by key players such as Samsung Electronics, SK Hynix, and Micron Technology, collectively holding over 93% of the entire market share.
According to a report from TrendForce, as of the third quarter of 2023, Samsung leads the global market with a share of 38.9%, followed by SK Hynix (34.3%) and Micron Technology (22.8%).
Currently, 3D DRAM is in its early stages of development, with companies like Samsung actively joining the research and development battleground. The competition is intense as various players strive to lead in this rapidly growing market.
Since 2019, Samsung has been conducting research on 3D DRAM and announced the industry’s first 12-layer 3D-TSV (Through-Silicon Via) technology in October of the same year. In 2021, Samsung established a next-generation process development research team within its DS division, focusing on research in this field.
At the 2022 SAFE Forum, Samsung outlined the overall 3DIC journey of Samsung Foundry and indicated its readiness to address DRAM stacking issues with a logic-stacked chip, SAINT-D. The design aims to integrate eight HBM3 chips onto one massive interposer chip.
In May 2023, as per sources cited by “The Elec,” Samsung Electronics formed a development team within its semiconductor research center to mass-produce 4F2 structured DRAM.
The goal is reportedly to apply 4F2 to DRAM at 10nm processes or more advanced nodes, as DRAM cell scaling has reached its limit. The report suggests that if Samsung’s 4F2 DRAM storage unit structure research is successful, the chip die area can be reduced by around 30% compared to existing 6F2 DRAM storage unit structures without changing the node.
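As a back-of-the-envelope check of that figure (our own arithmetic, not Samsung's), a cell layout expressed in multiples of the minimum feature size F shrinks by roughly one third when moving from 6F² to 4F², consistent with the reported ~30% die-area saving:

```python
def cell_area(layout_factor, feature_size_nm):
    """Cell footprint for an xF^2 layout at feature size F (in nm^2)."""
    return layout_factor * feature_size_nm ** 2

F = 14  # example feature size in nm; the actual node is not disclosed
area_6f2 = cell_area(6, F)
area_4f2 = cell_area(4, F)

print(f"6F2 cell: {area_6f2} nm^2, 4F2 cell: {area_4f2} nm^2")
print(f"area reduction: {1 - area_4f2 / area_6f2:.0%}")  # ~33%, in line with the ~30% reported
```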
In October of the same year, at its “Memory Technology Day” event, Samsung Electronics announced plans to adopt a new 3D structure, rather than the existing 2D planar structure, for next-generation DRAM at sub-10nm nodes. The goal of the project is to push single-chip capacity beyond 100Gb.
At the “VLSI Symposium” held in Japan last year, Samsung Electronics presented a paper containing research results on 3D DRAM and showcased detailed images of 3D DRAM as an actual semiconductor implementation.
According to a report by The Economic Times, Samsung Electronics recently announced the opening of a new R&D laboratory in Silicon Valley, USA, dedicated to the development of next-generation 3D DRAM.
The laboratory operates under Device Solutions America (DSA), the Silicon Valley-based unit that oversees Samsung’s semiconductor operations in the United States, and will focus on developing next-generation DRAM products.
According to SK Hynix’s research, the IGZO (indium gallium zinc oxide) channel is attracting attention as a way to improve the refresh characteristics of DRAM.
IGZO thin-film transistors have long been used in the display industry thanks to their moderate carrier mobility, extremely low leakage current, and substrate-size scalability, making them a candidate stackable channel material for future DRAM.
NEO Semiconductor, a US memory technology company, has introduced 3D X-DRAM, a technology aimed at overcoming the capacity limitations of DRAM.
3D X-DRAM features what NEO describes as the first DRAM cell array based on Floating Body Cell (FBC) technology, structured akin to 3D NAND. As with 3D NAND Flash, capacity is increased by stacking layers. Built with a 3D NAND-like process, the cell array can be formed vertically with the addition of a single mask layer, offering high yield, low cost, and a significant density boost.
According to Neo’s estimates, the 3D X-DRAM technology can achieve a density of 128 Gb across 230 layers, which is eight times the current density of DRAM. NEO proposes a target of an eightfold capacity increase every decade, aiming to achieve a capacity of 1Tb between 2030 and 2035, representing a 64-fold increase compared to the current core capacity of DRAM.
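The multiples NEO quotes can be cross-checked against a 16 Gb die, the mainstream monolithic DRAM density those ratios imply (the baseline is our assumption; NEO does not spell it out):

```python
# Baseline implied by the article's own ratios: 128 Gb / 8 = 16 Gb per die today.
current_die_gb = 16

x_dram_gen1_gb = 128      # claimed density of the 230-layer 3D X-DRAM
target_2030s_gb = 1024    # 1 Tb target for 2030-2035

print(x_dram_gen1_gb / current_die_gb)   # 8.0  -> "eight times the current density"
print(target_2030s_gb / current_die_gb)  # 64.0 -> "a 64-fold increase"
```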
This expansion is intended to meet the growing demand for high-performance and large-capacity semiconductor storage, especially for AI applications like ChatGPT.
“3D X-DRAM will be the absolute future growth driver for the Semiconductor industry,” said Andy Hsu, Founder and CEO of NEO Semiconductor.
A research team at the Tokyo Institute of Technology in Japan has introduced a groundbreaking 3D DRAM stacking design technology called BBCube, which enables superior integration between processing units and DRAM.
The most significant aspect of BBCube 3D lies in connecting processing units and DRAM in three dimensions rather than through traditional two-dimensional linkages. The team employs an innovative stacked structure in which the processing-unit (PU) dies sit atop multiple layers of DRAM, all interconnected via through-silicon vias (TSVs).
The overall structure of BBCube 3D is compact, devoid of typical solder microbumps, and utilizes TSVs instead of longer wires, collectively contributing to achieving low parasitic capacitance and low resistance, thereby enhancing the electrical performance of the device in various aspects.
The research team evaluated the speed of the new architecture and compared it with two of the most advanced memory technologies, DDR5 and HBM2E. Researchers claim that BBCube 3D could potentially achieve a bandwidth of 1.6 terabytes per second, which is 30 times higher than DDR5 and 4 times higher than HBM2E.
Furthermore, due to features like low thermal resistance and low impedance in BBCube, potential thermal management and power issues associated with 3D integration could be mitigated. The new technology significantly improves bandwidth while consuming only 1/20 and 1/5 of the bit access energy compared to DDR5 and HBM2E, respectively.
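Working backward from the quoted multiples gives a sense of the baselines being compared, roughly one DDR5 channel and one HBM2E stack (the figures below are derived purely from the ratios cited above, not taken from the paper):

```python
bbcube_bw_gbs = 1600            # claimed BBCube 3D bandwidth in GB/s (1.6 TB/s)

implied_ddr5_bw = bbcube_bw_gbs / 30   # ~53 GB/s, on the order of a single DDR5 channel
implied_hbm2e_bw = bbcube_bw_gbs / 4   # 400 GB/s, on the order of a single HBM2E stack
print(f"implied DDR5 baseline:  {implied_ddr5_bw:.0f} GB/s")
print(f"implied HBM2E baseline: {implied_hbm2e_bw:.0f} GB/s")

# Claimed bit-access energy: 1/20 of DDR5's and 1/5 of HBM2E's, so moving the same
# amount of data would draw proportionally less power.
print(f"energy vs DDR5: {1/20:.0%}, vs HBM2E: {1/5:.0%}")
```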
The evolution of DRAM technology from 1D to 2D and now to the diverse structures of 3D has offered the industry various solutions to address its challenges. However, optimizing and improving manufacturing costs, durability, and reliability remain significant challenges in advancing 3D DRAM technology. Due to the difficulties in developing new materials and physical limitations, the commercialization of 3D DRAM still requires some time.
Based on current research progress, the industry is actively developing 3D DRAM, which is still in its early stages. Industry insiders predict that 3D DRAM will begin to emerge around 2025, with mass production becoming feasible after 2030.
(Photo credit: Samsung)
News
Amid the AI trend, the significance of high-value-added DRAM represented by HBM continues to grow.
HBM (High Bandwidth Memory) is a 3D-stacked DRAM that offers higher bandwidth, higher capacity, lower latency, and lower power consumption than traditional DRAM chips. It accelerates AI data processing and is particularly suited to high-performance computing workloads like ChatGPT, making it highly valued by memory giants in recent years.
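The bandwidth advantage comes largely from interface width: each HBM stack connects to the host over a very wide 1024-bit interface at a moderate per-pin rate. Using the commonly cited HBM3 figure of about 6.4 Gb/s per pin (our assumption for illustration, not a number from this article), one stack delivers on the order of 800 GB/s:

```python
def stack_bandwidth_gbs(pins, gbit_per_pin):
    """Peak bandwidth in GB/s: pin count times per-pin rate, converted from Gb to GB."""
    return pins * gbit_per_pin / 8

# HBM3: 1024-bit interface at ~6.4 Gb/s per pin (commonly cited, assumed here)
print(stack_bandwidth_gbs(1024, 6.4))   # ~819 GB/s per stack

# A conventional 64-bit DDR5-6400 channel, for contrast
print(stack_bandwidth_gbs(64, 6.4))     # ~51 GB/s per channel
```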
Memory is also one of Korea’s pillar industries, and to seize the AI opportunity and drive the development of the memory industry, Korea has recently designated HBM as a national strategic technology.
The country will provide tax incentives to companies like Samsung Electronics: small and medium-sized enterprises in Korea can enjoy tax reductions of up to 40% to 50%, while large enterprises like Samsung Electronics can benefit from reductions of up to 30% to 40%.
Overview of HBM Development Progress Among Top Manufacturers
The HBM market is currently dominated by three memory giants: Samsung, SK Hynix, and Micron. Since the introduction of the first silicon-interposer HBM product in 2014, the technology has progressed through iterative innovation from HBM to HBM2, HBM2E, HBM3, and now HBM3e.
According to research by TrendForce, the mainstream HBM in the market in 2023 is HBM2e. This includes specifications used in NVIDIA A100/A800, AMD MI200, and most CSPs’ self-developed acceleration chips. To meet the evolving demands of AI accelerator chips, various manufacturers are planning to launch new products like HBM3e in 2024, expecting HBM3 and HBM3e to become the market norm.
On the HBM3e front, Micron provided its 8hi (24GB) samples to NVIDIA by the end of July, SK hynix in mid-August, and Samsung in early October.
As for the higher-spec HBM4, TrendForce expects its potential launch in 2026. With the push for higher computational performance, HBM4 is set to expand from the current 12-layer (12hi) to 16-layer (16hi) stacks, spurring demand for new hybrid bonding techniques. HBM4 12hi products are set for a 2026 launch, with 16hi models following in 2027.
Meeting Demand, Manufacturers Actively Expand HBM Production
As companies like NVIDIA and AMD continue to introduce high-performance GPU products, the three major manufacturers are actively planning the mass production of HBM with corresponding specifications.
Previously, media reports highlighted Samsung’s efforts to expand HBM production capacity by acquiring certain buildings and equipment within Samsung Display’s Cheonan facility.
Samsung plans to establish a new packaging line at the Cheonan plant dedicated to large-scale HBM production. The company has already invested KRW 10.5 trillion in the acquisition of the mentioned assets and equipment, with an additional investment of KRW 700 billion to KRW 1 trillion.
Micron Technology’s Taichung Fab 4 in Taiwan was officially inaugurated in early November 2023. Micron stated that Taichung Fab 4 would integrate advanced probing and packaging testing functions to mass-produce HBM3e and other products, thereby meeting the increasing demand for various applications such as artificial intelligence, data centers, edge computing, and the cloud. The company plans to start shipping HBM3e in early 2024.
In its latest financial report, SK Hynix stated that in the DRAM sector in 2023, its main products DDR5 DRAM and HBM3 experienced revenue growth of over fourfold and fivefold, respectively, compared to the previous year.
At the same time, in response to the growing demand for high-performance DRAM, SK Hynix will smoothly carry out the mass production of HBM3e for AI applications and the research and development of HBM4.
(Photo credit: SK Hynix)
News
South Korean memory giant SK Hynix has released its financial results for the fourth quarter of 2023 and the full year ended December 31, 2023. In the fourth quarter, revenue reached KRW 11.306 trillion, operating profit amounted to KRW 346 billion, and net loss was KRW 1.38 trillion. The operating profit margin for Q4 2023 was 3%, with a net profit margin of negative 12%.
SK Hynix noted that, with the rebound in the memory market, the operating profit for the fourth quarter of 2023 reached KRW 346 billion, successfully marking a turnaround from losses. This signifies that SK Hynix, in just one year, has managed to break free from the continuous operating losses experienced since the fourth quarter of 2022.
SK Hynix emphasized that overall memory market conditions improved in the last quarter of 2023, with demand for AI server and mobile applications increasing and average selling prices (ASP) rising.
Simultaneously, the effective implementation of a profit-oriented business plan by SK Hynix has enabled the company to achieve the goal of turning losses into profits within just one year.
Furthermore, SK Hynix has reduced the cumulative scale of operating losses that persisted until Q3 2023. In total, the consolidated revenue for 2023 reached KRW 32.766 trillion, with an operating loss of KRW 7.73 trillion and a net loss of KRW 9.138 trillion. Overall, the operating loss rate for 2023 is 24%, and the net loss rate is 27%.
SK Hynix also notes that in the DRAM sector for 2023, the company actively addressed customer demands. The revenue for the company’s flagship products, DDR5 and HBM3, increased by more than four and five times, respectively, compared to 2022.
Additionally, considering the relatively slow recovery in the NAND Flash memory market, the business plan primarily focuses on investment and cost efficiency.
In response to the growing trend in demand for high-performance DRAM, SK Hynix will smoothly proceed with the mass production of HBM3e memory for AI and the development of HBM4.
TrendForce’s earlier research into the HBM market indicates that NVIDIA plans to diversify its HBM suppliers for more robust and efficient supply chain management. On the HBM3e front, SK hynix provided its 8hi (24GB) samples to NVIDIA in mid-August.
Simultaneously, the company aims to supply high-performance and high-capacity products like DDR5 and LPDDR5T to the server and mobile markets.
Moreover, to address the continued growth in demand for AI servers and the widespread adoption of edge AI computing, SK Hynix will focus on developing the high-capacity server module MCR DIMM and the mobile module LPCAMM2.
For NAND, the company aims to continue improving profitability and stabilizing the business by expanding sales of premium products such as eSSD and strengthening internal management.
Lastly, SK Hynix emphasizes its commitment to maintaining and enhancing profitability and efficiency by continuing to expand the production of high-value-added products in 2024, similar to its strategy in 2023. The company will focus on minimizing capital expenditures while prioritizing stable business operations.
“We achieved a remarkable turnaround, marking the first operating profit in the fourth quarter following a protracted downturn, thanks to our technological leadership in the AI memory space,” said Kim Woohyun, Vice President and Chief Financial Officer (CFO) at SK Hynix.
Kim further stated, “We are now ready to grow into a total AI memory provider by leading changes and presenting customized solutions as we enter an era for a new leap forward.”
(Photo credit: SK Hynix)
News
Pei-Ing Lee, the General Manager of Nanya Technology, a major DRAM manufacturer, said on January 10th that DRAM prices are trending upward this year.
According to Economic Daily News, citing Nanya Technology’s 4Q23 earnings call, this trend is attributed to the resurgence of the smartphone market, increased demand fueled by AI, and the three major memory manufacturers pivoting toward DDR5 production. This shift helps deplete DDR4 inventory and could potentially result in a supply shortage.
Having endured over a year of downturn in the memory market, Lee expressed an optimistic outlook by stating that “there is a possibility of future supply shortages,” revealing an overall positive trajectory for the DRAM market.
Lee acknowledged that the DRAM market faced challenges last year, resulting in stagnant bit sales for Nanya Technology. However, he anticipates a better scenario this year, noting the upward trend in DDR4 pricing. The timing for DDR3 price increases is expected to follow but at a slower pace. Lee further stated that DDR3 constituted about 40% of Nanya Technology’s revenue in the past, but it is expected to decrease, with DDR4’s share rising.
Due to major international players focusing on High-Bandwidth Memory (HBM) and DDR5, he anticipates a potential supply shortage for DDR4 this year.
Lee pointed out that the growth in AI demand is positively impacting the DRAM market. With high-end supply shifting toward HBM and from DDR4 to DDR5, demand conditions are improving quarter by quarter.
Regarding pricing trends, he confirmed a rebound in prices in the fourth quarter of 2023 and expressed optimism for a gradual upward trend in 2024. However, Lee cautioned that external variables such as geopolitical tensions, the war in Europe, and the U.S.-China trade dispute could still impact the market’s recovery momentum.
In terms of demand, Lee highlighted four key points. First, server demand is being driven by AI servers, with IT spending by U.S. cloud companies the main indicator to watch. Second, the introduction of new smartphones is raising average DRAM capacity per device, with AI smartphones in particular boosting the high-end segment. Smartphone sales in China are currently improving, and the recovery momentum of the Chinese economy will be crucial.
In the PC application sector, Lee mentioned that inventory is gradually returning to normal levels, and AI PCs will simultaneously boost the high-end PC market. As for consumer electronic terminal products, demand for IP cameras, networking, industrial control, and automotive applications is relatively healthy, with consumer electronic products expected to show stable growth in 2024.
In terms of technological advancements, Nanya Technology aims to begin small-scale production of DDR5 products at the end of the third quarter of this year. Initially targeted at servers and partly at PCs, the first product is expected to run at a data rate of 5600 MT/s, while the second product, currently in the design phase, is targeting 6400 MT/s.
Lee explained that the second DDR5 product will use Nanya's third-generation process, aiming to further improve cost structure, increase speed to the 6400 MT/s target, and support high density and 3D IC technology.
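For context, the quoted speed grades are per-pin data rates; on a standard 64-bit module channel they translate into the following peak bandwidths (a straightforward conversion assuming the conventional 8 bytes per transfer, not anything Nanya has stated):

```python
def channel_bandwidth_gbs(data_rate_mts, bus_width_bits=64):
    """Peak bandwidth of one memory channel: transfers/s times bytes per transfer."""
    return data_rate_mts * 1e6 * (bus_width_bits / 8) / 1e9

for rate in (5600, 6400):   # the two DDR5 speed grades Nanya mentions
    print(f"DDR5-{rate}: {channel_bandwidth_gbs(rate):.1f} GB/s per 64-bit channel")
# DDR5-5600: 44.8 GB/s, DDR5-6400: 51.2 GB/s
```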
(Photo credit: Nanya Technology)
News
SK Hynix CEO Kwak Noh-Jung expressed optimism at the Consumer Electronics Show (CES) in the United States, stating that artificial intelligence (AI) chips would propel SK Hynix’s market value to double within three years, reaching KRW 200 trillion (approximately USD 152 billion).
Kwak also revealed plans to adjust the DRAM production reduction policy in the first quarter, while anticipating changes in NAND Flash production strategy in the latter half of the year.
At the CES exhibition in Las Vegas, Kwak emphasized that generative AI is gradually becoming widespread and that memory is increasingly crucial. As AI systems advance, customer demands for memory will become more diverse, and Kwak highlighted the development of a platform to offer customized options to different customers.
“If we prepare the products we are currently producing well, pay attention to maximising investment efficiency and maintaining financial soundness, I think we can attempt to double the current market capitalisation of 100 trillion won to 200 trillion won within three years,” Kwak said.
Kwak further stated at CES: “There are only three HBM providers in the market. What I can say for sure is that SK Hynix is a clear leader in the HBM space.”
For the current HBM market, as reported by TrendForce earlier, SK hynix holds the lead in HBM3 production, serving as the principal supplier for NVIDIA’s server GPUs.
Samsung, on the other hand, is focusing on satisfying orders from other CSPs. The gap in market share between Samsung and SK hynix is expected to narrow significantly in 2023 due to an increasing number of orders for Samsung from CSPs. Both firms are predicted to command similar shares of the HBM market sometime between 2023 and 2024, collectively occupying around 95%.
Meanwhile, when asked if SK Hynix would ease its current chip production reduction policy, Kwak responded that the company’s policies are flexible and will be adjusted based on different product categories.
He mentioned that SK Hynix might change its DRAM production reduction policy in the first quarter, while adjustments for NAND Flash are anticipated to take place in the latter half of the year.
(Photo credit: SK Hynix)