2023/11/27
TrendForce’s latest research into the HBM market indicates that NVIDIA plans to diversify its HBM suppliers for more robust and efficient supply chain management. Samsung’s HBM3 (24GB) is anticipated to complete verification with NVIDIA by December this year. The progress of HBM3e, as outlined in the timeline below, shows that Micron provided its 8hi (24GB) samples to NVIDIA by the end of July, SK hynix in mid-August, and Samsung in early October.
Given the intricacy of the HBM verification process—estimated to take two quarters—TrendForce expects that some manufacturers might learn preliminary HBM3e results by the end of 2023. However, it’s generally anticipated that major manufacturers will have definite results by 1Q24. Notably, the outcomes will influence NVIDIA’s procurement decisions for 2024, as final evaluations are still underway.
NVIDIA continues to dominate the high-end chip market, expanding its lineup of advanced AI chips
As 2024 rolls around, a number of AI chip suppliers are pushing out their latest product offerings. NVIDIA’s current high-end AI lineup for 2023, which utilizes HBM, includes models like the A100/A800 and H100/H800. In 2024, NVIDIA plans to refine its product portfolio further. New additions will include the H200, using 6 HBM3e chips, and the B100, using 8 HBM3e chips. NVIDIA will also integrate its own Arm-based CPUs and GPUs to launch the GH200 and GB200, enhancing its lineup with more specialized and powerful AI solutions.
By contrast, AMD’s 2024 focus is on the MI300 series with HBM3, transitioning to HBM3e for the next-gen MI350. The company is expected to start HBM verification for the MI350 in 2H24, with a significant product ramp-up projected for 1Q25.
Intel Habana launched the Gaudi 2 in 2H22, which utilizes 6 HBM2e stacks. Its upcoming Gaudi 3—slated for mid-2024—is expected to continue using HBM2e but will be upgraded to 8 stacks. TrendForce believes that NVIDIA, with its cutting-edge HBM specifications, product readiness, and strategic timeline, is poised to maintain a leading position in the GPU segment, and, by extension, in the competitive AI chip market.
HBM4 may turn toward customization beyond commodity DRAM
HBM4 is expected to launch in 2026, with enhanced specifications and performance tailored to future products from NVIDIA and other CSPs. Driven by a push toward higher speeds, HBM4 will mark the first use of a 12nm process wafer for its bottommost logic die (base die), to be supplied by foundries. This advancement signifies a collaborative effort between foundries and memory suppliers for each HBM product, reflecting the evolving landscape of high-speed memory technology.
With the push for higher computational performance, HBM4 is set to expand from the current 12-layer (12hi) to 16-layer (16hi) stacks, spurring demand for new hybrid bonding techniques. HBM4 12hi products are set for a 2026 launch, with 16hi models following in 2027.
Finally, TrendForce notes a significant shift toward customization demand in the HBM4 market. Buyers are initiating custom specifications, moving beyond traditional layouts adjacent to the SoC and exploring options like stacking HBM directly on top of the SoC. While these possibilities are still being evaluated, TrendForce anticipates a more tailored approach for the future of the HBM industry.
This move toward customization, as opposed to the standardized approach of commodity DRAM, is expected to bring about unique design and pricing strategies, marking a departure from traditional frameworks and heralding an era of specialized production in HBM technology.
For more information on reports and market data from TrendForce’s Department of Semiconductor Research, please click here, or email the Sales Department at SR_MI@trendforce.com.
For additional insights from TrendForce analysts on the latest tech industry news, trends, and forecasts, please visit https://www.trendforce.com/news/
TrendForce will hold an offline seminar on December 14 (Thursday) at the Tokyo Bay Ariake Washington Hotel, during SEMICON Japan. The event will focus on the dynamics and outlook of the 2024 tech market, including memory, the semiconductor industry between Japan and Taiwan, foundry capacity changes, and the automotive and consumer electronics sectors. For visitor attendance, please contact alanchen@trendforce.com or angelaliao@trendforce.com; for media attendance, please contact estherfeng@trendforce.com.
2023/10/13
TrendForce reports indicate a universal price increase for both DRAM and NAND Flash starting in the fourth quarter. DRAM prices, for instance, are projected to see a quarterly surge of about 3–8%. Whether this upward momentum can be sustained will hinge on the suppliers' steadfastness in maintaining production cuts and the degree of resurgence in actual demand, with the general-purpose server market being a critical determinant.
PC DRAM: DDR5 prices, having already surged in the third quarter, are expected to maintain their upward trajectory, fueled by the stocking of new CPU models. This forthcoming price hike cycle for both DDR4 and DDR5 is incentivizing PC OEMs to proceed with purchases. Although manufacturers still have substantial inventory and there's no imminent shortage, Samsung has been nudged to further slash its production. However, facing negative gross margins on DRAM products, most manufacturers are resistant to further price reductions, instead pushing for aggressive increases. This stance sets the stage for an anticipated rise in DDR4 prices by 0–5% and DDR5 prices by around 3–8% in the fourth quarter. Overall, as DDR5 adoption accelerates, an approximate 3–8% quarterly increase is projected for PC DRAM contract prices during this period.
Server DRAM: Buyer inventory of DDR5 has climbed from 20% in Q2 to 30–35% recently. However, with only 15% being actually utilized in servers in Q3, market uptake is slower than expected. Meanwhile, Samsung's intensified production cutbacks have notably shrunk DDR4 wafer inputs, causing a supply crunch in server DDR4 stocks. This scenario leaves no leeway for further server DDR4 price reductions. In response, manufacturers, aiming to enhance profits, are accelerating DDR5 output.
Looking ahead, Q4 forecasts anticipate stable server DDR4 average prices, while server DDR5 is set to maintain a declining trajectory. With DDR5 shipments on the rise and a notable 50–60% price disparity with DDR4, the blended ASP for the range is poised for an upswing. This leads to an estimated 3–8% quarterly hike in Q4 server DRAM contract prices.
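The mix-shift effect described above can be sketched with a short calculation. The 50–60% DDR5 premium comes from the text; the DDR4 base price and the quarter-over-quarter DDR5 shipment shares are illustrative placeholders, not TrendForce data.

```python
# Illustrative blended-ASP calculation for server DRAM: even while DDR5 prices
# decline, a growing DDR5 share of shipments lifts the blended average because
# DDR5 carries a 50-60% price premium over DDR4 (premium per the article;
# base price and mix shares below are hypothetical).
def blended_asp(ddr4_price: float, ddr5_premium: float, ddr5_share: float) -> float:
    """Shipment-weighted average price across DDR4 and DDR5."""
    ddr5_price = ddr4_price * (1 + ddr5_premium)
    return ddr4_price * (1 - ddr5_share) + ddr5_price * ddr5_share

q3 = blended_asp(100.0, 0.55, 0.15)  # assumed: 15% DDR5 shipment share in Q3
q4 = blended_asp(100.0, 0.55, 0.35)  # assumed: heavier DDR5 mix in Q4
print(q3, q4)  # the Q4 blended ASP is higher purely from the mix shift
```

The point of the sketch is that the blended ASP can rise even when both component prices are flat or falling, which is exactly the dynamic TrendForce describes.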
Mobile DRAM: Inventories have bounced back to healthy levels sooner than other sectors, thanks to price elasticity driving an increase in per-device capacity and revitalizing purchasing enthusiasm in 2H23. On the other hand, although Q4 smartphone production hasn't reached the previous year's levels for the same period, a seasonal increase of over 10% is still supporting demand for mobile DRAM. However, it's crucial to note that current manufacturer inventories remain high, and production cuts haven't yet altered the oversupply situation in the short term. Nevertheless, manufacturers, under profit margin pressures, are insisting on pushing prices upward. For products where inventory is more abundant, such as LPDDR4X or those from older manufacturing processes, the estimated contract price increase will be about 3–8% for the quarter. In contrast, LPDDR5(X) appears to be in tighter supply, with projected contract price increases of 5–10%.
Graphics DRAM: A niche market dynamic and an acceptance of price hikes among buyers suggest sustained procurement of mainstream GDDR6 16Gb chips, preparing for expected price increases in 2024. The launch of NVIDIA's new server GPU, the L40S, in the third quarter is facilitating the depletion of existing manufacturer inventories. Furthermore, gaming notebooks are excelling in sales, surpassing the general notebook market this year. Consequently, manufacturers are experiencing less inventory stress for graphics DRAM than they are for commodity DRAM. This landscape sets the stage for an anticipated 3–8% hike in graphics DRAM contract prices for the fourth quarter.
Consumer DRAM: Samsung initiated significant production reductions starting in September to diminish its surplus of older inventory. These cuts are projected to hit 30% by the fourth quarter. With the anticipation of steadily declining inventories, manufacturers are looking to increase consumer DRAM contract prices, aiming for hikes of more than 10%, to avoid incurring losses. However, even though some producers raised their prices at September's close, demand continues to be lackluster, with purchasing and stock-up efforts not as strong as anticipated. This deviation in pricing goes against the expected supply-demand balance, suggesting a more modest estimated rise of 3–8% in consumer DRAM contract prices for the fourth quarter—below manufacturers' initial targets.
For more information on reports and market data from TrendForce’s Department of Semiconductor Research, please click here, or email the Sales Department at SR_MI@trendforce.com.
For additional insights from TrendForce analysts on the latest tech industry news, trends, and forecasts, please visit our blog at https://www.trendforce.com/news/
2023/08/30
TrendForce expects that memory suppliers will continue their strategy of scaling back production of both DRAM and NAND Flash in 2024, with the cutback being particularly pronounced in the financially struggling NAND Flash sector. Market demand visibility for consumer electronics is projected to remain uncertain in 1H24. Additionally, capital expenditure for general-purpose servers is expected to be weakened due to competition from AI servers. Considering the low baseline set in 2023 and the current low pricing for some memory products, TrendForce anticipates YoY bit demand growth rates for DRAM and NAND Flash to be 13% and 16%, respectively. Nonetheless, achieving effective inventory reduction and restoring supply-demand balance next year will largely hinge on suppliers’ ability to exercise restraint in their production capacities. If managed effectively, this could open up an opportunity for a rebound in average memory prices.
PC: The annual growth rate for average DRAM capacity is projected at approximately 12.4%, driven mainly by Intel’s new Meteor Lake CPUs coming into mass production in 2024. This platform’s DDR5 and LPDDR5 exclusivity will likely make DDR5 the new mainstream, surpassing DDR4 in the latter half of 2024. The growth rate in PC client SSDs will not be as robust as that of PC DRAM, with just an estimated growth of 8–10%. As consumer behavior increasingly shifts toward cloud-based solutions, the demand for laptops with large storage capacities is decreasing. Even though 1 TB models are becoming more available, 512 GB remains the predominant storage option. Furthermore, memory suppliers are maintaining price stability by significantly reducing production. Should prices hit rock bottom and subsequently rebound, PC OEMs are expected to face elevated SSD costs. This, when combined with Windows increasing its licensing fees for storage capacities at and above 1 TB, is likely to put a damper on further growth in average storage capacities.
Servers: The annual growth rate for the average capacity of server DRAM is estimated to reach 17.3%. This surge is primarily fueled by generational transitions in server platforms, a heightened dependency on RAM that is coordinated with CPU cores in specific CSP operations, and the high computational load requirements of AI servers.
Enterprise SSDs: The estimated annual growth rate of average capacity stands at 14.7%. For CSPs, the rollout of processors supporting PCIe 5.0 suggests that inventory levels for these OEMs should return to normal by early next year, likely prompting an uptick in the procurement of 8 TB products. On the brand side for servers, while the sharp decline in NAND Flash prices is propelling an increase in 16 TB deployments, the contribution of AI servers to this growth remains relatively limited.
Smartphones: The annual production growth rate for smartphones in 2024 is estimated to be a modest 2.2%, largely impacted by a global economic downturn. TrendForce cites this as the primary reason for the sluggish growth in demand volume. Thanks to multiple consecutive quarters of declining memory prices, brands are increasingly focusing on hardware competition. Consequently, the average DRAM capacity in smartphones is projected to rise by approximately 14.3% in 2023. Given that the ASP of mobile DRAM is expected to remain at a relatively low point in 2024, this trend is likely to persist, potentially leading to an additional 7.9% growth in average device memory capacity for the year.
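The two growth rates above compound: total mobile DRAM bit demand grows with both unit volume and per-device capacity. The combined figure below is derived here for illustration; TrendForce does not state it.

```python
# Hedged sketch: total bit-demand growth from the two 2024 figures above.
# Total bits = units x bits-per-unit, so the growth rates multiply:
# (1 + unit growth) x (1 + per-device capacity growth) - 1.
UNIT_GROWTH = 0.022       # 2024 smartphone production growth (from the article)
CAPACITY_GROWTH = 0.079   # 2024 average per-device DRAM capacity growth

bit_demand_growth = (1 + UNIT_GROWTH) * (1 + CAPACITY_GROWTH) - 1
print(f"{bit_demand_growth:.1%}")  # roughly 10.3%
```

This illustrates why capacity elasticity matters to suppliers: even with near-flat unit volumes, per-device capacity growth keeps bit demand expanding.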
UFS & eMMC: Factors such as growing demand for image storage and increased 5G penetration are expected to drive up the average storage capacity in smartphones. However, as suppliers scale back production to set the stage for potential price hikes, smartphone OEMs are projected to exercise greater caution in cost management in 2024, potentially leading to fewer mid-to-low-end models offering over 1 TB of storage. On a related note, the absence of enthusiasm for QLC products among smartphone OEMs likely presents an obstacle for suppliers aiming to nudge consumers toward capacity upgrades via lower-cost options. With smartphone storage baselines continually rising, and given Apple's current lack of plans to introduce iPhone models exceeding 1 TB of storage, TrendForce anticipates that the annual growth rate for average smartphone storage capacity will hover around 13% in 2024.
For more information on reports and market data from TrendForce’s Department of Semiconductor Research, please click here, or email Ms. Latte Chung from the Sales Department at lattechung@trendforce.com.
For additional insights from TrendForce analysts on the latest tech industry news, trends, and forecasts, please visit our blog at https://insider.trendforce.com/
2023/08/09
TrendForce highlights in its latest report that memory suppliers are boosting their production capacity in response to escalating orders from NVIDIA and CSPs for their in-house designed chips. These efforts include the expansion of TSV production lines to increase HBM output. Forecasts based on current production plans from suppliers indicate a remarkable 105% annual increase in HBM bit supply by 2024. However, due to the time required for TSV expansion, which encompasses equipment delivery and testing (9 to 12 months), the majority of HBM capacity is expected to materialize by 2Q24.
TrendForce analysis indicates that 2023 to 2024 will be pivotal years for AI development, triggering substantial demand for AI training chips and thereby boosting HBM utilization. However, as the focus pivots to inference, the annual growth rate for AI training chips and HBM is expected to taper off slightly. The imminent boom in HBM production has presented suppliers with a difficult situation: they will need to strike a balance between meeting customer demand to expand market share and avoiding a surplus due to overproduction. Another concern is the potential risk of overbooking, as buyers, anticipating an HBM shortage, might inflate their demand.
HBM3 is slated to considerably elevate HBM revenue in 2024 with its superior ASP
The HBM market in 2022 was marked by sufficient supply. However, an explosive surge in AI demand in 2023 has prompted clients to place advance orders, stretching suppliers to their capacity limits. Looking ahead, TrendForce forecasts that due to aggressive expansion by suppliers, the HBM sufficiency ratio will rise from -2.4% to 0.6% in 2024.
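The article does not define the sufficiency ratio; a common convention (an assumption here) is oversupply relative to demand, (supply − demand) / demand, which reproduces the quoted signs: negative in an undersupplied 2023, slightly positive in 2024.

```python
# Hedged sketch of a sufficiency ratio consistent with the figures above:
# negative when supply falls short of demand, positive in oversupply.
# The bit figures passed in below are illustrative placeholders chosen to
# reproduce -2.4% and +0.6%; they are not TrendForce data.
def sufficiency_ratio(supply_bits: float, demand_bits: float) -> float:
    """(supply - demand) / demand: the fraction of demand left uncovered (<0) or in surplus (>0)."""
    return (supply_bits - demand_bits) / demand_bits

print(sufficiency_ratio(97.6, 100.0))   # -0.024 -> a -2.4% shortfall
print(sufficiency_ratio(100.6, 100.0))  # 0.006 -> a 0.6% surplus
```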
TrendForce believes that the primary demand is shifting from HBM2e to HBM3 in 2023, with an anticipated demand ratio of approximately 50% and 39%, respectively. As an increasing number of chips adopting HBM3 hit the market, demand in 2024 will heavily lean toward HBM3 and eclipse HBM2e, with a projected share of 60%. This expected surge—coupled with a higher ASP—is likely to trigger a significant increase in HBM revenue next year.
SK hynix currently holds the lead in HBM3 production, serving as the principal supplier for NVIDIA’s server GPUs. Samsung, on the other hand, is focusing on satisfying orders from other CSPs. The gap in market share between Samsung and SK hynix is expected to narrow significantly this year due to an increasing number of orders for Samsung from CSPs. Both firms are predicted to command similar shares in the HBM market sometime between 2023 and 2024—collectively occupying around 95%. However, variations in shipment performance may arise across different quarters due to their distinct customer bases. Micron, which has focused mainly on HBM3e development, may witness a slight decrease in market share in the next two years due to the aggressive expansion plans of these two South Korean manufacturers.
Prices for older HBM generations are expected to drop in 2024, while HBM3 prices may remain steady
From a long-term perspective, TrendForce notes that the ASP of HBM products gradually decreases year on year. Given HBM’s high-profit nature and a unit price far exceeding other types of DRAM products, suppliers aim to incrementally reduce prices to stimulate customer demand, leading to a price decline for HBM2e and HBM2 in 2023.
Even though suppliers have yet to finalize their pricing strategies for 2024, TrendForce doesn’t rule out the possibility of further price reductions for HBM2 and HBM2e products, given a significant improvement in the overall HBM supply and suppliers’ endeavors to broaden their market shares. Meanwhile, HBM3 prices are forecast to stay consistent with 2023. Owing to its significantly higher ASP compared to HBM2e and HBM2, HBM3 is poised to bolster suppliers’ HBM revenue, potentially propelling total HBM revenue to a whopping US$8.9 billion in 2024—a 127% YoY increase.
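The 2024 forecast above implies a 2023 baseline that the article leaves unstated; backing it out is simple arithmetic on the two quoted figures.

```python
# Implied 2023 HBM revenue, derived from the article's 2024 forecast:
# US$8.9 billion at +127% YoY means 2023 revenue = 8.9 / (1 + 1.27).
# The ~US$3.9B result is derived here, not quoted by TrendForce.
REVENUE_2024_BUSD = 8.9   # forecast 2024 HBM revenue, US$ billions
YOY_GROWTH = 1.27         # 127% year-over-year increase

revenue_2023_busd = REVENUE_2024_BUSD / (1 + YOY_GROWTH)
print(f"Implied 2023 HBM revenue: ~US${revenue_2023_busd:.2f}B")
```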
For more information on reports and market data from TrendForce’s Department of Semiconductor Research, please click here, or email Ms. Latte Chung from the Sales Department at lattechung@trendforce.com.
For additional insights from TrendForce analysts on the latest tech industry news, trends, and forecasts, please visit our blog at https://insider.trendforce.com/
2023/08/01
TrendForce reports that the HBM (High Bandwidth Memory) market's dominant product for 2023 is HBM2e, employed by the NVIDIA A100/A800, AMD MI200, and most CSPs' (Cloud Service Providers) self-developed accelerator chips. As the demand for AI accelerator chips evolves, manufacturers plan to introduce new HBM3e products in 2024, with HBM3 and HBM3e expected to become mainstream in the market next year.
The distinctions between HBM generations primarily lie in their speed. The industry experienced a proliferation of confusing names when transitioning to the HBM3 generation. TrendForce clarifies that the so-called HBM3 in the current market should be subdivided into two categories based on speed. One category includes HBM3 running at speeds between 5.6 and 6.4 Gbps, while the other features the 8 Gbps HBM3e, which also goes by several names, including HBM3P, HBM3A, HBM3+, and HBM3 Gen2.
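The per-pin speeds above translate into per-stack bandwidth once multiplied by the interface width. The 1024-bit interface is the standard HBM configuration and is an assumption here; the article only gives per-pin speeds.

```python
# Per-stack bandwidth implied by the pin speeds quoted above, assuming the
# standard 1024-bit HBM interface (interface width is an assumption, not
# stated in the article).
INTERFACE_BITS = 1024  # pins per HBM stack

def stack_bandwidth_gbytes(pin_speed_gbps: float) -> float:
    """Per-stack bandwidth in GB/s: per-pin Gbps x pins, divided by 8 bits/byte."""
    return pin_speed_gbps * INTERFACE_BITS / 8

print(stack_bandwidth_gbytes(6.4))  # top-bin HBM3: 819.2 GB/s per stack
print(stack_bandwidth_gbytes(8.0))  # HBM3e: 1024.0 GB/s per stack
```

This is why the naming matters commercially: the roughly 25% jump in per-pin speed from HBM3 to HBM3e carries straight through to stack bandwidth.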
The development status of HBM by the three major manufacturers, SK hynix, Samsung, and Micron, varies. SK hynix and Samsung began their efforts with HBM3, which is used in NVIDIA's H100/H800 and AMD's MI300 series products. These two manufacturers are also expected to sample HBM3e in Q1 2024. Meanwhile, Micron chose to skip HBM3 and directly develop HBM3e.
HBM3e will be stacked with 24Gb mono dies, and under the 8-layer (8Hi) foundation, the capacity of a single HBM3e stack will jump to 24GB. This is anticipated to feature in NVIDIA’s GB100, set to launch in 2025. Hence, major manufacturers are expected to release HBM3e samples in Q1 2024 and aim to mass-produce them by 2H 2024.
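The 24GB figure follows directly from the die density and stack height quoted above; the arithmetic can be checked in a few lines.

```python
# Capacity of an 8-high HBM3e stack built from 24Gb mono dies, using only
# figures stated in the article: 24 Gb/die x 8 layers = 192 Gb = 24 GB.
DIE_DENSITY_GBIT = 24   # gigabits per mono die
LAYERS = 8              # 8Hi stack

stack_gbit = DIE_DENSITY_GBIT * LAYERS   # 192 Gb total
stack_gbyte = stack_gbit // 8            # 8 bits per byte -> 24 GB
print(f"{stack_gbit} Gb = {stack_gbyte} GB per stack")
```

The same arithmetic explains the HBM4 roadmap discussed earlier: moving from 12hi to 16hi stacks raises per-stack capacity by a third at a given die density.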
CSPs are developing their own AI chips in an effort to reduce dependency on NVIDIA and AMD
NVIDIA continues to command the highest market share when it comes to AI server accelerator chips. However, the high costs associated with NVIDIA’s H100/H800 GPUs, priced at between US$20,000 and $25,000 per unit, coupled with an AI server’s recommended eight-card configuration, have dramatically increased the total cost of ownership. Therefore, while CSPs will continue to source server GPUs from NVIDIA or AMD, they are concurrently planning to develop their own AI accelerator chips.
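The GPU-only portion of that cost is easy to bound from the figures above; everything else in a server (CPUs, memory, networking, chassis) comes on top and is excluded here.

```python
# Rough GPU-only cost of one AI server, per the figures in the article:
# eight H100/H800 cards at US$20,000-25,000 each. Other server components
# are deliberately excluded from this sketch.
CARDS_PER_SERVER = 8
PRICE_LOW_USD, PRICE_HIGH_USD = 20_000, 25_000

gpu_cost_low = CARDS_PER_SERVER * PRICE_LOW_USD    # US$160,000
gpu_cost_high = CARDS_PER_SERVER * PRICE_HIGH_USD  # US$200,000
print(f"GPU cost per server: US${gpu_cost_low:,}-{gpu_cost_high:,}")
```

At US$160,000–200,000 per server in GPUs alone, the incentive for CSPs to amortize in-house chip development across their fleets is clear.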
Tech giants Google and Amazon Web Services (AWS) have already made significant strides in this area with the Google Tensor Processing Unit (TPU) and AWS’s Trainium and Inferentia chips. These two industry leaders are already hard at work on their next-generation AI accelerators, which are set to utilize HBM3 or HBM3e technology. Furthermore, other CSPs in North America and China are also conducting related verifications, signaling a potential surge in competition in the AI accelerator chip market in the coming years.
For more information on reports and market data from TrendForce’s Department of Semiconductor Research, please click here, or email Ms. Latte Chung from the Sales Department at lattechung@trendforce.com.
For additional insights from TrendForce analysts on the latest tech industry news, trends, and forecasts, please visit our blog at https://insider.trendforce.com/