DRAM


2024-08-07

[Insights] Memory Spot Price Update: DRAM Spot Trading Remains Limited as DDR4 Spot Prices Continue to Fall

According to TrendForce’s latest memory spot price trend report, neither DRAM nor NAND spot prices saw much momentum. Spot prices of DDR5 products are relatively stable, while those of DDR4 products continue to fall gradually due to high inventory levels. As for NAND Flash, the spot market saw no apparent changes from last week, with transactions remaining restricted, likewise due to sufficient inventory. Details are as follows:

DRAM Spot Price:

The market has not shown notable changes in terms of momentum, and spot prices of DDR5 products are relatively stable. As for DDR4 products, spot prices continue to fall gradually due to high inventory levels. Overall, spot trading volume remains quite limited, constrained by weak consumer demand. The average spot price of the mainstream chips (i.e., DDR4 1Gx8 2666MT/s) dropped by 0.10% from US$1.991 last week to US$1.989 this week.

NAND Flash Spot Price:

The spot market saw no apparent changes from last week, with transactions remaining at a restricted level, likewise due to sufficient inventory. Spot prices of 512Gb TLC wafers rose by 1.17% this week, arriving at US$3.291.
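For reference, the quoted percentage moves can be reproduced as simple week-over-week changes. The short Python sketch below is illustrative only; the prior-week 512Gb TLC wafer price is back-calculated from the quoted 1.17% rise and is not a figure taken from the report.

    # Illustrative week-over-week spot price arithmetic (not part of the report)
    ddr4_last, ddr4_now = 1.991, 1.989          # US$, DDR4 1Gx8 2666MT/s average spot price
    ddr4_change = (ddr4_now - ddr4_last) / ddr4_last * 100
    print(f"DDR4 change: {ddr4_change:.2f}%")   # -> -0.10%

    tlc_now, tlc_rise = 3.291, 1.17             # US$, 512Gb TLC wafer; quoted % rise
    tlc_last = tlc_now / (1 + tlc_rise / 100)   # implied prior-week price (assumption)
    print(f"Implied prior 512Gb TLC wafer price: US${tlc_last:.3f}")  # ~US$3.253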

2024-08-07

[News] Market Rumors Suggest Samsung’s HBM3e Passed NVIDIA Test, Though Samsung Denies

According to a report from Reuters citing industry sources, Samsung Electronics’ fifth-generation high-bandwidth memory (HBM3e) has passed NVIDIA’s tests and could be used in NVIDIA’s AI processors.

The report further indicates that while no supply contract has been signed yet, one is expected soon, with potential deliveries starting in the fourth quarter of this year. The report also notes that the tested HBM3e chips are 8-layer, while Samsung’s 12-layer HBM3e has yet to pass testing.

However, in response to the matter, Samsung Electronics stated in a report from BusinessKorea on August 7 that it could not confirm stories related to its customers and that the report was not true.

The Samsung Electronics official cited by BusinessKorea also mentioned that, as previously stated during a conference call last month, the quality testing is still ongoing and there have been no updates since then.

Samsung had been working since last year to become a supplier of HBM3 and HBM3e to NVIDIA. In late July, it was reported that Samsung’s HBM3 had passed NVIDIA’s qualification and would be used in the AI giant’s H20, which has been developed for the Chinese market in compliance with U.S. export controls.

Read more

(Photo credit: Samsung)

Please note that this article cites information from Reuters and BusinessKorea.
2024-08-07

[News] SK hynix Secures up to USD 450 Million Funding for Indiana Packaging Facility under CHIPS Act

SK hynix, the current High Bandwidth Memory (HBM) market leader, announced on August 6th that it has signed a non-binding preliminary memorandum of terms with the U.S. Department of Commerce to receive up to USD 450 million in proposed direct funding and access to proposed loans of USD 500 million as part of the CHIPS and Science Act. The funding, according to its press release, will be used to build a production base for semiconductor packaging in Indiana.

Earlier, in April, the other two memory giants, Samsung and Micron, also secured funding under the CHIPS and Science Act, receiving USD 6.4 billion and USD 6.1 billion, respectively.

SK hynix also noted in its press release that it plans to seek from the U.S. Department of the Treasury a tax benefit equivalent to up to 25% of qualified capital expenditures through the Investment Tax Credit program.

The South Korean memory chip maker also said that it will proceed with the construction of the Indiana production base as planned to supply AI memory products, and that it looks forward to contributing to a more resilient global semiconductor supply chain.

The signing follows SK hynix’s announcement in April that it plans to invest USD 3.87 billion to build a production base for advanced packaging in Indiana, a move expected to create around 1,000 jobs. According to a previous report by The Wall Street Journal, the advanced packaging fab is expected to commence operations by 2028.

As the major HBM supplier to AI giant NVIDIA, SK hynix has good reason to accelerate the pace of capacity expansion. The recent NVIDIA Blackwell B200, with each GPU utilizing eight HBM3e chips, has also underscored SK hynix’s role in the critical components supply chain for the AI industry.

On the other hand, a week earlier, semiconductor equipment leader Applied Materials was reportedly denied funding under the CHIPS Act for an R&D center in Silicon Valley aimed at developing next-generation chipmaking tools. The company had sought U.S. funding for a USD 4 billion facility in Sunnyvale, California, slated to be completed in 2026.

Read more

(Photo credit: SK hynix)

Please note that this article cites information from SK hynix and The Wall Street Journal.
2024-08-07

[News] Chinese Tech Giants Reportedly Stockpile Samsung’s HBM ahead of Potential U.S. Restrictions

With the chip war between the two great powers heating up, the U.S. is reportedly mulling new measures to limit China’s access to AI memory. As the restrictions might be imposed as early as late August, rumor has it that Chinese tech giants like Huawei and Baidu, along with other startups, are stockpiling high bandwidth memory (HBM) semiconductors from Samsung Electronics, according to the latest report by Reuters.

Citing sources familiar with the matter, the report notes that these companies have increased their purchases of AI-capable semiconductors since early this year. One source states that, in line with this trend, China accounted for around 30% of Samsung’s HBM revenue in 1H24.

Regarding the details of the potential restrictions, sources cited by Reuters said that U.S. authorities are anticipated to establish guidelines for restricting access to HBM chips. While the U.S. Department of Commerce declined to comment, it did state last week that the government is continually evaluating the evolving threat landscape and updating export controls.

The Big Three in the memory sector, Samsung, SK hynix and Micron, are all working on their 4th generation (HBM3) and 5th generation (HBM3e) products, while closely cooperating with AI giants such as NVIDIA and AMD in developing AI accelerators.

Reuters notes that the recent surge in HBM demand from China has primarily focused on HBM2e, which is two generations behind HBM3e. However, as other manufacturers’ capacity is already fully booked by American AI companies, China has turned to Samsung for its HBM demand.

Sources cited by Reuters also indicate that a wide range of businesses, from satellite manufacturers to tech firms like Tencent, have been purchasing these HBM chips. Meanwhile, Huawei has been using Samsung HBM2e to produce its advanced Ascend AI chip, according to one of the sources. It is also reported that Chinese memory giant ChangXin Memory Technologies (CXMT) has started mass production of HBM2.

Samsung and SK hynix declined to comment, while Micron, Baidu, Huawei, and Tencent did not respond to requests for comment, Reuters notes.

Read more

(Photo credit: Samsung)

Please note that this article cites information from Reuters.
2024-08-06

[News] Samsung to Benefit from NVIDIA’s Reported Blackwell Delay as Tech Giants Turn to AMD

As AI giant NVIDIA is said to have delayed its upcoming Blackwell series chips by months, with the chips now expected to hit the market around early 2025, the related semiconductor supply chain is experiencing a reshuffle. According to a report by the Korea Economic Daily, Samsung Electronics, which is eager to expand its market share in HBM3 and HBM3e, is likely to emerge as a major beneficiary in addition to AMD.

In March, NVIDIA introduced the Blackwell series, claiming it could enable customers to build and operate real-time generative AI on trillion-parameter large language models at up to 25 times less cost and energy consumption compared to its predecessor.

However, according to The Information, NVIDIA informed major customers, including Google and Microsoft, last week that shipments of its Blackwell AI accelerators would be delayed by at least three months due to design flaws.

Blackwell Potentially Delayed due to Design Flaws and TSMC’s Capacity Constraints

Tech media Overclocking points out that the defect is related to the part connecting the two GPUs, which creates problems for NVIDIA’s dual-GPU versions, including the B200 and the GB200.

The delay has prompted tech companies to look for alternatives from NVIDIA’s competitors, such as AMD, according to the Korea Economic Daily. Microsoft and Google have already been working on next-generation products with AMD. For instance, Microsoft has purchased the MI300X, an AI accelerator from the US fabless semiconductor designer, the report says.

Samsung to Benefit Thanks to Its Collaboration with AMD

Samsung, whose HBM3 received certification for AMD’s MI300 series in 1Q24 and which is likely to provide HBM3e chips to AMD afterwards, is expected to benefit. Citing a semiconductor industry source, the Korea Economic Daily notes that since it is very risky for a single company to dominate the AI chip supply chain, the situation will create opportunities for Samsung and AMD.

It is also worth noting that Samsung’s HBM3 reportedly passed NVIDIA’s qualification earlier and would be used in the AI giant’s H20, which has been developed for the Chinese market in compliance with U.S. export controls.

According to TrendForce’s forecast in mid-July, the shipment share of AI servers equipped with self-developed ASIC chips is expected to exceed 25% in 2024, while NVIDIA holds the lion’s share at 63.6%. AMD’s market share, on the other hand, is projected to reach 8.1% in 2024.

Read more

(Photo credit: NVIDIA)

Please note that this article cites information from The Korea Economic Daily, The Information, SemiAnalysis, and Overclocking.