News
According to a report from the South Korean newspaper "Korea JoongAng Daily," Micron, which began mass production of its latest high-bandwidth memory, HBM3e, in February 2024, has recently secured an order from NVIDIA for the H200 AI GPU. It is understood that NVIDIA's upcoming H200 pr...
News
With numerous cloud computing companies and large-scale AI model makers investing heavily in AI computing infrastructure, demand for AI processors is rising rapidly. According to a report from IJIWEI, demand for HBM (High Bandwidth Memory), a key component of these processors, has been on the rise...
News
Amid the AI frenzy, HBM has become a focal point for major semiconductor manufacturers, and another memory giant is making new moves. According to the South Korean outlet "THE ELEC," Samsung Electronics plans to establish an HBM development office to strengthen its competitiveness in the HBM market. The si...
News
The surge in demand for NVIDIA's AI processors has made High Bandwidth Memory (HBM) a key product that memory giants are eager to develop. However, according to the South Korean outlet DealSite, as cited by Wccftech on March 4th, HBM's complex architecture has resulted in low yields, making it difficult...
News
U.S. memory giant Micron Technology has started mass production of its high-bandwidth memory "HBM3e," which will be used in NVIDIA's latest AI chips. Micron stated on February 26th that HBM3e consumes 30% less power than competing products, meeting the demands of generative AI applications...