News
Amid the AI boom, the importance of high-value-added DRAM, represented by HBM, continues to grow. HBM (High Bandwidth Memory) is a high-performance DRAM built from vertically stacked dies, offering higher bandwidth, higher capacity, lower latency, and lower power consumption than traditional DRAM chips. ...
News
Major cloud service providers (CSPs) are expected to keep driving demand for AI servers over the next two years. TrendForce's latest projections indicate global shipments of approximately 1.18 million AI servers in 2023, year-on-year growth of 34.5%. The trend is expected to persist i...
News
Micron Technology, the U.S. memory giant, has surpassed Wall Street expectations with its revenue forecast for the current quarter (December-February). The main driver is robust demand from data centers, which is offsetting the sluggish recovery in the PC and smartphone market...
News
According to a report from Expreview, the surge in demand for AI applications and the market's need for more powerful solutions have prompted NVIDIA to shorten its product release cycle from two years to one. Regarding its HBM partners, although validations for various samples ar...
News
On November 13, NVIDIA unveiled the HGX H200 AI computing platform, built on the Hopper architecture and equipped with the H200 Tensor Core GPU and high-end memory to handle the vast amounts of data generated by AI and high-performance computing workloads. This marks an upgrade from the previous-generation H100,...