According to reports from the Industrial Times on July 10, driven by major players SK Hynix, Samsung, and Micron, total monthly production capacity for High Bandwidth Memory (HBM) chips is expected to reach 540,000 units in 2025, up roughly 96% from 2024's 276,000 units.
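A quick sanity check on the growth figure, using only the two capacity numbers quoted above:

```python
# Monthly HBM production capacity figures from the report
capacity_2024 = 276_000  # units/month in 2024
capacity_2025 = 540_000  # units/month expected in 2025

# Year-over-year growth rate
growth = (capacity_2025 - capacity_2024) / capacity_2024
print(f"Year-over-year growth: {growth:.1%}")  # → 95.7%
```

So the quoted figures imply capacity nearly doubling year over year.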
HBM, a high-performance DRAM based on 3D stacking technology, is used in applications requiring high memory bandwidth, such as high-performance GPUs, network devices, and AI-specific integrated circuits in data centers.
This technology significantly reduces power and space requirements.
HBM is the most expensive component in AI accelerator cards. For example, a teardown of Nvidia's H100 put its bill of materials at around $3,000, with the HBM from SK Hynix accounting for about $2,000, or roughly 66% of the total cost.
Current Status of Major Players:
- SK Hynix and Micron are the primary HBM suppliers; both use 1β (1b) nm DRAM processes and supply Nvidia.
- Samsung is expected to complete certification of its 1α (1a) nm process in the second quarter and begin shipping by mid-year.
Expansion Plans:
- Samsung is upgrading its Pyeongtaek plants (P1L, P2L, P3L) for DDR5 and HBM production and enhancing the Hwaseong plant for specialized industry needs.
- SK Hynix is upgrading its Icheon M16 line and the M14 line for DDR5 and HBM production, while its Wuxi plant is advancing its processes.
- Micron is increasing production capacity at its Hiroshima plant and introducing EUV processes in the long term.
HBM4 Standard to Double Channels and Boost Speed to 6.4Gbps
The JEDEC Solid State Technology Association announced that the HBM4 memory standard is nearing completion, featuring double the channel count of HBM3 and preliminary agreement on per-pin speeds of up to 6.4 Gbps.
These enhancements aim to improve bandwidth, reduce power consumption, and increase performance per die and per stack.
These advancements are crucial for applications requiring efficient large dataset processing and complex computations in fields like generative AI, high-performance computing, high-end graphics cards, and servers.
The HBM4 standard covers 24 Gb and 32 Gb die densities in 4-high, 8-high, 12-high, and 16-high TSV stacks. The committee is also discussing higher frequencies. For device compatibility, a single controller will be able to drive both HBM3 and HBM4.
JEDEC notes that HBM4's physical footprint is larger than HBM3's. The standard's across-the-board performance improvements are significant for AI chips, high-performance computing, and other demanding applications.
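The per-stack numbers implied by the draft standard can be sketched as follows. One assumption not stated in the article: HBM3 uses a 1024-bit interface per stack, so doubling the channel count yields a 2048-bit bus.

```python
# Back-of-envelope HBM4 figures implied by the draft standard described above.
BUS_WIDTH_BITS = 2048   # assumed: double HBM3's 1024-bit per-stack interface
PIN_SPEED_GBPS = 6.4    # per-pin data rate under preliminary agreement

# Peak bandwidth per stack: bus width (bits) * per-pin rate (Gb/s) / 8 bits-per-byte
peak_bw_gbs = BUS_WIDTH_BITS * PIN_SPEED_GBPS / 8  # GB/s

# Maximum capacity per stack: 32 Gb dies in a 16-high TSV stack
max_capacity_gb = 32 * 16 / 8  # GB

print(f"Peak bandwidth per stack: {peak_bw_gbs:.1f} GB/s")  # → 1638.4 GB/s (~1.6 TB/s)
print(f"Max capacity per stack: {max_capacity_gb:.0f} GB")  # → 64 GB
```

Under these assumptions, a single HBM4 stack would deliver on the order of 1.6 TB/s of bandwidth and up to 64 GB of capacity.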
- Samsung Electronics plans to develop diverse custom HBM memory for HBM4.
- Reports suggest Samsung might use a 1c nm DRAM process for HBM4 to enhance energy efficiency.
- SK Hynix accelerates HBM4 memory production, aiming for a 2025 launch.
- SK Hynix may use TSMC’s 7nm process for the HBM4 base die.
- JEDEC may relax HBM4 height restrictions, reducing the need for hybrid bonding.
- AI chip demand surges, driving HBM memory prices up 500%.