SK Hynix extends lead over rivals as it readies production of cutting-edge memory chips

Asia (excl. Greater China & Japan)
Source: CNBC | Published: 09/12/2025, 03:45:01 EDT
SK Hynix
Nvidia
High-Bandwidth Memory
AI Chips
Semiconductor Manufacturing
A man walks past a logo of SK Hynix at the lobby of the company’s Bundang office in Seongnam on January 29, 2021.

News Summary

South Korean memory chipmaker SK Hynix announced it is ready for mass production of its next-generation High-Bandwidth Memory (HBM) chips, HBM4, solidifying its lead over rivals Samsung Electronics and Micron in the AI chip market. HBM chips are crucial components of artificial intelligence computing chipsets, and global AI giant Nvidia is a major SK Hynix client. HBM4, the sixth generation of HBM technology, doubles bandwidth and delivers a 40% increase in power efficiency compared with its predecessor. Analysts expect HBM4 to be the main AI memory chip required for Nvidia's next-generation Rubin architecture. While Micron has also shipped HBM4 samples and Samsung is reportedly seeking Nvidia certification, analysts anticipate that SK Hynix's dominance will persist into next year. Following the announcement, SK Hynix shares surged over 7% on Friday, reaching their highest level since 2000 and bringing year-to-date gains to nearly 90%. The company posted record operating profit and revenue for its second quarter, driven by strong HBM demand, which accounted for 77% of its overall revenues. SK Hynix expects to double its HBM sales in 2025 compared with 2024, with AI demand projected to keep growing into 2026.

Background

High-Bandwidth Memory (HBM) is an advanced type of Dynamic Random Access Memory (DRAM) designed to meet the escalating memory demands of artificial intelligence (AI) computing. It achieves significantly higher bandwidth and lower power consumption by stacking multiple DRAM dies vertically and placing the stacks close to the processor, enhancing data-processing throughput. SK Hynix, Samsung Electronics, and Micron Technology are the world's leading memory chip manufacturers, dominating the DRAM and NAND flash markets. The rapid advancement of AI technologies, particularly large language models and generative AI, has driven explosive demand for high-performance AI chips (such as Nvidia's GPUs) and the HBM that accompanies them, making HBM a new focal point of competition in the memory chip industry.
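To make the bandwidth point concrete, the minimal sketch below shows how interface width and per-pin data rate combine into per-stack bandwidth. The HBM3E and HBM4 figures used are approximate, publicly reported ballpark numbers assumed here purely for illustration; they are not specifications drawn from this article.

```python
# Illustrative sketch: per-stack peak bandwidth from interface width and per-pin data rate.
# All generation-specific figures below are assumed ballpark values, not article data.

def stack_bandwidth_gbps(interface_width_bits: int, pin_rate_gbps: float) -> float:
    """Peak per-stack bandwidth in GB/s: (interface width * per-pin rate) / 8 bits per byte."""
    return interface_width_bits * pin_rate_gbps / 8

# Assumed ballpark figures: HBM3E ~1024-bit interface at ~9.6 Gb/s per pin,
# HBM4 ~2048-bit interface at ~8 Gb/s per pin.
hbm3e = stack_bandwidth_gbps(1024, 9.6)  # roughly 1,200 GB/s per stack
hbm4 = stack_bandwidth_gbps(2048, 8.0)   # roughly 2,000 GB/s per stack

print(f"HBM3E: ~{hbm3e:.0f} GB/s per stack")
print(f"HBM4:  ~{hbm4:.0f} GB/s per stack")
```

Under these assumptions, doubling the interface width is the main driver of the roughly twofold bandwidth gain the article attributes to HBM4.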

In-Depth AI Insights

What are the broader strategic implications of SK Hynix's HBM4 lead for the global AI supply chain and competitive landscape?
- It further solidifies Nvidia's dominant position in the AI chip sector by securing a critical component from a reliable, leading supplier. This tight collaboration could create entry barriers for other AI chip designers.
- It intensifies pressure on Samsung and Micron to innovate faster; if they fail to close the gap, they risk becoming secondary suppliers in the high-growth AI memory segment, impacting their long-term profitability and market share.
- Given South Korea's lead in advanced memory technology, it could further entrench the country's role as a critical node in the global AI hardware ecosystem, granting it greater influence in global tech competition.

How might the incumbent Trump administration's 'America First' and US-China tech rivalry policies shape long-term HBM supply chain dynamics, even for a non-US company like SK Hynix?
- Although SK Hynix is a South Korean company, its primary client, Nvidia, is US-based, and HBM chips ultimately serve global markets, including China. The US government could indirectly influence HBM production and sales strategies through export controls or technology alliances.
- The Trump administration might seek to reshore more high-tech manufacturing, including critical semiconductor components, to the US or its allies. This could prompt SK Hynix to consider expanding production in the US or other non-Chinese regions to mitigate potential geopolitical risks.
- If US-China tech decoupling intensifies, SK Hynix could face pressure to offer different product versions for different markets or to restrict sales of high-performance HBM chips to Chinese customers, impacting its global market strategy and revenue growth.

What do the high concentration of the HBM market and SK Hynix's lead mean for downstream players, particularly AI data center operators and AI startups, in terms of risks and opportunities?
- Risks: Concentrated HBM supply could increase price volatility and weaken downstream customers' bargaining power. Any production disruption at SK Hynix or its key foundries would have significant repercussions for the entire AI ecosystem.
- Risks: The rapid pace of technological iteration means AI data centers must continuously upgrade hardware to remain competitive, and HBM supply bottlenecks could limit their ability to rapidly expand and deploy the latest AI models.
- Opportunities: SK Hynix's close collaboration with Nvidia helps ensure optimized performance and timely supply of the latest generation of AI chips, giving customers that require cutting-edge AI computing capabilities confidence in continued technological advancement. Early adopters can deploy more efficient AI solutions faster.
- Opportunities: For AI startups, while initial costs may be higher, a stable, high-performance HBM supply means they can focus on software and algorithm innovation without excessive concern about underlying hardware performance bottlenecks.