Samsung expects highest quarterly profit in over 3 years, lifted by AI chip demand

News Summary
Samsung Electronics on Tuesday projected a 32 percent year-on-year rise in third-quarter operating profit to 12.1 trillion won (US$8.5 billion), significantly beating analysts' estimates and marking its highest quarterly profit in more than three years. The robust performance was driven primarily by strong demand for server and AI-related chips, which boosted prices and shipments of conventional DRAM and NAND products. Despite slower-than-expected progress in supplying advanced High-Bandwidth Memory (HBM) chips to major clients such as Nvidia, gains in commodity memory, further supported by tight supplies, helped cushion the impact. The foundry unit also narrowed its losses as higher utilization rates eased fixed-cost pressures.
Background
Samsung Electronics is the world's largest memory chipmaker and one of the leading electronics manufacturers globally. Its diverse business portfolio includes memory chips, system LSI, foundry services, display panels, mobile communications, and home appliances. In recent years, the rapid advancement of artificial intelligence technology has led to an explosive increase in demand for high-performance AI chips and their accompanying High-Bandwidth Memory (HBM). HBM is a critical component for AI accelerators (such as GPUs), essential for processing large-scale parallel computations. While the global memory chip market has experienced cyclical fluctuations, AI-driven demand is considered a significant engine for current and future growth.
In-Depth AI Insights
What does Samsung's performance reveal about the true dynamics of the current AI chip market?
- Samsung's stronger-than-expected profit suggests that AI-driven demand for servers and related chips is not concentrated solely in cutting-edge HBM but has broadly lifted the entire memory market, including conventional DRAM and NAND products.
- This reflects the breadth and depth of the AI infrastructure build-out, encompassing not only extreme HBM demand for training but also the continuous pull for high-capacity, high-performance conventional memory for inference and data center upgrades.
- Despite