HBM gold mine sends SK hynix revenues to a new high point

High Bandwidth Memory (HBM) is high-revenue memory as far as Korea’s SK hynix is concerned: revenues in the third quarter of 2025 were ₩24.449 trillion ($17.1 billion), up 39 percent year-on-year, with a net profit of ₩12.598 trillion ($8.8 billion), up 118.9 percent.

Just over half of every dollar of its revenue is pure profit. SK hynix’s operating profit exceeded ₩10 trillion ($7 billion) for the first time in the company’s history, and it was a record quarterly revenue performance, driven by strong sales of 12-high HBM3E and DDR5 server memory. Shipments of high-capacity DDR5 of 128 GB or more than doubled from the previous quarter.
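The margin claim follows directly from the reported figures. A minimal sanity check in Python, purely illustrative, treating the ₩10 trillion operating profit as the quoted floor rather than an exact number:

```python
# Margins implied by SK hynix's reported Q3 2025 figures (trillions of won, from the text above)
revenue = 24.449
net_profit = 12.598
operating_profit_floor = 10.0  # "exceeded ₩10 trillion" -- a lower bound, not the exact figure

print(f"Net margin: {net_profit / revenue:.1%}")                               # ~51.5% -- just over half
print(f"Operating margin (at least): {operating_profit_floor / revenue:.1%}")  # ~40.9%
```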

CFO Kim Woohyun said: “With the innovation of AI technology, the memory market has shifted to a new paradigm and demand has begun to spread to all product areas.”

It’s likely to continue in this vein. SK hynix says HBM supply discussions for next year are complete. Next-generation HBM4 shipments will begin in Q4 this year, with full-scale sales expansion planned for next year. In fact, demand for all DRAM and NAND products has been secured for next year, meaning SK hynix’s HBM, standard DRAM, and NAND output is effectively sold out through 2026.

The profit margin in Q3 2025 is remarkable.

Memory demand, fuelled by the AI data centre building boom, is so extraordinarily high that supply cannot keep up, pushing up prices and profits. One effect of this is that the company’s cash and cash equivalents at the end of the third quarter increased by ₩10.9 trillion ($7.6 billion) from the previous quarter, reaching ₩27.9 trillion ($19.5 billion). Meanwhile, interest-bearing debt stood at ₩24.1 trillion ($16.9 billion), putting the company in a net cash position of ₩3.8 trillion ($2.7 billion).
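The net cash figure is simply the cash pile minus the interest-bearing debt. A quick check, illustrative only, using a won/dollar rate of roughly 1,430 inferred from the article’s own conversions:

```python
# Net cash position from the balance-sheet figures above (trillions of won)
KRW_PER_USD = 1_430   # approximate rate implied by ₩24.449tn ≈ $17.1bn

cash = 27.9           # cash and cash equivalents at end of Q3 2025
debt = 24.1           # interest-bearing debt
net_cash = cash - debt

print(f"Net cash: ₩{net_cash:.1f}tn ≈ ${net_cash * 1e12 / KRW_PER_USD / 1e9:.1f}bn")
# Net cash: ₩3.8tn ≈ $2.7bn
```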

These are revenue growth rates to die for.

These numbers are extraordinarily high and likely to rise even further. The company says: “As the AI market rapidly shifts toward inference-driven workloads, there is growing interest in distributing computational loads of AI servers across broader infrastructures such as general servers. This trend is expected to further expand demand across the entire memory portfolio, including high-performance DDR5 and eSSD.” It adds: “the recent wave of strategic partnerships and AI data center expansion announcements by leading global AI players provides further momentum.”

SK hynix plans to increase its memory production capacity and will raise output of its highest-density 321-layer TLC and QLC NAND products. Wedbush analyst Matt Bryson expects 321-layer products to account for more than 50 percent of the company’s NAND bit output by the end of 2026.

These results suggest that other memory suppliers, namely Samsung and Micron, should report strong results as well.