
Nvidia supplier SK Hynix sells out 2026 chip supply as AI demand skyrockets

South Korean chipmaker SK Hynix, which supplies high-performance memory to Nvidia, reported record earnings in the third quarter on the back of the AI boom


Rimjhim Singh | New Delhi


AI fever is rewriting the fortunes of chipmakers, and SK Hynix is riding the wave. The South Korean company said it has already sold its entire semiconductor output for next year, as soaring global demand for advanced memory chips used in artificial intelligence (AI) systems continues to drive growth, the Financial Times reported. SK Hynix, which supplies high-performance memory to Nvidia, reported record earnings in the third quarter on the back of the AI boom.
Operating profit for the July-September period surged 62 per cent year-on-year to a record 11.4 trillion won ($8 billion), in line with SmartEstimate forecasts from financial data provider LSEG. Revenue rose 39 per cent to 22.4 trillion won, supported by strong demand from AI data centres, the company said on Wednesday.
 
 

Tight supply in advanced memory chips

SK Hynix said its inventory of traditional DRAM (dynamic random-access memory) chips, which handle temporary data storage in devices, was “extremely tight”. The company said supply of high-bandwidth memory (HBM) chips, critical for AI processing, would remain insufficient as demand from AI applications grows rapidly, the Financial Times reported.
 

OpenAI deal boosts future outlook

Investor optimism rose further after SK Hynix and Samsung Electronics signed a preliminary deal with OpenAI this month to supply semiconductors for its ambitious $500 billion ‘Stargate’ data centre project. 
SK Hynix said that expected demand from the project exceeds twice the current global capacity for HBM chips. To meet this, the company plans to expand production capacity and set up dedicated systems to serve OpenAI’s needs. It has also finalised HBM supply contracts with other major clients for 2026 and plans to “substantially increase” capital expenditure, the news report said.
The firm will begin supplying its latest HBM4 chips in the fourth quarter of this year.
 

Growing demand for AI inference chips

The company said the growing use of AI inference, the process that allows chatbots and similar applications to generate responses, is fuelling greater demand for high-performance server memory. 
“We project AI inference memory demand to expand not only in the US but also in China as Chinese hyperscalers are expected to push for AI inference investment,” Citi analysts said in a recent report.


First Published: Oct 29 2025 | 5:04 PM IST
