
How AI boom is exhausting memory supply, sending RAM prices soaring

Explosive demand from Nvidia and other AI chipmakers has soaked up global memory supply, pushing DRAM and HBM prices to unprecedented levels


Modern AI processors surround their graphics processing units with high-bandwidth memory, or HBM, a specialised form of RAM designed to move data extremely quickly. (Illustration: Binay Sinha)

Rishabh Sharma | New Delhi


Global supplies of computer memory are running short as artificial intelligence chipmakers consume far more capacity than the industry can currently produce. Companies such as Nvidia, Advanced Micro Devices and Google are absorbing vast amounts of RAM for AI accelerators, leaving little supply for other uses, according to a CNBC report.
 
Micron Technology, SK Hynix and Samsung Electronics dominate the global memory market, and all are benefiting from the surge. “We have seen a very sharp, significant surge in demand for memory, and it has far outpaced our ability to supply that memory,” CNBC quoted Micron business chief Sumit Sadana as saying.
 
 

Why it matters

 
Memory is a critical input for AI systems, and shortages are now reshaping pricing, investment and product strategies across the technology industry, from data centres to consumer laptops and gaming devices.
 
TrendForce, a Taipei-based research firm, expects average DRAM prices to rise between 50 per cent and 55 per cent in the current quarter compared with the fourth quarter of 2025. Analyst Tom Hsu, cited by CNBC, described the scale of the increase as “unprecedented”.
 

How AI chips are changing memory demand

 
Modern AI processors surround their graphics processing units with high-bandwidth memory, or HBM, a specialised form of RAM designed to move data extremely quickly. Nvidia’s latest Rubin GPU can carry up to 288 gigabytes of next-generation HBM4 memory per chip, installed in multiple visible blocks around the processor.
 
Producing HBM is complex and resource-intensive. Micron stacks between 12 and 16 layers of memory into a single unit, or “cube”. As a result, manufacturing one bit of HBM requires sacrificing the capacity to make roughly three bits of conventional memory used in consumer devices. “As we increase HBM supply, it leaves less memory left over for the non-HBM portion of the market,” Sadana said.
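A purely illustrative sketch of that trade-off follows. Only the roughly three-to-one ratio comes from Micron's description; the capacity figures and the hypothetical fab output are assumptions made up for the example.

```python
# Illustrative sketch of the HBM supply trade-off described above.
# Only the ~3:1 ratio comes from the article; all capacity figures are hypothetical.

HBM_TO_DRAM_RATIO = 3.0       # one bit of HBM costs roughly three bits of conventional capacity
TOTAL_CAPACITY_BITS = 100.0   # hypothetical fab output, in conventional-bit equivalents

for hbm_share in (0.0, 0.1, 0.2, 0.3):
    # Capacity diverted to HBM yields fewer bits because of stacking and yield overhead
    hbm_bits = (TOTAL_CAPACITY_BITS * hbm_share) / HBM_TO_DRAM_RATIO
    conventional_bits = TOTAL_CAPACITY_BITS * (1 - hbm_share)
    print(f"HBM share {hbm_share:.0%}: {hbm_bits:.1f} HBM-bit units, "
          f"{conventional_bits:.1f} conventional-bit units remain")
```

The point of the sketch is simply that every unit of capacity shifted to HBM removes about three times as much conventional DRAM from the market, which is why consumer memory supply tightens faster than the headline HBM numbers suggest.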
 

Who gets priority

 
Memory makers are prioritising server and AI customers, where demand growth is strongest and buyers are less sensitive to price increases. In December, Micron said it would scale back parts of its consumer PC memory business to preserve supply for AI chips and data centres.
 
SK Hynix has said it has already secured demand for its entire 2026 RAM output, while Samsung expects its quarterly operating profit to nearly triple on the back of higher memory prices.
 

The ‘memory wall’ problem

 
AI researchers warn that memory constraints are becoming a key bottleneck for large language models. “Your performance is limited by the amount of memory and the speed of the memory that you have,” Sha Rabii, co-founder of Majestic Labs, told CNBC. The industry refers to this constraint as the “memory wall”, where processors sit idle waiting for data.
 
More memory allows AI systems to run larger models, handle more users at once and retain longer conversational context. Majestic Labs is designing systems with far larger memory pools to address this limitation, though it plans to rely on lower-cost alternatives to HBM.
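A simplified back-of-envelope illustration of the memory wall: during token-by-token generation, a processor must stream the model's weights from memory for every token it produces, so throughput is roughly memory bandwidth divided by model size. The model size and bandwidth figures below are hypothetical assumptions, not numbers from the report.

```python
# Back-of-envelope sketch of the "memory wall" in token generation.
# Assumes a memory-bound workload where reading the weights dominates each step.
# All figures are illustrative assumptions, not values from the article.

def max_tokens_per_second(model_size_gb: float, bandwidth_gb_s: float) -> float:
    """Rough upper bound on decode throughput when weight reads dominate."""
    return bandwidth_gb_s / model_size_gb

# Hypothetical example: a 140 GB model served from HBM at 8,000 GB/s
print(max_tokens_per_second(model_size_gb=140, bandwidth_gb_s=8000))  # ~57 tokens/s ceiling
```

In this simplified view, a faster compute core does nothing once the ceiling is hit; only more or faster memory raises it, which is why memory, not raw processing power, is the binding constraint the researchers describe.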
 

Impact on consumers

 
Rising memory prices are affecting consumer electronics as well. Memory now accounts for roughly 20 per cent of a laptop’s hardware cost, up from about 10-18 per cent earlier in 2025, according to TrendForce.
 
Dell Technologies has warned that higher memory costs will raise its overall cost base, while Apple has acknowledged a mild pricing impact so far. Even Nvidia faces questions over whether AI-driven memory demand could push up prices for gaming graphics cards.
 

What’s next

 
Chipmakers are adding capacity, but relief may still be years away. Micron is building new fabrication plants in Idaho and New York, with production expected from 2027 onwards. Until then, supply remains tight.
 
“We’re sold out for 2026,” Sadana said.


First Published: Jan 11 2026 | 9:56 AM IST
