DRAM prices set to nearly double in Q1 2026 as AI demand strains memory supply chains

DRAM prices are forecast to surge 90-95% in Q1 2026 as AI infrastructure expansion and PC demand create unprecedented memory shortages, with analysts warning prices will remain elevated through 2028.

The memory shortage gripping the tech industry is set to worsen dramatically, with DRAM prices expected to nearly double in the first quarter of 2026 as artificial intelligence infrastructure expansion collides with traditional PC demand to create unprecedented supply constraints.

According to industry analysts at TrendForce, contract prices for DRAM memory are now forecast to surge by 90-95 percent quarter-over-quarter during Q1 2026, a significant upward revision from their earlier prediction of 55-60 percent growth. The situation for NAND flash storage is similarly dire, with prices expected to increase by 55-60 percent during the same period.
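
To see why the headline says "nearly double," here is a small illustrative calculation. The starting price is hypothetical and chosen only for round numbers; the percentage ranges are the TrendForce figures quoted above.

```python
# Illustrative arithmetic only: what the revised forecast implies for a
# hypothetical DRAM contract price. The $2.50/GB starting point is made up;
# only the percentage ranges come from the TrendForce forecast cited above.

base_price_per_gb = 2.50  # hypothetical Q4 2025 contract price, USD per GB

forecasts = {
    "earlier forecast (+55-60%)": (0.55, 0.60),
    "revised forecast (+90-95%)": (0.90, 0.95),
}

for label, (low, high) in forecasts.items():
    lo_price = base_price_per_gb * (1 + low)
    hi_price = base_price_per_gb * (1 + high)
    print(f"{label}: ${lo_price:.2f}-${hi_price:.2f} per GB")

# revised forecast (+90-95%): $4.75-$4.88 per GB, i.e. nearly double the base price
```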

The dramatic price increases reflect a perfect storm of demand factors that has caught the memory industry off guard. AI-driven hyperscalers and cloud service providers have been aggressively expanding their infrastructure to support growing inference workloads, and higher-than-expected PC shipments in Q4 2025 have further strained already tight supply chains.

The AI Infrastructure Boom Driving Memory Demand

The shift from AI model training to inference workloads has created a voracious appetite for both DRAM and storage. During large language model inference, the attention keys and values computed for a conversation are held in what's called the key-value (KV) cache, essentially the model's short-term memory for that session. Keeping this cache resident in memory is what allows rapid responses during chatbot sessions and other AI interactions.

While a session is active, this cache is typically held in high-bandwidth memory (HBM). When the session goes idle, the precomputed KV cache is pushed down to slower system memory and eventually to storage tiers. This multi-tier memory architecture is efficient for AI workloads, but it requires massive amounts of memory capacity that traditional forecasting models didn't anticipate.
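
To make that tiering concrete, here is a minimal Python sketch of the offload pattern, written under assumptions rather than taken from any real inference framework: the KVCacheManager class, the per-tier dictionaries, and the idle-time thresholds are all hypothetical.

```python
import time
from dataclasses import dataclass, field

# Toy three-tier KV-cache manager illustrating the HBM -> system DRAM -> SSD
# offload pattern described above. Class names, tiers, and thresholds are
# hypothetical; this is not any vendor's actual inference-serving API.

@dataclass
class Session:
    session_id: str
    kv_cache: bytes                      # precomputed keys/values for the conversation
    last_active: float = field(default_factory=time.time)

class KVCacheManager:
    def __init__(self, idle_to_dram_s: float = 30.0, idle_to_ssd_s: float = 300.0):
        self.hbm = {}    # active sessions: fast, scarce high-bandwidth memory
        self.dram = {}   # recently idle sessions: larger, slower system DRAM
        self.ssd = {}    # cold sessions: NAND flash storage tier
        self.idle_to_dram_s = idle_to_dram_s
        self.idle_to_ssd_s = idle_to_ssd_s

    def touch(self, session_id: str) -> Session:
        """Promote a session back to HBM when the user sends a new message."""
        for tier in (self.hbm, self.dram, self.ssd):
            if session_id in tier:
                session = tier.pop(session_id)
                session.last_active = time.time()
                self.hbm[session_id] = session
                return session
        raise KeyError(session_id)

    def demote_idle(self) -> None:
        """Push idle KV caches down the memory hierarchy to free up HBM."""
        now = time.time()
        for sid, s in list(self.hbm.items()):
            if now - s.last_active > self.idle_to_dram_s:
                self.dram[sid] = self.hbm.pop(sid)
        for sid, s in list(self.dram.items()):
            if now - s.last_active > self.idle_to_ssd_s:
                self.ssd[sid] = self.dram.pop(sid)
```

Production systems demote caches based on actual memory pressure and capacity budgets rather than wall-clock timers, but the promote-on-activity, demote-on-idle pattern is why inference fleets consume HBM, DRAM, and NAND simultaneously.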

"The demand for high-performance storage has far surpassed initial expectations as AI applications driven by inference continue to grow," TrendForce noted in its analysis. "Since late 2025, leading North American CSPs have been rapidly increasing their procurement, resulting in a surge of enterprise SSD orders."

PC Market Surprises Exacerbate Shortages

Adding to the AI-driven demand, the PC market delivered unexpected growth in Q4 2025, catching memory suppliers off guard. Original equipment manufacturers like Dell and HP typically purchase memory in bulk about a year in advance of demand, which explains why pre-built system pricing has remained relatively stable while standalone memory kits have tripled in price.

As these OEM inventories begin to draw down and restocking begins, system prices are expected to climb significantly. TrendForce now predicts PC DRAM prices will roughly double from the holiday quarter, with similarly steep increases forecast for LPDDR memory used in notebooks and smartphones.

Perhaps most dramatically, pricing on LPDDR4x and LPDDR5x memory is expected to increase by roughly 90 percent quarter-over-quarter, which TrendForce describes as "the steepest increases in their history." The surge is particularly concerning because LPDDR memory, traditionally confined to notebooks and smartphones, is now finding its way into high-performance rack systems.

The Nvidia Factor and Enterprise Storage Surge

Nvidia's most powerful rack systems, which contain 54 terabytes of LPDDR5x memory each, are contributing significantly to the memory crunch. As AI infrastructure continues its transition from training-dominated to inference-dominated workloads, the additional DRAM and storage requirements are creating unprecedented pressure on supply chains.

Enterprise SSD orders have surged as hyperscalers and CSPs scramble to deploy as many solid-state drives as possible to support AI inference workloads. The combination of DRAM for active inference processing and NAND flash for storage of precomputed model states is creating a dual-pronged assault on memory supply.

No Relief in Sight for Years

For those hoping for relief from the current "memory winter," the outlook is grim. While memory vendors now have the capital to invest in new fabrication facilities, these plants will take years to bring online. The industry is facing a multi-year timeline before supply can catch up with demand.

DRAM prices are expected to peak later in 2026, but TrendForce forecasts that prices will remain elevated through 2028 before returning to normal levels. This extended period of high prices will have significant implications for everything from consumer electronics to enterprise infrastructure budgets.

The current situation serves as a stark reminder of how rapidly AI infrastructure demands can reshape traditional supply chains. What began as a gradual increase in memory requirements has transformed into a full-blown crisis that will affect everything from smartphone prices to datacenter expansion plans for years to come.

As one industry observer noted, the message is clear: "Buy servers now or cry later." The DRAM price spike threatens to derail infrastructure budgets across the tech industry, forcing difficult decisions about when and how to expand AI capabilities in an environment of constrained and expensive memory resources.
