Micron Predicts 300GB RAM Cars Will Drive Next Memory Shortage
#Chips



Micron CEO Sanjay Mehrotra forecasts autonomous vehicles will need 300GB of RAM, potentially creating new memory shortages while driving 'robust long-term growth' in automotive memory demand.

Micron CEO Sanjay Mehrotra has made a bold prediction that could reshape both the automotive and semiconductor industries: cars with Level 4 autonomy will eventually require more than 300GB of RAM to function effectively. This forecast comes as Micron reports record-breaking quarterly earnings driven by AI demand, suggesting the next major memory shortage might come not from data centers but from AI supercomputers on wheels.

Record Earnings Fuel Expansion Plans

Micron's latest financial results underscore the company's dominant position in the memory market. The company reported $23.86 billion in revenue for the second quarter of fiscal year 2025, nearly a 200% increase from the $8.03 billion posted in the same quarter last year. This explosive growth stems from "structural supply constraints and Micron's strong execution across the board," particularly in high-bandwidth memory (HBM) chips that power AI hyperscalers' infrastructure.

The financial windfall is enabling aggressive expansion. Micron is building several new fabrication facilities, including plants in Japan and Singapore and a massive "megafab" in New York, with production scheduled to begin between 2028 and 2029. The company also plans to boost output by 20% in 2026, moves that could help alleviate current supply pressures in the high-performance memory market.

The 300GB RAM Challenge

But even with this expansion, Mehrotra sees a new demand driver emerging: autonomous vehicles. Current vehicles typically require around 16GB of memory for basic computing functions, but Level 4 autonomous systems present an entirely different computational challenge.
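The scale of that jump is easy to quantify. The sketch below uses only the two figures from the article (16GB today, 300GB forecast); everything else is simple arithmetic.

```python
# Back-of-envelope comparison of today's per-vehicle memory
# versus Mehrotra's Level 4 forecast. Both figures come from
# the article; the rest is arithmetic.
current_gb = 16    # typical memory in today's vehicles
level4_gb = 300    # Mehrotra's forecast for Level 4 autonomy

multiplier = level4_gb / current_gb
print(f"Level 4 forecast is {multiplier:.1f}x today's per-vehicle memory")
# -> Level 4 forecast is 18.8x today's per-vehicle memory
```

In other words, each Level 4 vehicle would carry nearly nineteen times the memory of a car on the road today.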

Level 4 autonomy represents a significant leap from current driver-assistance systems. While Level 2 systems like Tesla's Autopilot and Cadillac's Super Cruise can control steering and acceleration but still require constant driver supervision, Level 4 vehicles can handle all driving tasks without human intervention within their operational design domain. A driver can still take control if desired, but the vehicle manages everything from highway merging to navigating busy intersections.

This level of autonomy demands massive computational power. Nvidia is already partnering with major automakers including BYD, Geely, Isuzu, and Nissan to deploy its Drive Hyperion platform, an end-to-end autonomous vehicle system that relies heavily on AI processing. Similar to how AI data centers consume enormous amounts of high-bandwidth memory, these automotive AI systems will require substantial memory bandwidth and capacity to process sensor data, make real-time decisions, and ensure passenger safety.

Memory Shortages: The Next Frontier

The automotive memory demand scenario mirrors what's happening in the PC market today. Apple recently removed its $4,000 512GB Mac Studio from online stores due to overwhelming demand from users wanting to run AI applications locally. The 256GB version saw its price jump to $2,000 as consumers discovered that AI workloads require substantial unified memory.

If automotive AI follows a similar adoption curve, the implications are significant. Hundreds of thousands or millions of vehicles equipped with Level 4 autonomy would create a new, massive consumer of high-performance memory. Unlike data centers where equipment can be upgraded or expanded, automotive memory is typically fixed at manufacturing, meaning the industry must anticipate demand years in advance.
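To see why fleet-scale adoption matters to memory makers, it helps to multiply the article's 300GB figure by some hypothetical adoption numbers. The vehicle counts below are illustrative round numbers, not forecasts from Micron.

```python
# Rough fleet-level memory demand under assumed adoption numbers.
# The 300 GB per-vehicle figure is from the article; the vehicle
# counts are hypothetical round numbers for illustration.
gb_per_vehicle = 300

for vehicles in (100_000, 1_000_000, 10_000_000):
    total_pb = vehicles * gb_per_vehicle / 1_000_000  # GB -> PB (decimal)
    print(f"{vehicles:>10,} vehicles -> {total_pb:,.0f} PB of memory")
```

Even one million Level 4 vehicles would lock up roughly 300 petabytes of memory, fixed at manufacturing time rather than upgradeable later, which is why the industry must plan capacity years ahead.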

Timeline and Market Implications

Even if the 300GB RAM vehicle eventually arrives, several factors could slow adoption. Vehicles with Level 4 autonomy remain expensive, and regulatory frameworks for fully autonomous vehicles are still evolving across jurisdictions. The technology also faces public-acceptance challenges and unresolved liability questions that could delay widespread deployment.

However, if and when adoption accelerates, the memory industry faces a critical decision point. Current expansion plans may not be sufficient to meet both AI data center demand and automotive requirements. The industry could see another round of memory shortages, this time driven not by cryptocurrency mining or smartphone demand, but by AI supercomputers on wheels.

The Growth Opportunity

Despite the potential for shortages, Mehrotra frames this development positively, calling it "robust long-term growth in automotive memory demand." The automotive sector represents a new, stable market for high-margin memory products, potentially diversifying away from the cyclical nature of consumer electronics and providing a hedge against market downturns.

For consumers, this means future vehicles will be more capable but also more expensive. The cost of 300GB of high-performance memory, combined with the processors and sensors needed for Level 4 autonomy, will likely add thousands to vehicle prices. However, the promise of safer, more convenient transportation may justify these costs for many buyers.

As Micron and other memory manufacturers race to expand capacity, the industry stands at a crossroads. Will the planned expansions be enough to satisfy both AI data centers and autonomous vehicles? Or will the automotive sector's entry into high-performance computing create the next great memory shortage? The answer will shape technology availability and pricing for years to come.

Featured image: a Waymo driverless taxi

Jowi Morales
