A new analysis frames AI datacenter water consumption in terms of In-N-Out burgers, revealing how the industry's environmental impact is being measured in increasingly unconventional—and potentially misleading—ways.
The conversation around AI datacenter sustainability has taken a peculiar turn. A recent analysis from SemiAnalysis reframes the water consumption of Colossus 2—one of the world's largest AI datacenters—in terms of In-N-Out burger restaurants, suggesting the facility uses as much water annually as roughly 2.5 average In-N-Out locations, a back-of-envelope figure that assumes potable water only and counts only burger production. This unusual metric highlights a growing trend: as AI infrastructure scales, the industry is struggling to communicate its environmental impact in ways that resonate with both technical and public audiences.
The analysis comes as Colossus 2, a massive facility operated by xAI (Elon Musk's AI company), continues its expansion. The facility is designed to support training and inference for large language models, requiring substantial cooling infrastructure. Traditional datacenter metrics focus on power usage effectiveness (PUE) or water usage effectiveness (WUE), but these technical measurements often fail to translate to public understanding. The In-N-Out comparison attempts to bridge that gap, though it raises questions about what constitutes meaningful environmental accounting.
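For readers unfamiliar with the two technical metrics mentioned above, both have simple standard definitions (published by The Green Grid). A minimal sketch, using purely illustrative inputs rather than reported figures for any real facility:

```python
# PUE and WUE per The Green Grid's standard definitions.
# All numeric inputs below are hypothetical, for illustration only.

def pue(total_facility_energy_kwh: float, it_energy_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy / IT equipment energy.
    An ideal datacenter approaches 1.0 (all energy goes to compute)."""
    return total_facility_energy_kwh / it_energy_kwh

def wue(annual_water_liters: float, it_energy_kwh: float) -> float:
    """Water Usage Effectiveness: liters of water consumed per kWh of IT energy."""
    return annual_water_liters / it_energy_kwh

# Hypothetical facility: 120 GWh total draw, 100 GWh to IT gear,
# 180 million liters of cooling water per year.
print(pue(120e6, 100e6))  # 1.2
print(wue(180e6, 100e6))  # 1.8 L/kWh
```

The gap the article describes is visible here: a WUE of 1.8 L/kWh is a precise engineering number, but it says nothing intuitive to a non-specialist about how much water that actually is.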

The Water Reality Behind AI Training
Modern AI datacenters consume water primarily through evaporative cooling systems, which use water to dissipate heat generated by thousands of GPUs running in parallel. A typical hyperscale datacenter can use millions of gallons annually. For context, Google's global datacenter fleet consumed approximately 5.2 billion gallons of water in 2021, according to the company's environmental report. Colossus 2, as one of the largest single facilities dedicated to AI training, likely consumes millions of gallons per year as well, though xAI has not published official figures.
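The burger comparison itself is just a unit conversion: divide a facility's annual water draw by the assumed annual draw of one restaurant. A sketch of that arithmetic, where both inputs are illustrative assumptions (neither xAI nor In-N-Out publishes these exact figures):

```python
# Back-of-envelope comparison. Every input here is an assumption chosen
# so the ratio matches the ~2.5 figure cited in the analysis; the real
# per-location and per-facility numbers are not publicly disclosed.

GALLONS_PER_LOCATION_PER_YEAR = 1_000_000   # assumed: one restaurant's annual use
DATACENTER_GALLONS_PER_YEAR = 2_500_000     # assumed: facility's annual cooling draw

equivalent_locations = DATACENTER_GALLONS_PER_YEAR / GALLONS_PER_LOCATION_PER_YEAR
print(f"{equivalent_locations:.1f} In-N-Out locations")  # 2.5 In-N-Out locations
```

The fragility of the metric is clear from the code: change either assumed constant and the headline number moves with it, which is exactly the criticism leveled at speculative comparisons built without official data.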
The In-N-Out comparison serves two purposes. First, it makes abstract consumption tangible—most people understand what a burger restaurant uses. Second, it subtly critiques the scale of AI infrastructure by comparing it to a familiar, resource-intensive business. However, this framing oversimplifies the issue. In-N-Out restaurants use water for cooking, cleaning, and customer needs, while datacenters primarily use it for cooling. The comparison also assumes "drinkable water," which isn't always the case—some datacenters use non-potable sources or recycled water.
The Metrics Battle: Tokens per Watt vs. Tokens per Burger
The analysis explicitly states: "Forget tokens/watt or tokens/dollar, it's about tokens/burger." This reflects a broader shift in how the industry measures efficiency. Tokens per watt measures computational efficiency—how many AI outputs you get per unit of energy. Tokens per dollar measures economic efficiency. But tokens per burger? That's a new, almost satirical metric that attempts to capture the "real-world" cost of AI in terms of everyday resources.
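The two conventional metrics the quote dismisses are straightforward to compute. A minimal sketch, where the throughput, power, and electricity price are hypothetical values chosen for illustration, not measurements of any real system:

```python
# Tokens-per-energy and tokens-per-dollar efficiency metrics.
# Inputs are hypothetical: 1,000 tokens/s at 700 W, $0.08/kWh electricity.

def tokens_per_kwh(tokens_per_second: float, power_watts: float) -> float:
    """Computational efficiency: tokens generated per kWh consumed."""
    tokens_per_hour = tokens_per_second * 3600
    return tokens_per_hour / (power_watts / 1000)

def tokens_per_dollar(tokens_per_kwh_value: float, price_per_kwh: float) -> float:
    """Economic efficiency: tokens generated per dollar of electricity."""
    return tokens_per_kwh_value / price_per_kwh

tpk = tokens_per_kwh(1_000, 700)
print(round(tpk))                          # ~5.1M tokens per kWh
print(round(tokens_per_dollar(tpk, 0.08))) # tokens per electricity dollar
```

A hypothetical "tokens per burger" would follow the same pattern, dividing token output by water use expressed in restaurant-equivalents, which is precisely why critics call it satirical: the math is trivial, but the denominator is arbitrary.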
Critics argue this approach is misleading. Water consumption varies dramatically by region and climate. Datacenters in arid areas like Arizona face stricter water constraints than those in water-rich regions like the Pacific Northwest. The In-N-Out comparison doesn't account for these geographical differences. Moreover, it ignores that many datacenters are increasingly using water-efficient cooling technologies, such as liquid immersion cooling or air-cooled systems, which reduce or eliminate water use.
Counter-Perspectives: Efficiency Gains and Alternative Metrics
Not everyone agrees that water consumption is the primary concern. Some industry experts point out that AI's energy efficiency is improving rapidly. For example, newer GPUs like Nvidia's H100 and Blackwell architectures deliver more performance per watt than previous generations. If AI models become more efficient, the water needed per token could decrease over time, even as total consumption rises.
Additionally, the focus on water might distract from other environmental impacts. Datacenters also consume significant energy, contribute to electronic waste, and require raw materials for construction. A holistic sustainability assessment would consider all these factors, not just water. The In-N-Out metric, while attention-grabbing, risks oversimplifying a complex issue.
The Broader Trend: Environmental Accountability in AI
The Colossus 2 analysis is part of a larger movement demanding greater transparency from AI companies. As AI models grow larger and more resource-intensive, stakeholders—from investors to regulators—are asking harder questions about environmental costs. Some companies, like Google and Microsoft, have published detailed sustainability reports, but others, particularly private firms like xAI, share less information.
This lack of transparency fuels speculative analyses like the In-N-Out comparison. Without official data from xAI, analysts must make assumptions, leading to potentially skewed comparisons. The tech community is increasingly calling for standardized environmental metrics for AI infrastructure, similar to how financial reporting has standardized measures like revenue and profit.
Looking Ahead: What Comes Next?
The debate over datacenter sustainability isn't going away. As AI continues to permeate every industry, the demand for compute will only grow. The challenge is balancing this growth with environmental responsibility. Some potential solutions include:
- Improved Cooling Technologies: Liquid immersion cooling can reduce water use by up to 90% compared to traditional evaporative systems.
- Renewable Energy Integration: Pairing datacenters with solar or wind farms can reduce the carbon footprint, though water use remains a separate issue.
- Location Strategy: Building datacenters in cooler climates or near water sources can minimize environmental impact.
- Regulatory Pressure: Governments may impose stricter water usage limits on datacenters, especially in drought-prone regions.
The In-N-Out comparison, while quirky, serves as a reminder that the AI industry's environmental impact can't be hidden behind technical jargon. As the technology evolves, so must the metrics we use to measure its costs—whether in tokens, watts, or burgers.