
Your Next AI Query May Travel Where the Power Is

TL;DR

The rise of electricity-guzzling data centers has forced the AI industry to get creative about finding power. Nvidia is teaming up with InfraPartners, Prologis, and the nonprofit EPRI to build about 25 micro data centers (5–20 MW each) next to substations operated by five US utilities. Compute shifts automatically to wherever spare power is available: if one substation is overloaded or offline, the workload moves to another with capacity.

Nauti's Take

Promising approach: instead of building monster data centers, Nvidia spreads AI workloads across 25 smaller sites wired directly into the grid, easing pressure on overloaded substations and making the network more resilient. The catch: 5–20 MW each is still a substantial load for a single substation, and shiftable compute only suits training and batch jobs, not latency-sensitive, real-time inference.

A solid pilot for utilities and hyperscalers; teams that need predictable latency should watch carefully.
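To make the "compute follows power" idea concrete, here is a minimal sketch of a greedy power-aware placement policy. Everything in it (the Site class, place_job, the capacity numbers) is an illustrative assumption, not Nvidia's, EPRI's, or any utility's actual scheduler; a real system would also weigh data locality, checkpoint cost, and grid signals.

```python
from dataclasses import dataclass

@dataclass
class Site:
    """A micro data center attached to one utility substation."""
    name: str
    capacity_mw: float   # interconnection limit at the substation
    load_mw: float       # current draw from running jobs
    online: bool = True

    @property
    def headroom_mw(self) -> float:
        # Spare power available right now; zero if the substation is offline.
        return self.capacity_mw - self.load_mw if self.online else 0.0

def place_job(sites: list[Site], job_mw: float) -> Site | None:
    """Send a batch job to the online site with the most spare capacity.

    Returns None (job waits in the queue) if no substation can absorb it.
    """
    candidates = [s for s in sites if s.headroom_mw >= job_mw]
    if not candidates:
        return None
    best = max(candidates, key=lambda s: s.headroom_mw)
    best.load_mw += job_mw
    return best

# Hypothetical example: three 5-20 MW micro sites, one overloaded, one offline.
sites = [
    Site("substation-a", capacity_mw=20, load_mw=19),               # nearly full
    Site("substation-b", capacity_mw=10, load_mw=2),
    Site("substation-c", capacity_mw=5,  load_mw=1, online=False),  # offline
]
chosen = place_job(sites, job_mw=4.0)
print(chosen.name if chosen else "deferred")  # -> substation-b
```

This greedy rule only makes sense for work that tolerates being moved or paused, which is exactly the training-versus-inference caveat above: a latency-bound service can't chase spare megawatts across the grid.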

Sources