This surge in data center construction is not without consequences. The energy required to train and operate AI models is immense; for instance, training a single AI model like the one behind ChatGPT can consume as much electricity as 100 homes do in a year. Such demands are straining local power grids, prompting fears of higher electricity bills for residents and complicating planning for utilities already struggling to modernize. Yet the concerns don’t stop at energy usage.
Gross’s reporting raises a deeper and more troubling question: Who pays for all of this? Across the country, local governments are offering generous subsidies to attract data centers, much as they’ve done for stadiums, hotels and film productions. These deals often include tax breaks, infrastructure improvements and other incentives—all paid for with public dollars. And while the pitch is always that the investment will pay off in jobs and economic growth, those returns are rarely guaranteed.
What’s lost in the excitement is the cost to taxpayers. Local officials, chasing headlines and ribbon-cuttings, may forget that their first job is to provide reliable services to the people who already live there. Every dollar handed to a tech company is a dollar not spent on filling potholes, improving schools or maintaining fire stations. These trade-offs aren’t abstract—they show up in the form of crowded classrooms, slow emergency response times and rising property taxes.
We all benefit from technological progress, and the promise of AI is real. But if the cost of that progress is gutting our civic infrastructure to chase tomorrow’s industry, we ought to think twice. Public money should serve public needs. That’s not anti-growth or anti-technology; it’s simply good governance. As always, the challenge isn’t choosing between growth and no growth. It’s deciding who benefits, who pays and whether the trade is worth it.