
ZDNET's key takeaways
- The data centers will collectively provide 7 gigawatts of power.
- They're the latest phase of the so-called Project Stargate.
- Data centers require massive quantities of water for cooling.
OpenAI is pushing forward with its mission to power increasingly large AI models with the construction of five new data centers across the US, the company announced Tuesday.
Also: How Nvidia and OpenAI's staggering $100 billion deal could fuel a new age of AI
The announcement of the new data centers marks the latest development in Project Stargate, the AI infrastructure plan OpenAI launched in January alongside the Trump administration, Oracle, SoftBank, and Emirati investment firm MGX. The project secured $500 billion in funding to build data centers over the next four years that will cumulatively draw 10 gigawatts of power -- enough to power roughly 7.5 million homes.
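The homes figure is a rough conversion from power capacity to household demand. A minimal back-of-envelope sketch, assuming an average US household uses about 10,800 kWh of electricity per year (our ballpark assumption, not a figure from OpenAI's announcement):

```python
# Back-of-envelope check of the "10 gigawatts ~ 7.5 million homes" claim.
# The household consumption figure below is an assumption for illustration.

TOTAL_POWER_GW = 10
AVG_HOME_KWH_PER_YEAR = 10_800  # assumed average US household usage
HOURS_PER_YEAR = 8_760

# Convert annual energy use to average continuous power draw per home.
avg_home_kw = AVG_HOME_KWH_PER_YEAR / HOURS_PER_YEAR  # roughly 1.23 kW

# Number of homes the full 10 GW could supply at that average draw.
homes_powered = (TOTAL_POWER_GW * 1_000_000) / avg_home_kw

print(f"~{homes_powered / 1_000_000:.1f} million homes")
```

This lands in the same ballpark as the announcement's figure; the exact number depends on the household-consumption assumption, and peak demand would support fewer homes than the average-draw estimate suggests.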
The five new data centers -- which will be built, respectively, in Shackelford County, Texas; Dona Ana County, New Mexico; Lordstown, Ohio; Milam County, Texas; and an undisclosed location in the Midwest -- will provide 7 gigawatts of power to be put toward OpenAI's proprietary AI models. According to OpenAI, that number keeps Stargate "ahead of schedule" to get to 10 gigawatts by the end of 2025.
"Together, these sites are expected to create over 25,000 onsite jobs, and tens of thousands of additional jobs across the US," OpenAI wrote in its announcement, adding that it's still evaluating additional sites across the country. US chipmaker Nvidia announced on Monday that it would invest up to $100 billion in OpenAI for the construction of more data centers.
Also: Will AI damage human creativity? Most Americans say yes
(Disclosure: Ziff Davis, ZDNET's parent company, filed an April 2025 lawsuit against OpenAI, alleging it infringed Ziff Davis copyrights in training and operating its AI systems.)
A race to build
As tech companies' ambitions amid the ongoing AI boom have escalated, so too have their energy bills.
Also: Worried about AI's soaring energy needs? Avoiding chatbots won't help - but 3 things could
It takes a lot of energy to train and run AI models. US developers have been locked in a race with China to stay ahead in AI development, which means building massive local data centers filled with graphics processing units (GPUs) to power AI tools like ChatGPT and Gemini and to develop more advanced models. That race has transformed US chipmaker Nvidia into the most valuable company on the planet.
It also has major environmental implications. Data centers are prone to overheating, which means they require huge quantities of water for cooling. On top of that, they're often constructed in arid Western states, which have abundant land but are also prone to seasonal droughts. Many data centers are powered by fossil fuels, adding greenhouse gas emissions that exacerbate climate change.
Microsoft announced plans last September to reopen Three Mile Island, the nuclear power plant that was the site of an infamous partial reactor meltdown in 1979, to power its own AI ambitions.
Also: These consumer-facing industries are the fastest adopters of AI agents
OpenAI CEO Sam Altman and other AI industry leaders tend to justify the economic and ecological costs of building data centers by arguing that the technology's long-term benefits will be worth doing everything within our power right now to build infrastructure as quickly and as extensively as possible.
As Altman wrote in a personal blog post on Tuesday: "Maybe with 10 gigawatts of compute, AI can figure out how to cure cancer. Or with 10 gigawatts of compute, AI can figure out how to provide customized tutoring to every student on earth. If we are limited by compute, we'll have to choose which one to prioritize; no one wants to make that choice, so let's go build."
"Maybe" is very much the operative word there.