- OpenAI invests $500 billion in Stargate, funding huge AI data centers
- Each Stargate site receives a community plan tailored to local needs
- Cloud and web hosting at these sites could benefit from predictable energy costs
OpenAI has unveiled a plan aimed at limiting the impact of its Stargate data centers on local electricity costs.
The new guidelines will see each site operate under a community plan developed with input from residents and regulators.
This approach includes directly financing new energy and storage infrastructure or investing in energy generation and transmission resources as needed.
Electricity investments aim to ease local grid strain
The goal is to ensure that local utility bills do not increase due to the operations of these large-scale data centers.
The Stargate initiative is a $500 billion, multi-year program to build AI data centers across the United States, supporting AI training and inference and handling some of the industry's most demanding computational workloads.
OpenAI's efforts mirror moves by other technology companies, such as Microsoft, which recently announced measures to reduce water use and limit impacts on electricity costs in its own data centers.
By financing energy infrastructure and working closely with local utilities, these companies aim to avoid additional financial burdens on surrounding communities.
Each Stargate site will have a customized plan that reflects the specific energy requirements of its location.
This could involve funding the installation of additional energy storage systems or expanding local generation capacity.
OpenAI says it will fully cover energy costs resulting from its operations rather than passing them on to residents or businesses.
Cloud and web hosting customers at these sites should benefit from predictable operating costs, while AI tools can run at scale without straining local infrastructure.
Reports indicate that AI-powered data centers could nearly triple electricity demand in the United States by 2035, putting pressure on regional power grids and raising utility bills for consumers.
U.S. lawmakers have criticized technology companies for relying on utilities while residential and small business customers absorb the cost of grid upgrades.
The volatile power demand of AI workloads, such as running large language models or other cloud-based AI services, further complicates grid planning.
Without proactive investment, electricity costs could rise dramatically in regions that host multiple data centers.
OpenAI's community plan also reflects the growing challenge of energy access for AI development.
Large-scale AI tools consume much more power than typical cloud services or web hosting workloads, making infrastructure planning essential.
By directly funding energy upgrades and coordinating with local utilities, OpenAI aims to reduce risks to both the power grid and nearby communities.
Via Bloomberg