
The Growing Energy Crisis in AI Training
Artificial intelligence development is hitting a serious energy wall, and the situation is deteriorating faster than many realize. According to Greg Osuri, founder of Akash Network, as AI models continue to grow in size and complexity, training them may soon require power output comparable to that of an entire nuclear reactor. This is not just theoretical speculation: data centers already draw hundreds of megawatts of fossil fuel power, and the environmental costs are mounting.
Osuri shared these concerns in an interview at Token2049 in Singapore, arguing that the industry significantly underestimates how quickly compute demand is doubling. That trend could trigger a broader energy crisis, raising household power bills and adding millions of tons of new emissions each year. He went as far as to say that concentrated fossil fuel use around data hubs is already harming the health of nearby communities.
Decentralization as a Potential Solution
The problem is becoming increasingly visible. Recent Bloomberg reporting highlights how AI data centers are driving up power costs in the US, with wholesale electricity prices climbing 267% over five years in areas near these facilities. This is not just a tech-company problem: everyday households are feeling the impact through rising energy bills.
Osuri believes the alternative lies in decentralization. Instead of concentrating chips and energy in massive data centers, he proposes distributed training across networks of smaller, mixed GPUs, ranging from high-end enterprise chips to gaming cards in home PCs, potentially unlocking both efficiency and sustainability benefits. The concept resembles Bitcoin's early mining days, when ordinary users could contribute processing power and get rewarded.
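To make the idea concrete, here is a minimal sketch of one way a coordinator could split a training step across a mixed pool of GPUs, giving each device a share of the global batch proportional to its measured throughput. This is an illustration only, not a description of Akash's system; the device names, throughput figures, and the proportional-split rule are all assumptions.

```python
# Hypothetical capacity-aware batch splitting across heterogeneous GPUs.
# Not Akash Network's actual scheduler; names and numbers are illustrative.

from dataclasses import dataclass


@dataclass
class Device:
    name: str
    samples_per_sec: float  # benchmarked throughput for this model


def split_global_batch(devices: list[Device], global_batch: int) -> dict[str, int]:
    """Assign each device a slice of the global batch proportional to its speed."""
    total = sum(d.samples_per_sec for d in devices)
    shares = {d.name: int(global_batch * d.samples_per_sec / total) for d in devices}
    # Hand any rounding remainder to the fastest device.
    remainder = global_batch - sum(shares.values())
    fastest = max(devices, key=lambda d: d.samples_per_sec)
    shares[fastest.name] += remainder
    return shares


if __name__ == "__main__":
    pool = [
        Device("datacenter-H100", 900.0),   # enterprise accelerator
        Device("workstation-4090", 300.0),  # prosumer card
        Device("gaming-3060", 80.0),        # home gaming PC
    ]
    print(split_global_batch(pool, global_batch=4096))
    # {'datacenter-H100': 2880, 'workstation-4090': 960, 'gaming-3060': 256}
```

In practice, the hard problems Osuri alludes to sit underneath a split like this: synchronizing gradients over slow consumer internet links and verifying that each device actually did the work it claims.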
“Once incentives are figured out, this will take off like mining did,” Osuri noted, adding that home computers may eventually earn tokens by providing spare compute power. This approach could give everyday people a stake in AI’s future while lowering costs for developers.
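The token-earning idea can also be sketched in a few lines. The snippet below is a toy model, assumed for illustration rather than taken from Akash Network: contributors of spare compute split a fixed pool of tokens in proportion to the verified work they submitted in a given period.

```python
# Toy reward distribution for compute contributors: pro rata by verified GPU-hours.
# Purely hypothetical; not Akash Network's actual reward logic.

def distribute_rewards(epoch_pool: float, verified_gpu_hours: dict[str, float]) -> dict[str, float]:
    """Split epoch_pool tokens among contributors in proportion to verified GPU-hours."""
    total_hours = sum(verified_gpu_hours.values())
    if total_hours == 0:
        return {contributor: 0.0 for contributor in verified_gpu_hours}
    return {
        contributor: epoch_pool * hours / total_hours
        for contributor, hours in verified_gpu_hours.items()
    }


if __name__ == "__main__":
    # Illustrative contributors and numbers only.
    contributions = {"alice": 12.0, "bob": 3.0, "cloud-node-7": 45.0}
    print(distribute_rewards(epoch_pool=1_000.0, verified_gpu_hours=contributions))
    # {'alice': 200.0, 'bob': 50.0, 'cloud-node-7': 750.0}
```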
Technical and Economic Challenges Remain
Despite the potential, significant challenges exist. Training large-scale models across different types of GPUs requires breakthroughs in software and coordination, problems the industry is only beginning to address. Osuri noted that about six months ago several companies began demonstrating pieces of distributed training, but no one has yet integrated all of those pieces to train a complete model.
Perhaps the bigger challenge lies in creating fair incentive systems. “The hard part is incentive,” Osuri explained. “Why would someone give their computer to train? What are they getting back? That’s a harder challenge to solve than the actual algorithm technology.”
Still, Osuri insists that decentralized AI training is becoming a necessity rather than an option. Spreading workloads across global networks could ease pressure on energy grids, reduce carbon emissions, and create a more sustainable AI economy. The question is whether the technology and incentive structures can mature quickly enough to keep pace with the growing energy concerns.