“Smart” technology, brutal consumption – The hidden price of AI
The invisible burden of AI: increasingly expensive, increasingly dirty
Artificial intelligence is no longer the future – it is the present. Images, texts, and code are generated in seconds; chatbots assist in shopping, healthcare, and travel planning. While the digital world is visibly transforming, a less discussed yet crucial issue comes to the fore: what powers all this?
The answer: electricity. Lots of it.
More and more data centers are being built globally, with increasing capacity and increasingly intensive machine learning processes. The global AI infrastructure already rivals the electricity consumption of entire countries – and this is just the beginning.
Opaque systems, runaway costs
According to the latest analysis by MIT Technology Review, data on AI energy consumption are alarmingly incomplete. After the magazine processed dozens of reports and interviewed industry experts, the conclusion was clear: neither the public nor regulators have a clear picture of how AI-powered services impact energy systems.
As the technology evolves, so must its power supply: Microsoft and Meta are working on launching new nuclear plants, while OpenAI’s Stargate program plans to build ten supercomputer centers, each consuming as much electricity as an entire U.S. state. Google alone has earmarked $75 billion for AI infrastructure in 2025.
One query – global footprint
How much energy does a single AI query require? At first glance, it seems negligible: generating an image, composing a response, suggesting an idea. But when these actions occur billions of times daily, the cumulative effect becomes massive. Moreover, AI is no longer confined to chatbots: it permeates search engines, recommender systems, digital assistants – virtually everything.
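The scale argument above is easy to check with back-of-envelope arithmetic. In the sketch below, both the per-query energy figure and the daily query count are illustrative assumptions, not numbers from this article:

```python
# Back-of-envelope: cumulative energy of "billions" of small AI queries.
# Both constants are illustrative assumptions, not figures from the article.
WH_PER_QUERY = 0.3        # assumed energy per query, in watt-hours
QUERIES_PER_DAY = 1e9     # "billions of times daily"

daily_mwh = WH_PER_QUERY * QUERIES_PER_DAY / 1e6   # Wh -> MWh
annual_gwh = daily_mwh * 365 / 1e3                 # MWh/day -> GWh/year

print(daily_mwh, annual_gwh)  # 300.0 MWh per day, 109.5 GWh per year
```

Even under these modest assumptions, a fraction of a watt-hour per query adds up to roughly the annual output of a mid-sized power plant.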
The issue isn’t just the volume of consumption but also the source. As demand surges, providers increasingly rely on quickly accessible yet polluting energy sources – such as natural gas – to meet capacity needs. This occurs just as the world aims to cut carbon emissions.
Training vs. inference – where does the real burden lie?
Before an AI model can answer a question or generate an image, it must first learn how to do it – a process known as training, which is extremely energy-intensive. For example, training GPT-4 reportedly cost around $100 million and consumed 50 gigawatt-hours of electricity – equivalent to three days of power usage for San Francisco.
However, a Microsoft researcher notes that today 80–90% of energy consumption comes not from training, but from inference – the process of serving user requests. So the real burden isn’t just building AI models, but using them at scale.
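Rough arithmetic illustrates what such a split would imply. The sketch below applies the midpoint of the cited 80–90% inference share to a single model's lifetime energy, which is a simplifying assumption for illustration only:

```python
# Rough illustration: what an ~85% inference share would imply for
# lifetime energy, using the GPT-4 training figure cited above.
# Applying the share to a single model is a simplifying assumption.
TRAINING_GWH = 50.0       # reported GPT-4 training consumption
INFERENCE_SHARE = 0.85    # midpoint of the 80-90% estimate

total_gwh = TRAINING_GWH / (1 - INFERENCE_SHARE)
inference_gwh = total_gwh - TRAINING_GWH

print(round(total_gwh, 1), round(inference_gwh, 1))
```

Under these assumptions, serving the model would consume several times more energy over its lifetime than the training run that made headlines.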
Proliferating data centers, hidden infrastructure
Roughly 3,000 data centers operate in the United States, most run by tech giants like Amazon and Microsoft, though smaller AI startups also utilize them. These facilities are packed with servers and cooling systems, and increasingly optimized for AI tasks.
Yet, where these centers are located, how much energy they use, and the sources of that electricity often remain undisclosed. This lack of transparency hampers the ability of utility companies and regulators to plan infrastructure – potentially leading to higher energy costs for consumers and businesses alike.
AI demands new rules and oversight
While data center electricity use remained nearly flat between 2005 and 2017 due to efficiency gains, the AI boom changed everything: between 2017 and 2023, data center energy consumption doubled, reaching 4.4% of all U.S. electricity usage. The carbon intensity of this electricity is 48% higher than the national average.
According to the Lawrence Berkeley National Laboratory, by 2028, more than half of data center energy usage will be AI-related – equivalent to 22% of the electricity used by all U.S. households.
What comes next? Transparency and regulation are key
AI proliferation cannot – and perhaps should not – be stopped. But how we build the infrastructure to support its energy needs is still up for debate. Where will data centers be located? What energy sources will they rely on? Who will pay the bill?
Currently, regulation and transparency lag behind technology. Futurist Árpád Rab warns that AI’s energy hunger is one of the greatest but least visible threats. While the benefits of the technology are clear, its environmental and economic costs require much closer scrutiny.