Gemini, Google’s new family of large language models, powers the conversational text featured throughout AI Overviews. These systems are powerful, but they are not perfect. In the weeks since the feature’s debut, it has recommended gluing cheese to pizza and changing a car’s blinker fluid, for instance. But these early flubs, which Google chalked up to misinterpretations or a lack of data, have so far overshadowed another possible drawback: the financial and environmental cost of AI search. Embedding generative AI in such a popular and widely used application is sure to exacerbate the tech sector’s growing appetite for fossil fuels and water.
The Dark Side of AI
Even those who’ve been building the tech have raised alarms about the damage an unchecked rollout of the technology could wreak on societies and economies across the world. Dario Amodei, CEO of Anthropic and former vice president of research at OpenAI, told CNN’s Anderson Cooper that AI could eliminate up to 50 percent of jobs, many of them held by college-educated workers, within the next five years. Geoffrey Hinton, often referred to as the godfather of AI, echoed this concern on the Diary of a CEO podcast, warning that AI will reduce the demand for labor by making the remaining workers more productive.
ChatGPT’s Environmental Footprint
A lesser-known but equally alarming worry about AI is the environmental toll it takes. Paul Hoffman, writing for BestBrokers, estimated that ChatGPT alone consumes around 1.059 TWh (terawatt-hours) of electricity annually, a greater energy demand than that of countries like Guyana, Rwanda, and Barbados. Hoffman helpfully puts this figure into perspective: it’s enough to power every electric car in the U.S. (about 3.3 million in 2023) four times over, or to power more than 100,000 American homes for a year.
AI startup OpenAI made waves recently when it announced that its revolutionary chatbot, ChatGPT, had hit 300 million weekly active users, double the figure reported in September 2023. ChatGPT’s incredible rate of adoption has made it one of the fastest-growing apps in history. In addition to providing detailed answers to everyday questions, the tool can write essays in seconds, hold human-like conversations, create images, work through intricate math problems, translate foreign-language documents, read and summarize lengthy articles, and even write computer code. And as ChatGPT chews through all this energy, it apparently does so at nearly an order of magnitude more energy per query than a typical Google search.
With this in mind, the team at BestBrokers decided to put into perspective the considerable amount of electric power OpenAI’s chatbot uses to respond to prompts each year. We calculated what that would cost at the average U.S. commercial electricity price of $0.132 per kWh as of October 2024 (the most recent rate published by the U.S. Energy Information Administration). The math works out to roughly 1.059 billion kilowatt-hours per year just for answering questions, at a cost of about $139.7 million.
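For readers who want to check the arithmetic, here is a minimal sketch of that estimate in Python, assuming the per-query figure of 0.0029 kWh and the roughly 1 billion daily queries cited later in the piece:

```python
# Back-of-the-envelope estimate of ChatGPT's annual inference energy
# and electricity cost (assumed inputs from the article, not measurements).

KWH_PER_QUERY = 0.0029            # estimated energy per ChatGPT prompt (kWh)
QUERIES_PER_DAY = 1_000_000_000   # ~1B user messages per day (OpenAI, Dec 2024)
PRICE_PER_KWH = 0.132             # avg. U.S. commercial rate, Oct 2024 (USD)

daily_kwh = KWH_PER_QUERY * QUERIES_PER_DAY   # ~2.9 million kWh per day
annual_kwh = daily_kwh * 365                  # ~1.06 billion kWh per year
annual_cost = annual_kwh * PRICE_PER_KWH      # ~$139.7 million per year

print(f"Daily energy:  {daily_kwh:,.0f} kWh")
print(f"Annual energy: {annual_kwh / 1e9:.2f} billion kWh")
print(f"Annual cost:   ${annual_cost / 1e6:.1f} million")
```

Rounding the annual total gives the 1.059 TWh figure quoted above.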
Electric Power Research Institute Insights
When you send a prompt to an AI model like ChatGPT that has already been trained, the act of using the model is called inference. Your question is run through the model’s hundreds of billions of parameters, which lets the model identify patterns and predict the most relevant response based on all the data it has ingested. Estimating the energy needed for that process is difficult, though, because it is highly variable, depending on factors like query length, the number of users, and how heavily the model has been optimized. In nearly all cases, this data is shrouded in secrecy. Still, we can do some back-of-the-envelope math to get a sense of the scale of the additional electricity required.
According to the Electric Power Research Institute (EPRI), each question posed to ChatGPT consumes an estimated 0.0029 kilowatt-hours of power. That is nearly ten times the energy needed for a typical Google search, which consumes about 0.0003 kilowatt-hours per query.
On December 4, 2024, OpenAI’s official account on X (formerly Twitter) announced that ChatGPT had reached 300 million weekly active users, a user base reportedly only half that size when CEO Sam Altman told Congress it stood at 150 million in September 2023. Those 300 million users generate about 1 billion queries a day. Answering them consumes about 2.9 million kilowatt-hours of energy daily. For reference, the typical U.S. household uses about 29 kilowatt-hours of electricity a day, which means ChatGPT burns through as much energy every day as roughly 100,000 average homes.
Fresh numbers shared by @sama earlier today:
300M weekly active ChatGPT users
1B user messages sent on ChatGPT every day
1.3M devs have built on OpenAI in the US
— OpenAI Newsroom (@OpenAINewsroom) December 4, 2024
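A quick sanity check on those comparisons, using the same assumed figures (0.0029 kWh per ChatGPT query, 0.0003 kWh per Google search, and 29 kWh per day for a typical U.S. household):

```python
# Compare ChatGPT's per-query and daily energy use against a Google
# search and an average U.S. household (assumed figures from the article).

CHATGPT_KWH_PER_QUERY = 0.0029    # EPRI estimate
GOOGLE_KWH_PER_QUERY = 0.0003     # EPRI estimate
HOUSEHOLD_KWH_PER_DAY = 29        # typical U.S. household
QUERIES_PER_DAY = 1_000_000_000   # OpenAI, Dec 2024

ratio = CHATGPT_KWH_PER_QUERY / GOOGLE_KWH_PER_QUERY   # ~9.7x a Google search
daily_kwh = CHATGPT_KWH_PER_QUERY * QUERIES_PER_DAY    # ~2.9 million kWh
households = daily_kwh / HOUSEHOLD_KWH_PER_DAY         # ~100,000 homes

print(f"One ChatGPT query = {ratio:.1f} Google searches")
print(f"Daily use = {households:,.0f} average U.S. households")
```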
While Google’s AI answers may initially cause a big jump in the search engine’s energy costs, the costs may begin to decrease again as engineers figure out how to make the system more efficient, says Rama Ramakrishnan, a professor at the MIT Sloan School of Management with expertise in the tech industry.
It’s hard to tell what this implies for the cost of generative AI versus the traditional search experience, Ramakrishnan notes. “My prediction is that it’s likely to increase, but it’s likely not to increase tremendously.”
Whether this added expense is justified will largely hinge on how much value users find in the improved AI answers. Links shown in AI Overviews receive significantly more clicks than those in standard search results, and users tend to spend more time on those sites, a sign that they are satisfied with what they find.
The Training Costs of ChatGPT
While serving LLMs for inference requires enormous energy, the process of training them is even more intensive. In this stage, the AI model “learns” by studying vast quantities of data and examples. Depending on the amount of data and the complexity of the model, this process can take anywhere from a few minutes to several months. Throughout training, the CPUs and GPUs (the electronic chips specifically built to chew through enormous datasets) run around the clock, devouring huge amounts of energy.
Power isn’t the only resource AI guzzles; water plays a big role too. Training GPT-3 reportedly used 700,000 liters of freshwater, roughly the amount needed to manufacture about 320 Tesla cars, and inference uses roughly 0.5 liters per 20–50 answers. Daily inference across ChatGPT could therefore consume 20–30 million liters, enough drinking water for millions of people.
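Taking the reported figure of 0.5 liters per 20–50 responses at face value, the daily water math looks roughly like this. Depending on where in that range you land, the result sits in the tens of millions of liters a day, the same ballpark as the estimate above:

```python
# Rough estimate of ChatGPT's daily freshwater use for inference,
# based on the reported 0.5 L per 20-50 responses (assumed inputs).

QUERIES_PER_DAY = 1_000_000_000
LITERS_PER_BATCH = 0.5
BATCH_HIGH, BATCH_LOW = 50, 20     # responses served per 0.5 L of water

low = QUERIES_PER_DAY / BATCH_HIGH * LITERS_PER_BATCH    # ~10 million L/day
high = QUERIES_PER_DAY / BATCH_LOW * LITERS_PER_BATCH    # ~25 million L/day

print(f"Estimated daily water use: {low / 1e6:.0f}-{high / 1e6:.0f} million liters")
```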
As an example, training OpenAI’s GPT-3, which has 175 billion parameters, took approximately 34 days and consumed an estimated 1,287,000 kilowatt-hours of electricity. As models grow more complex, their energy requirements grow too: with over 1 trillion parameters, GPT-4 consumed an estimated 62,318,800 kilowatt-hours over its 100 days of training, 48 times more than GPT-3.
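To get a feel for what those totals mean as continuous power draw, here is a back-of-the-envelope conversion (the durations and totals are the published estimates quoted above, not measurements):

```python
# Convert estimated total training energy into average power draw.

def avg_power_mw(total_kwh: float, days: float) -> float:
    """Average power draw in megawatts over the training run."""
    hours = days * 24
    return total_kwh / hours / 1000   # kWh per hour -> kW -> MW

gpt3 = avg_power_mw(1_287_000, 34)     # ~1.6 MW, around the clock
gpt4 = avg_power_mw(62_318_800, 100)   # ~26 MW, around the clock

print(f"GPT-3 training: ~{gpt3:.1f} MW continuous")
print(f"GPT-4 training: ~{gpt4:.1f} MW continuous")
print(f"Energy ratio GPT-4/GPT-3: {62_318_800 / 1_287_000:.0f}x")
```

In other words, training GPT-4 was roughly equivalent to running a small power plant flat-out for more than three months.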
In truth, the o1 model’s unique training approach is what really distinguishes it from past models. It was created through a combination of a new deep neural network optimization algorithm and a proprietary dataset built for the model. Unlike previous GPT models, which only learn to copy patterns in their training data, o1 was trained to figure things out independently through reinforcement learning, a method that teaches the system by rewarding and penalizing its outputs. It then applies a “chain of thought” to incoming queries, breaking them down step by step the way humans approach complex challenges. This approach lets o1 iterate on its decision-making process, test new strategies, and adapt based on successes and failures.
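OpenAI has not published the details of that training loop, but the reward-and-penalty idea can be illustrated with a toy sketch. The snippet below is entirely hypothetical and is not OpenAI’s code or method; it is a bare-bones epsilon-greedy loop in which problem-solving strategies that produce correct answers get reinforced:

```python
import random

# Toy illustration of reinforcement learning (hypothetical, not OpenAI's
# method): the learner tries strategies, earns a reward when the answer
# checks out, and shifts its preference toward whatever works.

strategies = ["direct answer", "break into steps", "check each step"]
counts = {s: 0 for s in strategies}
avg_reward = {s: 0.0 for s in strategies}

def attempt(strategy: str) -> bool:
    """Stand-in for solving a problem; careful strategies succeed more often."""
    success_rate = {"direct answer": 0.3, "break into steps": 0.6,
                    "check each step": 0.8}[strategy]
    return random.random() < success_rate

for _ in range(2000):
    # Explore a random strategy 10% of the time; otherwise exploit the best.
    if random.random() < 0.1:
        choice = random.choice(strategies)
    else:
        choice = max(avg_reward, key=avg_reward.get)
    reward = 1.0 if attempt(choice) else -1.0    # reward or penalty
    counts[choice] += 1
    # Incrementally update the running average reward for this strategy.
    avg_reward[choice] += (reward - avg_reward[choice]) / counts[choice]

print(max(avg_reward, key=avg_reward.get))   # typically "check each step"
```

Over many trials, the learner’s estimates converge on the most careful strategy, loosely mirroring how rewarding good reasoning steps can steer a model toward step-by-step problem solving.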
Calculating Potential OpenAI Profits from Paid Subscriptions
At a rough estimate of at least $140 million annually, what OpenAI currently pays to answer user queries isn’t much compared to what it expects to bring in from paid subscriptions, or so the company hopes. In September, OpenAI COO Brad Lightcap announced that ChatGPT had surpassed 11 million paid subscribers. About a million of those were on business-oriented plans, i.e. Enterprise and Team users, and the remaining roughly 10 million were paying for the Plus plan.
Then on December 5, a brand-new $200-per-month Pro plan was released, giving users unlimited access to the o1 and o1-mini models, GPT-4o, and Advanced Voice. It also featured the o1 pro mode model, plus unlimited access to the Sora AI video generator and higher API rate limits. Though $200 a month sounds outrageous for anyone but power users, the deal was likely a steal for software developers or users churning out videos with the (mostly) unlimited AI features included in the Pro plan.
Using the reported data (1 million business subscriptions, 75 percent of them Team users, as of September 2024, plus 10 million Plus subscriptions), we calculate an average of 11.11 million paying subscribers by January 2025. Given the Pro plan’s high price and its December launch, we estimate about 10 thousand Pro users, bringing in roughly $2 million a month for the company.
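As a rough cross-check, here is the subscription arithmetic in one place. The Plus and Pro prices are published; the Team and Enterprise per-seat figures below are assumptions for illustration only:

```python
# Rough monthly subscription revenue sketch. Plus ($20) and Pro ($200)
# prices are published; Team and Enterprise per-seat prices are assumed.

PLUS_SUBS, PLUS_PRICE = 10_000_000, 20    # $20/month (published)
PRO_SUBS, PRO_PRICE = 10_000, 200         # $200/month (published)
TEAM_SUBS, TEAM_PRICE = 750_000, 30       # assumed $30/seat/month
ENT_SUBS, ENT_PRICE = 250_000, 60         # assumed enterprise rate

revenue = (PLUS_SUBS * PLUS_PRICE + PRO_SUBS * PRO_PRICE
           + TEAM_SUBS * TEAM_PRICE + ENT_SUBS * ENT_PRICE)

print(f"Estimated monthly subscription revenue: ${revenue / 1e6:.0f} million")
print(f"Pro plan alone: ${PRO_SUBS * PRO_PRICE / 1e6:.0f} million/month")
```

Even under conservative per-seat assumptions, subscription revenue on this scale dwarfs the estimated $140 million annual electricity bill for answering queries.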
ChatGPT may be the flagship application, but it is now clear that generative AI technologies as a whole are building an enormous carbon footprint, one that is already being felt and measured.
Scientific American reported earlier this month that the total energy demand of AI servers represents around 1.5 percent of global energy use. The researchers predict this figure will double by the end of next year as more people and companies adopt AI at greater scale. “Generative AI in particular is projected to use 10 times the energy in 2026 compared to 2023,” the researchers cautioned, noting that companies like Google, which have invested heavily in the technology, are already pouring money into physical energy infrastructure to ensure they can accommodate these growing power demands.