Nearly everyone is familiar with ChatGPT, an AI language model developed by OpenAI that understands text and generates responses that are eerily human-like. One of the leading apps in Apple's App Store, ChatGPT saw its user base surge past 400 million weekly users in February 2025, up from 300 million in December of the previous year, highlighting the rapid adoption of artificial intelligence tools, according to Reuters. This growth rate exceeds that of TikTok and Instagram, positioning it as one of the fastest-growing consumer applications in history.

How does ChatGPT work?

The AI model is trained on a vast dataset of human-generated text, including books, articles, and conversational data. During this pretraining phase, it learns to predict the next word, or token, based on the context that precedes it—hence the name Chat Generative Pre-trained Transformer.

Next, the model is fine-tuned to enhance the accuracy, effectiveness, and safety of its responses. OpenAI's stated mission, to ensure that artificial intelligence benefits all of humanity, has been cited as a key influence on its decision to create a conversational chatbot accessible to everyone. The company was originally founded as a nonprofit for that reason, but later adopted a hybrid structure with a for-profit arm to raise the capital needed to further develop the technology.

Once trained, the model is deployed for real-world application. Users interact with the chatbot by providing prompts such as asking a question, making a statement, or offering any other type of input. When the model receives this new, unseen data and generates a response, it is engaging in a process known as inference.

This process is powered by data centers—facilities that house powerful computer servers. While OpenAI has not disclosed the exact number of centers supporting ChatGPT, the scale is likely substantial given user demand. According to Futurism, OpenAI spends up to $700,000 a day on these operations, though these estimates have not been independently verified.

Data centers consume large amounts of energy to power and cool servers. These servers rely on electricity to process data, store information, and manage user requests. As servers generate heat during operation, cooling systems such as air conditioning or liquid cooling are necessary to maintain safe temperatures and ensure efficient performance. All of this requires a substantial amount of energy, especially at scale.

The environmental impact of this energy consumption has drawn increasing attention from researchers and analysts. Steven Gonzalez Monserrate explores the ecological consequences of data centers and the scale of energy required for the power-intensive processes described above. One key takeaway he highlights is that the carbon footprint of the Cloud has been estimated to rival or exceed that of the airline industry. In fact, a single data center can consume as much electricity as 50,000 homes.

Sam Altman, CEO of OpenAI, has acknowledged the environmental implications of artificial intelligence, especially in terms of energy usage. During the World Economic Forum in January 2024, he said that future AI progress will demand significant energy resources, which could place pressure on global energy systems. He also emphasized the need for advancements in clean energy technologies, such as nuclear fusion and solar power, to support AI’s increasing energy requirements.

Most of the energy used by AI models is consumed during the training phase, which requires substantial computational power over extended periods. This process involves thousands of powerful GPUs (Graphics Processing Units) working in parallel to process large volumes of data. While inference is still energy-intensive, each individual request demands far less computational power than training does. Some estimates attribute 90% or more of a model's total energy consumption to training.

To put this into perspective, consider the smartphone in your pocket. Manufacturing a single smartphone is the most energy-intensive part of its lifecycle, requiring the mining of rare earth metals, factory production, and global shipping before it is ever used. Daily usage adds to that footprint over time, but much of the environmental impact occurs during production. AI model training follows a similar pattern. The heaviest environmental cost is concentrated in the training phase, built into the model before a single user ever types a prompt.

What it costs to run a single query

To estimate the cost of a single query, SemiAnalysis researchers began with ChatGPT's total daily hardware expense of $694,444. Next, they estimated that the chatbot processes approximately 193 million queries per day. Dividing the daily hardware cost by the number of queries yields a cost of about 0.36 cents per query. By comparison, a Google search query costs approximately 1.06 cents, according to the same analysis.
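The arithmetic behind the estimate is straightforward, using the two figures cited above:

```python
# Reproducing the back-of-the-envelope per-query cost estimate,
# using the figures attributed to SemiAnalysis in the text.
daily_hardware_cost = 694_444      # dollars per day
queries_per_day = 193_000_000      # estimated daily queries

# Dollars per query, converted to cents.
cost_per_query_cents = daily_hardware_cost / queries_per_day * 100
print(f"{cost_per_query_cents:.2f} cents per query")  # 0.36 cents per query
```

Note that this captures only the hardware side of the ledger; both input figures are third-party estimates, so the result is an order-of-magnitude sketch rather than an audited cost.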

OpenAI reports that it is working to improve its algorithms to make AI systems more efficient by using less computing power. The company has said that since 2012, the amount of compute needed to train a neural network to the same level of performance on ImageNet classification has been halving every 16 months. In simple terms, these improvements allow AI systems to perform the same tasks with fewer resources.
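A halving every 16 months compounds quickly. The sketch below applies that rate over an illustrative seven-year span (roughly 2012 to 2019); the span and the resulting factor are derived from the stated rate, not figures quoted in the text.

```python
# Cumulative efficiency gain implied by "compute halves every 16 months".
# The seven-year window is an illustrative assumption.
months = 7 * 12                    # 84 months
halvings = months / 16             # ~5.25 halvings over the window
reduction_factor = 2 ** halvings   # total reduction in compute needed

print(f"~{reduction_factor:.0f}x less compute for the same task")
```

At that rate, the compute needed for a fixed task falls by roughly a factor of 38 over seven years, which is why algorithmic efficiency is often discussed alongside hardware growth when forecasting AI's energy demands.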

Some analysts and researchers note that AI companies, including OpenAI, have not been fully transparent about the costs associated with developing their systems, running deep learning algorithms, and training large language models. This lack of transparency raises concerns, particularly regarding the operations and environmental impact of ChatGPT. Although Altman has pledged greater openness, critics say OpenAI's disclosures remain limited and continue to draw scrutiny.

Limited disclosure also makes it difficult to assess how the industry’s energy demands will evolve. Altman has said that the company is committed to sustainability, though details about how those commitments will be implemented remain unclear. Some analysts suggest the rapid growth of AI could drive increased investment in renewable energy, while others caution that rising demand may place additional strain on existing energy systems. Forecasts vary, but as AI adoption expands, so will the energy required to sustain it.

The remaining question is not just how much energy AI will require, but who will absorb the consequences.