In this new age of artificial intelligence, large language models are more integrated into the daily lives of humans than ever before. The groundbreaking technology is a quick query away from answering any curious question or completing any written task. As of 2025, ChatGPT, one of the major large language models, receives over 1 billion queries per day and averages 123.5 million daily active users. The popularity of artificial intelligence and chatbots arises from their ability to produce helpful information and complex written content catered specifically to the user's request. However, it is easy to forget the cost of new innovations and products. Although LLMs are an exciting innovation that has vastly transformed the landscape of information technology, the environmental impacts of AI are often ignored and swept under the rug. How does the generative AI industry impact the natural environment, and what does this mean for climate change?

NATHAN VILLASENOR / DAILY NEXUS
The revolution of generative AI emerged in early 2023 with the launch of ChatGPT and Midjourney, as advanced LLMs were trained on immensely large datasets in order to produce human-like responses and images. LLMs are a form of generative AI because they can create new combinations of text based on the examples, information and patterns they were trained on. Given the predictive abilities of generative AI, models like ChatGPT require training on vast amounts of data, easily encompassing billions of words sourced from the internet, books, articles, specialized datasets and social media. To identify patterns and relationships in the data, parameters are tuned to produce the most accurate and relevant results.
According to the Columbia Climate School, each successive generation of an LLM accumulates more parameters for higher-quality responses. In 2019, GPT-2 had 1.5 billion parameters, and the next version, GPT-3, had 175 billion parameters. To process this great amount of data, LLMs require tens of thousands of high-performance chips for training, making predictions and responding to queries. Graphics processing units (GPUs) are typically used because of their ability to carry out many calculations in parallel, but they also consume more energy than any other chip type. AI computation usually takes place in the cloud, which consists of servers and databases that store and process data and is accessible through the internet via remote data centers. These data centers store all of the data used to train LLMs, and they also provide the platform on which trained AI models can be deployed.
But the functionality of AI carries environmental costs that are overlooked amid the excitement over innovative technology. According to MIT News, training the billions of parameters these LLMs require involves a great deal of electricity, which contributes to increasing carbon emissions and pressure on electric grids. Especially with wide-scale usage and the constant drive to fine-tune models, large amounts of energy are consumed for the sake of generative AI.
In addition to the electricity demands of training, data centers are one of the major contributors to AI's environmental impact. Data centers are temperature-controlled buildings that house the computing infrastructure supporting cloud computing services. Since generative AI has been on the rise, more data centers are being constructed and more energy is being demanded from each center. The electricity consumption of data centers has risen so high that, if the industry were a country, it would have been the 11th largest electricity consumer in the world as of 2022, between Saudi Arabia and France, according to the Organisation for Economic Co-operation and Development. To keep pace with data center construction, fossil-fuel power plants are often used to supply the electricity demanded by AI. Another issue is generative AI's fluctuating energy use during phases of the training process. In order for power grid operators to absorb these fluctuations and protect the electric grid, diesel-based generators are commonly used to mitigate damage.
Beyond training the AI model, there are still environmental consequences linked to the predictions it generates, known as inferences. A ChatGPT query alone consumes about five times more electricity than a simple web search. Unaware of the environmental costs of generative AI, users have little incentive to decrease their query activity. As generative AI becomes more integrated into daily applications and software, inferences will come with a higher electrical cost, especially as models become more advanced and larger in scale.
Beyond the electrical demands of generative AI, water consumption is another critical environmental issue. Water is used to cool data centers by absorbing heat from the computing equipment. The great deal of water this requires places an increasing strain on municipal water supplies and local ecosystems. The hardware itself also carries indirect environmental costs from the manufacturing of high-performance computing equipment and the transportation involved.
Despite the fact that generative AI has detrimental impacts on the environment, there are also areas where it can help combat climate change. In a study done at UC Irvine, AI systems were found to emit between 130 and 1,500 times less CO2e per page of text than human writers. The study generally concludes that the use of AI has the potential to carry out various activities with a lower carbon footprint than human beings. In addition to its comparatively low carbon emissions, AI-driven technology can be harnessed to improve data modeling and the prediction of climate change patterns, which are pivotal for drafting effective adaptation and mitigation strategies. Although generative AI's carbon footprint can be smaller than that of traditional human methods and can be leveraged for climate change initiatives, the industry remains largely unconscious of the environmental impacts that still exist.
Although the generative AI industry is not on a sustainable path, there is still an opportunity to develop the technology in a way that supports the environment. Transforming generative AI into a sustainable system will require an interdisciplinary approach to addressing AI's costs to society.
So how can generative AI be more environmentally conscious? To address sustainability concerns, there needs to be accurate measurement of how much electricity these computers consume and how that consumption translates into carbon emissions. Transparency and data standardization are critical for identifying and comparing systems to find the best solutions. Another initiative is to switch to renewable and carbon-free energy. Google data centers already source 100% of their energy from renewable sources, and Microsoft foresees all cloud data centers, regardless of ownership, achieving 100% carbon-free energy use by 2030. Other solutions include improving the management of computers, designing more efficient hardware, finding alternative cooling methods and adjusting models and algorithms. To galvanize the movement toward sustainable AI, governments are key to implementing regulations and incentives on carbon emissions transparency and sustainability. While the AI industry is not yet sustainable and carries a heavy carbon footprint, conscious leadership and management can reconfigure AI-driven technologies into a long-term solution for climate change.
A version of this article appeared on p.11 of the Jan. 30, 2025 edition of the Daily Nexus.