Artificial Intelligence has recently become something of an environmental dilemma. Its rise in popularity has led some to question whether their A.I. use is harming the climate.
Sophomore Lisa Yeley said that while she uses A.I., she is aware of its water consumption and tries to limit excess searches. Junior Annika Smith said she had never heard about A.I.’s environmental effects, and that she continues to use it to explain confusing topics.
A.I. programs, including large language models (LLMs) like ChatGPT, do not just exist virtually. They rely on data centers that house the physical hardware running their applications and algorithms. Data centers constantly circulate cool water through the building to keep the servers from overheating. This can strain ecosystems, especially in locations where water is scarce.
Amanda Figolah, AP Environmental Science teacher at Bloomington South, said the effects of data centers on the environment are “a gigantic setback for any progress being made on energy conservation or offsetting carbon emissions with renewable energy.”
One of A.I.’s most resource-intensive functions is training LLMs. The World Economic Forum states that training OpenAI’s GPT-3 consumed just under 1,300 megawatt-hours of electricity, roughly the annual power consumption of 130 US households. Each new generation of a model increases the energy demand. OpenAI recently released ChatGPT’s fifth version.
This extreme energy use puts stress on the electrical grid, which is the network of power plants, transmission lines, and distribution centers that brings electricity to homes and businesses.
In 2023, data centers accounted for 4.4% of total US electricity consumption, according to the 2024 United States Data Center Energy Usage Report. The report estimates that by 2028, this share could reach as much as 12%, an increase driven largely by A.I. infrastructure.
With this spike in energy use, electricity grids will increasingly rely on power plants that burn fossil fuels. Proposed transitions to wind or solar power are pushed further into the future, which has detrimental implications for the global climate crisis. Even in Indiana, energy utilities have pushed back dates to begin phasing out coal, according to Citizens Action Coalition, an Indiana consumer advocacy group.
Citizens Action Coalition found that large tech companies like Google have made plans to open hyperscale data centers in Indiana, which are even larger than traditional data centers. It’s unlikely that our dependence on data centers will change anytime soon.
However, after training is complete and an LLM is in use, the amount of energy needed to power searches drops. According to MIT News, an individual search in ChatGPT uses five times more energy than a search in Google, but this difference pales in comparison to the energy consumption of other forms of A.I., like video generation.
Essentially, for an individual user, searching with a text-based A.I. chatbot does not have an environmental effect comparable to that of other A.I. functions like the initial training or algorithm processes.
Therefore, when faced with the complicated question of how to interact with A.I. in an environmentally responsible way, it is important to focus on the bigger issues while also being conscious of your own choices.
Kirstin Milks, AP Biology teacher at South, offers a way for students to continue using A.I., while understanding its impacts. She said, “I think the really important thing is that we have to think about using A.I. leanly. So when we use lean A.I., we are very specifically prompting for what we want.” In other words, asking A.I. one detailed prompt instead of many vague searches gives you the result you are looking for without consuming as much energy in the process.
Figolah broadened the perspective beyond individual actions. She said, “young people often have the answers that older people either fail to see or don’t have the will to implement.”