AI search, such as ChatGPT, requires significantly more energy because it involves complex calculations to generate responses rather than simply retrieving links. A single ChatGPT query uses about 10 times more energy than a traditional search engine query.
Data centers consume vast amounts of electricity and water, often sourced from local supplies, which strains resources in areas where water is already scarce. They also generate noise pollution, causing health issues like recurring headaches and sleep disturbances for nearby residents.
AI tools, especially generative AI, require substantial electricity to operate. As their use grows, they compete with other energy demands, such as air conditioning during heatwaves, increasing the risk of blackouts and straining already limited electricity grids.
Tech companies are considering investments in nuclear power, including mini reactors and fusion energy. However, these solutions are not expected to significantly increase output until 2030 or later, while AI's energy demands are projected to double by 2026.
Carbon offsets, which involve funding projects to reduce emissions elsewhere, are not effective enough to counterbalance the rising emissions from AI integration. Google acknowledges that its carbon emissions have increased by 48% since 2019, making it challenging to meet its 2030 carbon neutrality goal.
AI amplifies issues like misinformation by enabling bad actors to generate false content at scale. It also worsens the environmental impact of data centers by increasing their energy and water consumption, further straining local resources and communities.
If Google switched all its search queries to AI-based searches, it would consume an amount of electricity equivalent to Ireland's total electricity consumption, roughly double what Google used in 2020.
Today, the cost of our AI obsession. Though there's talk that AI might hold the answers to the climate crisis, it's also part of the problem. One that's only going to get bigger as the technology advances and its use grows. How every answer from a friendly artificial intelligence assistant is straining the environment. I'm Malika Bilal, and this is The Take.
How much do you play around with AI? Maybe you use it to write essays or have celebrities do music covers of songs they've never sung.
For one reporter who's steeped in the tech industry, the answer is as little as possible. If you're talking about AI the way it's been used in the past couple years, which is generative AI tools like writing tools, tools to create images, I don't really use those beyond just sort of playing around to see what they'll spit out. I'm Sophie Bushwick, senior news editor at New Scientist, and I'm based in New York City.
Sophie covers AI and the harms it can cause, but she says she's unable to fully opt out of using it. Because like most people, I do spend too much time doom scrolling on social media.
So AI is such a broad term that a lot of things fall under that umbrella. So when I'm scrolling through a social media feed, there is an algorithm behind the scenes determining which posts will pop up. It's trying to choose the post that will make me engage the most, that will snag my attention. And that algorithm uses machine learning. So that is technically, in a lot of ways, an AI algorithm.
All of that takes a lot of data. And it's the generative AI, tools to write or create images, that's getting a lot of attention. From making the technology to training it, creating a product like ChatGPT requires feeding it a lot of text and images, so it knows how to respond when we give it a prompt.
So, you know, that trained on a whole internet's worth of writing and data. And that process, not just feeding the data into it, but getting the architecture right, all of that is very energy intensive. It uses up a lot of juice.
And once you have the model, then every time you send a query to it, it's got to do those calculations to spit out its answer. And that also uses electricity. So one estimate suggested that a ChatGPT query would use about 10 times as much energy as typing that query into a search engine that just returns a list of links.
That data also has to be stored somewhere. That's where the cloud comes in. To some of us, it might seem like this amorphous place somewhere far away. We hear a lot about the cloud. Basically, that means the data and calculations don't have to be stored on your personal device, where they can make it overheat and use up your memory. Instead, you connect online to a remote location where the calculations are done and then sent back to your phone.
And if you can't really visualize the cloud, you wouldn't be the only one. The internet has been misunderstood for a long time. And the cloud is just the latest example. The internet is not something that you just dump something on. It's not a big truck. It's a series of tubes.
One politician even assumed the cloud is actually in the sky. Have you ever seen Google software CDs being sold? No? All of it is in the cloud. Until now, no one has studied whether, during rain or during a storm, there will be aberrations in it.
But where the cloud is really based is in remote locations across the world. And those remote locations are data centers. There are thousands of data centers all around the world, but about a third of them are in the U.S. From the outside, a data center might just look like a big building. Maybe you'll see more fans than you might expect, but it looks pretty standard. The thing is, they do generate noise.
Jennifer Garing has lived here for eight years, and she says the noise that comes from the CyrusOne data center on Dobson is beyond annoying. She says it's worst in the early mornings and in the evenings. Sometimes the constant, overbearing sound makes it hard for her family to sleep.
They run both cooling systems and computing systems, and all of these make noise. If they're not insulated well enough, that leaks out into the surroundings. So a lot of data centers are deliberately built in more rural areas where there aren't as many people around. But there are often still people within the radius of influence. And they've reported some very strange health problems.
So things like recurring headaches, long-term difficulty sleeping. One of the theories for what's happening here is that there's just this constant low-level sound happening. And even when you're not consciously aware of it, your body is. And it's producing a stress response to this noise.
And that can be causing all sorts of issues down the road. So there's been some controversy in places where they've said that, you know, these data centers making too much noise. It needs to be limited by the noise ordinance of the area. Or if the area doesn't have one, there's people who are talking about passing laws. And at the same time, the people who have an investment in these data centers don't necessarily want to have these restrictions. So they're also pushing for exemptions from things like noise laws.
But it's not just the noise. AI also takes up resources. The remoteness of most data centers means water is often already scarce. So the cloud only further strains the water source. A lot of the water that is being used to cool data centers is coming from the local water supply. So it's water that would otherwise be going to people.
So there are different sources you can have for water. Some companies that know they use a lot of water, like a company that makes soda, might source their water from places where they're not going to be competing with humans for it.

So they might have a source of water that's less disruptive for the local community. In the case of a data center, though, it's using water that would otherwise be used by people. And often these data centers are in hot, dry areas, places where there aren't a ton of people. Those are places where water is a scarce resource, and they are definitely drawing on it.
Another resource drain is electricity. And the type of electricity a data center uses depends on what power source the community uses. If it's in an area that burns fossil fuels to supply its power, that's going to be its power source. If it's in an area with a lot of renewable energy in the grid, it'll be running on renewable energy. But either way, it's drawing on electricity. And we do have limits. The grid has limits on the amount of demand it's able to support right now.

As temperatures heat up, more folks are cranking up the air conditioning, straining America's electrical grid. A new report from a major energy regulator found two-thirds of North America face an elevated risk of blackouts over the next few months. And now Congress is stepping in, holding a hearing Thursday to determine what's causing the shortfall and how to fix it.
But at the same time, now that AI is also competing, that's another thing pulling electricity from the grid. And it could create conflicts in places where it requires too much energy, and the people who live locally might face restrictions as a result. That's big, because data centers are a big drain on electricity.
Every time someone uses AI, it uses more energy than doing an equivalent task that doesn't have AI baked into it. So, you know, if you're using Google's generative search experience, that's going to be more energy than just using Google's regular old search with links. So I think that as more companies incorporate these into more products, they'll be used more and AI's energy requirements are going to go up. After the break, how tech companies are responding.
When you're part of a military family, you understand sacrifice and support. So at American Public University, we honor your dedication by extending our military tuition savings to your extended family. Parents, spouses, legal partners, siblings, and dependents all qualify for APU's preferred military rate of just $250 per credit hour for undergraduate and master's level programs. American Public University, value for the whole family. Learn more at apu.apus.edu slash military.
Get your news in less than three minutes, three times per day with the Al Jazeera news updates. Just ask your home device to play the news by Al Jazeera or subscribe wherever you listen to podcasts. Tech companies know that there's a looming energy problem on the horizon. And journalist Sophie Bushwick says those companies have been feeling the pressure to find a solution.
So big tech companies are aware of the increasing electricity needs of these tools. One thing some of them have suggested is investing in nuclear power. And you see that from tech moguls who are interested in startups to make mini reactors, or who are really pushing for fusion energy.
But the problem is, nuclear power isn't really going to be increasing its output until maybe 2030. And one expert said they thought its contribution was still going to be pretty minor until 2040. And we've already heard that data centers are going to be doubling their electricity demand by 2026.
So we don't have time to wait for nuclear to come swoop in and save all of our grids because it's just not going to be able to ramp up production in time to meet these needs. Another response has been the claim that AI will be part of environmental solutions and not just part of the problem.
But the problem is, AI tools are also being used to help the fossil fuel industry, like finding and harvesting more fossil fuels, which can then be burned. And because of its immediate demands now, plants that might have been slated for retirement are actually being kept running to meet this demand, and in some cases maybe even ramping up their energy production and burning more fossil fuels. So I think the argument that AI will ultimately solve everything is also not that realistic.
If that prediction does come true, it won't be for a while yet. Because even today, we have things we know will help against climate change. The issue is not in finding new science. The issue is in implementing policies that will encourage people to take these scientifically backed steps. As for the solutions that aren't so far off down the road, those seem to be hitting roadblocks.
Google acknowledges that reaching its zero-emissions goal by 2030 will be challenging. It admits that total emissions are going to rise before they drop, and that key solutions don't currently exist and will depend heavily on the broader clean energy transition. Google released an 86-page report stating that its carbon emissions have risen by 48% since 2019.
It says as Google further integrates AI into its products, quote, reducing emissions will be challenging.
So a lot of this has to do with increasing evidence about a practice called carbon offsets. One popular way a company can reduce its carbon footprint is, instead of making its own operations more efficient, to pay someone else; for example, it can pay places that have projects that pull carbon dioxide out of the air.
Google is basically saying, look, we're acknowledging that carbon offsets are not sufficient; carbon offsets are not actually working for us. And so we're going to have to make some changes in order to reach that carbon-neutral goal, where carbon neutral means that for the amount of emissions you're putting out, you're also reducing the greenhouse gases in the atmosphere by roughly the same amount. So they've now said they're aiming for carbon neutrality by 2030. We'll see how that works out, and whether they reach that goal, as the use of AI tools keeps increasing.
One researcher did a kind of thought experiment and made some estimates about how much energy Google would require if it switched all of its search queries to AI searches. And it turns out that would end up using essentially an Ireland's worth of electricity. The entire electricity consumption of Ireland would be going into it. And that's about double what Google required in terms of energy back in 2020.
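To see how a thought experiment like this works, here is a minimal back-of-envelope sketch. The 10× multiplier comes from the episode; the query volume and per-search energy figures are illustrative assumptions, not numbers from the researcher's study, and published estimates vary widely depending on what hardware and usage patterns are assumed.

```python
# Back-of-envelope sketch of "what if every Google search were an AI search?"
# All starting numbers below are assumptions for illustration only.

SEARCHES_PER_DAY = 9e9       # assumed: rough global daily Google query volume
WH_PER_PLAIN_SEARCH = 0.3    # assumed: commonly cited watt-hours per search
AI_MULTIPLIER = 10           # from the episode: AI query ~10x a plain search

wh_per_ai_query = WH_PER_PLAIN_SEARCH * AI_MULTIPLIER   # 3.0 Wh per AI query
daily_wh = SEARCHES_PER_DAY * wh_per_ai_query           # total Wh per day
annual_twh = daily_wh * 365 / 1e12                      # Wh/year -> TWh/year

print(f"~{annual_twh:.1f} TWh per year if every search were AI-based")
```

Note that this simple per-query method lands lower than the hardware-capacity estimate behind the Ireland comparison (Ireland uses on the order of 30 TWh of electricity a year), which is itself a reminder of how sensitive these projections are to the assumptions you feed in.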
Of course, this is just, you know, theoretical. Right now, Google is not entirely switching to AI. Sophie says environmental problems in the tech industry didn't start when AI did. And it won't end even if we all stop using artificial intelligence. AI, it's not that it creates problems so much as that it highlights existing problems.
And it shows how AI has the potential to make them worse. One example of this is the problem of misinformation and disinformation online. We know there are bad actors who, when there's a breaking news event, rush to put out rumors, made-up stories, and screenshots from video games that they claim are real images of on-the-ground events.
And AI didn't cause that problem. But someone who is a bad actor can use AI to churn out this type of bad information at a really high volume with great ease more quickly than they could before.
AI can make an existing problem worse. The same thing goes for data centers. Data centers were already a big drain on electricity and water, and they were already, you know, causing some issues for the communities where they're located before they started adding AI to the mix. But the issue is that AI could make these problems worse because it requires more energy to make those calculations.
I don't think you should feel guilty if you like to use ChatGPT or one of those chat apps. I don't think guilt is super helpful. But again, it comes back to the issue of how AI is being used, how you are using it. I've heard about people using ChatGPT to generate messages they can send to their local representative.
So if there's a political cause you feel strongly about, like reducing the impact of climate change, an AI tool that helps you push your representative to take action could end up reducing the overall amount of carbon in the end, even if you are using carbon to get it out there. So I do think it depends how you're using it.
And that's The Take.
This episode was produced by Chloe K. Lee, with Manahil Naveed, Mohamed Zain Shafiqan, Doha Mossad, and me, Malika Bilal. It was edited by Alexandra Locke. Our sound designer is Alex Roldan. Alexandra Locke is the Take's executive producer, and Ney Alvarez is Al Jazeera's head of audio. We'll be back tomorrow.