Tom's Guide

    ChatGPT energy emergency — here's how much electricity OpenAI and others are sucking up per week

By Scott Younker, 15 days ago


A pair of reports this week highlight the growing environmental costs of using generative AI and training LLMs. The data is striking: a single GPT-4 query can use up to three bottles of water, a year of queries uses enough electricity to power more than nine houses, and the sustainability problem could get 10 times worse by 2030.

Much of the cost stems from the growing demand for data centers to power AI models. Once those centers are built or already exist, a new set of issues crops up around their internal and external effects.

This week, data from the University of California, shared with the Washington Post and spotted by our friends at Tom's Hardware, showed that AI data centers consume very large amounts of water to cool their servers.

The paper does note that water usage depends on the state and its proximity to water: lower water usage tends to go hand in hand with cheaper electricity and higher electricity use. Texas, for example, uses an estimated 235 milliliters, about one cup, to generate a 100-word email, while the state of Washington requires 1,408 milliliters for the same email, the equivalent of three 16 oz water bottles.
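Those unit conversions are easy to sanity-check. The short Python sketch below reproduces the cup and bottle equivalents from the per-email milliliter figures above; the milliliter values are the article's, and the cup (236.6 ml) and 16 oz bottle (473.2 ml) sizes are standard US measures.

# Convert the per-email water figures quoted above into cups and 16-oz bottles.
ML_PER_CUP = 236.6          # US customary cup, in milliliters
ML_PER_16OZ_BOTTLE = 473.2  # 16 fl oz bottle, in milliliters

per_email_ml = {"Texas": 235, "Washington": 1408}  # article's per-email figures

for state, ml in per_email_ml.items():
    print(f"{state}: {ml} ml ~ {ml / ML_PER_CUP:.1f} cups "
          f"~ {ml / ML_PER_16OZ_BOTTLE:.1f} 16-oz bottles")
# Texas: 235 ml ~ 1.0 cups ~ 0.5 16-oz bottles
# Washington: 1408 ml ~ 6.0 cups ~ 3.0 16-oz bottles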

That may not sound like much, but ChatGPT isn't being used just once for a single 100-word email. The OpenAI-built chatbot also takes a lot of power to run, and these figures cover plain text only; the report does not appear to go into detail about image or video generation.


As an example from an EPRI report, a regular Google search uses around 0.3 Wh of electricity per query, while ChatGPT requires about 2.9 Wh, nearly ten times as much.

That means if one out of 10 working Americans uses GPT-4 once a week for a year (52 queries each from 17 million people), the corresponding power demand of 121,517 MWh would equal the electricity consumed by every single household in Washington, D.C. (home to an estimated 671,803 people) for twenty days. On top of that, 105 million gallons of water would be used just to keep the servers cool.
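For scale, the water side of that claim can be roughly cross-checked in a few lines of Python using only figures quoted in the article; the per-query value it produces is an inference for illustration, not a number from the report.

# Rough cross-check of the aggregate cooling-water figure, using the
# article's own numbers (17 million people, 52 queries each, 105 million gallons).
ML_PER_US_GALLON = 3785.41

people = 17_000_000                  # one in ten working Americans
queries_per_person = 52              # once a week for a year
total_queries = people * queries_per_person        # 884,000,000 queries

total_water_gallons = 105_000_000    # article's total cooling-water figure
implied_ml_per_query = total_water_gallons * ML_PER_US_GALLON / total_queries

print(f"Total queries: {total_queries:,}")
print(f"Implied water per query: ~{implied_ml_per_query:.0f} ml")
# ~450 ml per query, which falls between the Texas (235 ml) and
# Washington state (1,408 ml) per-email estimates quoted earlier.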

    Data center consumption


    (Image credit: Getty Images)

The second major report to come out earlier this week reinforces the issue. Bloomberg reported that the drive to insert AI into every conceivable facet of the tech world has set off a boom among utility companies across the United States in building new gas-fired power plants to meet surging demand for electricity.

Much of the demand appears to come from three sources: AI data centers, manufacturing facilities and electric vehicles. The article doesn't break down how much demand each of those three sources is actually putting on the grid, but data center demand is expected to grow by up to 10x current levels by 2030, which would amount to 9% of the United States' total electricity generation, according to the Electric Power Research Institute.
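To put that 9% share in perspective, here is a rough sketch; the annual US generation total (about 4,200 TWh) is an outside approximation of recent EIA figures, not a number from the article, so the result is only an order-of-magnitude estimate.

# Rough scale of EPRI's 2030 projection quoted above.
# Assumption (not from the article): total US generation of ~4,200 TWh/year,
# roughly in line with recent annual totals.
US_ANNUAL_GENERATION_TWH = 4200   # assumed recent annual US generation
DATA_CENTER_SHARE_2030 = 0.09     # EPRI projection cited in the article

data_center_twh = US_ANNUAL_GENERATION_TWH * DATA_CENTER_SHARE_2030
print(f"Data centers in 2030: ~{data_center_twh:.0f} TWh/year")  # ~378 TWh/year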

Utilities across the country have announced plans for new gas plants to meet that demand, nearly double the number previously planned, with a majority coming to Texas.

“We were poised to shift away from the energy system of the past, from costly and polluting infrastructure like coal and gas plants. But now we’re going in the opposite direction,” Kendl Kobbervig, advocacy director at Clean Virginia, told Bloomberg. “A lot of people are feeling whiplash.”

As Bloomberg notes, with the recent fracking resurgence, natural gas has helped reduce the environmental impact of coal as the country moves away from that fuel source. However, gas plants tend to leak methane, which has "80 times the planet-warming impact of carbon dioxide in its first 20 years in the atmosphere."

Additionally, once a gas plant is built it doesn't go away; it is likely to run for a minimum of 40 years. That also means utilities that had committed to reducing carbon emissions have had to scale back their plans. In one example from Bloomberg, PacifiCorp had expected to reduce emissions by 78% by 2030; with the announcement of new plants, it has revised that estimate down to 63%.

    Tom's Guide AI expert Ryan Morrison said of the new reports: "While it is true that there is a significant environmental impact from AI, largely driven by the massive compute requirements, it will become less of a problem as models get more efficient and custom chips take over from power-hungry GPUs.

    "AI may well also be the solution to its own energy problem, as well as the wider energy problems facing society. As the technology improves it will be used to design more efficient compute and cooling systems that minimise the impact on the environment. The designing the path to net zero may require AI."

    The real cost of AI


    (Image credit: Shutterstock)

AI is power-hungry, and these new data centers aren't just affecting the environment and the climate. The Los Angeles Times reported in August that data centers consume 60% of Santa Clara's electricity. That appetite raises the risk of blackouts due to lack of power, and it is also driving up the water and power bills of people who live in those communities. Companies like PG&E say that customers' bills won't rise, but it's clear that they're passing infrastructure costs on to customers while data centers aren't paying their fair share.

It seems to fly in the face of claims by major AI companies like OpenAI, Google, Meta and Microsoft that they are committed to reducing their environmental impact. A Microsoft rep told the Post that the company is "working toward data center cooling methods that will eliminate water consumption completely."

AI has become the pie in the sky for many of these companies, and it's taking precedence over all else, including the impact on their communities and the environment.
