“As an artificial intelligence (AI) researcher, I often worry about the energy costs of building AI models,” Kate Saenko, associate professor of computer science at Boston University, wrote in a post at The Conversation. “The more powerful the AI, the more energy it takes.”
While the energy consumption of blockchain technologies like Bitcoin (BTC) and Ethereum (ETH) has been studied and debated everywhere from Twitter to the halls of Congress, the effect of AI's rapid development on the planet has not received the same spotlight.
Professor Saenko intends to change that, but acknowledged in the post that there is limited data on the carbon footprint of a single generative AI query. She noted, however, that research puts the figure at four to five times higher than that of a simple search engine query.
Citing a 2019 report, Saenko said that a generative AI model known as BERT (Bidirectional Encoder Representations from Transformers), with 110 million parameters, consumed as much energy as a round-trip transcontinental flight for one person when trained using graphics processing units (GPUs).
In AI models, parameters are variables learned from training data that guide the model's predictions. More parameters generally mean a more complex model, which in turn requires more data and computing power; the parameters are adjusted during training to minimize error.
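That training loop can be seen in miniature with a toy example (not from Saenko's post): a single-parameter model fit by gradient descent, where the one parameter is repeatedly nudged to shrink the prediction error.

```python
# Toy illustration: a one-parameter "model" y = w * x, trained by
# gradient descent to minimize squared error on a tiny dataset.
# The underlying relationship is y = 2x, so w should approach 2.
data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]

w = 0.0    # the single learnable parameter
lr = 0.01  # learning rate (step size for each adjustment)

for _ in range(1000):
    # Gradient of mean squared error with respect to w
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    w -= lr * grad  # adjust the parameter to reduce the error

print(round(w, 3))  # converges toward 2.0
```

Real generative models do the same thing with billions of parameters instead of one, which is where the data and compute demands come from.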
By comparison, Saenko noted, OpenAI's GPT-3 model, with 175 billion parameters, consumed the energy equivalent of 123 gasoline-powered passenger vehicles driven for one year, or about 1,287 megawatt-hours of electricity, and generated 552 tons of carbon dioxide. She also noted that this figure covers only training the model, before any consumers started using it.
“If chatbots become as popular as search engines, the energy costs of deploying the AIs could really add up,” Saenko said, citing Microsoft's addition of ChatGPT to its Bing search engine earlier this month.
Not helping matters is that AI chatbots like Perplexity AI and OpenAI's wildly popular ChatGPT are increasingly releasing mobile apps, making them even easier to use and exposing them to a much broader audience.
Saenko highlighted a Google study that found using a more efficient model architecture and processor, along with a greener data center, can considerably reduce a model's carbon footprint.
“While a single large AI model is not going to ruin the environment,” Saenko wrote, “if a thousand corporations develop slightly different AI bots for different purposes, each used by millions of customers, the energy use could become an issue.”
Ultimately, Saenko concluded that more research is needed to make generative AI more efficient, but she's optimistic.
“The good news is that AI can run on renewable energy,” she wrote. “By bringing the computation to where green energy is more abundant, or scheduling computation for times of day when renewable energy is more available, emissions can be reduced by a factor of 30 to 40, compared to using a grid dominated by fossil fuels.”
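The scheduling idea Saenko describes can be sketched in a few lines. The hourly carbon-intensity figures below are invented for illustration (a real scheduler would pull a grid forecast); the sketch simply picks the contiguous window where the grid mix is greenest.

```python
# Hypothetical hourly grid carbon intensity (grams CO2 per kWh) for one day.
# Midday values are lower here to mimic abundant solar generation.
forecast = {hour: 450 for hour in range(24)}
forecast.update({10: 120, 11: 95, 12: 80, 13: 85, 14: 110})

def best_window(forecast, hours_needed):
    """Return the start hour of the contiguous window with the
    lowest total carbon intensity for a job of the given length."""
    starts = range(24 - hours_needed + 1)
    return min(starts, key=lambda s: sum(forecast[s + h] for h in range(hours_needed)))

start = best_window(forecast, 3)
print(start)  # prints 11: the greenest 3-hour training window starts at 11:00
```

Deferring a three-hour training job to that window, rather than running it immediately on a fossil-heavy evening mix, is exactly the kind of carbon-aware scheduling the quote refers to.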
Interested in learning more about AI? Check out our latest Decrypt U course, “Getting Started with AI.” It covers everything from the history of Artificial Intelligence (AI) to machine learning, ChatGPT, and ChainGPT. Find out more here.