Your AI Videos Use Way More Energy Than Chatbots. It’s a Big Problem

Generative AI requires a lot of energy. But even the enormous amount of energy needed to train and run large language models pales in comparison to what it takes to run video models like OpenAI's viral Sora app, which is flooding our social networks with silly fake clips.

Generative AI models in general require a lot of power. The servers handling your ChatGPT request run a compute-intensive process that consumes a great deal of electricity. One report found that AI is the "biggest driver" of growing electricity consumption in North America. And that may show up on your electricity bill: as AI data centers spring up across the United States, electricity bills for nearby households are rising. By some estimates, a single AI query uses 10 times more energy than a simple Google search.

While major AI companies remain reluctant to detail exactly how much energy it takes to train and run their models, a growing field of research is searching for answers. Sasha Luccioni, AI and climate lead at Hugging Face, one of the most popular AI platforms and research hubs, is a leading researcher on the energy needs of artificial intelligence. In a new study, Luccioni and her team examined several open-source AI video models. (Popular video tools such as Sora and Google's Veo 3 were not included in the study because they are not open source.)

The team used the open-source Hugging Face codebase to generate AI videos with different models. They measured the electricity required to create these clips as they varied factors including video length, resolution, and quality (the latter achieved through a process called denoising). They ran the tests on an Nvidia H100 SXM GPU, a powerful chip commonly used in AI data centers.

"Creating video is definitely a more compute-intensive task: instead of words, you're generating pixels, and multiple frames per second so the video flows smoothly," Luccioni said in an email. "It's complex."

Take an AI video that is 10 seconds long and runs at 24 frames per second. "That's 240 images the AI has to generate," Luccioni explains. That's especially true for high-definition content: "It really drives up the computation and energy," she said.
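The frame-count arithmetic Luccioni describes can be sketched in a couple of lines, using the duration and frame rate from the example above:

```python
# Number of still images a video model must generate for one clip.
duration_s = 10          # video length in seconds (article's example)
fps = 24                 # frames per second
frames = duration_s * fps
print(frames)            # 240 images for a 10-second clip
```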

AI video energy use

The study found that generating video is about 30 times more energy-intensive than generating images and roughly 2,000 times more energy-intensive than generating text. Generating a single video with AI takes about 90 Wh, compared with the 2.9 Wh needed to generate an image and 0.047 Wh to generate text.

To put those numbers in context, a typical energy-efficient LED bulb uses between 8 and 10 watts. LCD TVs draw between 50 and 200 W, with newer technologies such as OLED displays helping to manage power more efficiently. For example, the 65-inch Samsung S95F, CNET's pick for best picture quality of 2025, typically consumes 146 watts, according to Samsung. Generating one AI video is therefore equivalent to watching that TV for about 37 minutes.
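The comparisons above follow directly from the figures cited in the study; a minimal sketch, using only the per-task energy numbers and the 146 W TV figure from this article:

```python
# Back-of-the-envelope comparison using the per-task figures cited above.
energy_wh = {
    "text": 0.047,   # Wh per AI text generation
    "image": 2.9,    # Wh per AI image generation
    "video": 90.0,   # Wh per AI video generation
}

video = energy_wh["video"]
print(f"video vs image: {video / energy_wh['image']:.0f}x")  # ~31x
print(f"video vs text:  {video / energy_wh['text']:.0f}x")   # ~1915x

# Equivalent viewing time on the 146 W Samsung S95F mentioned above.
tv_watts = 146
minutes = video / tv_watts * 60
print(f"one AI video ≈ {minutes:.0f} minutes of TV")         # ~37 minutes
```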

The energy demands of generative AI, especially for video, are significant. This sets the stage for a huge problem as AI becomes more widely used.


AI's growing energy needs

Generative video is having something of a breakthrough moment, largely thanks to Google and ChatGPT maker OpenAI. Veo 3 and Sora, the companies' respective AI video models, were released to great fanfare and have since gone viral. The Sora app passed a million downloads five days after launch, and Google reported that Gemini users had made over 40 million videos in the first few months after debut.

As the use of artificial intelligence grows, the US electric grid may not be ready to meet future demand, which is why AI companies and the US government are pushing for billions of dollars of investment in AI infrastructure. Nvidia recently announced it is investing $100 billion in OpenAI to build AI data centers running 10 gigawatts of Nvidia systems over the next few years. Microsoft and Constellation Energy are looking to restart Three Mile Island, the site of the worst nuclear power plant accident in US history, to power AI ambitions. But there are also ways to reduce AI's power consumption, including using more efficient AI infrastructure.

Individually, we can think critically about whether we need to use an AI tool at all. According to Luccioni, you don't always need (or perhaps even want) an AI-generated summary every time you search for something, and using alternative browsers can help avoid it. But part of the problem is that AI companies do not disclose the specific energy needs of their products.

"AI companies need to be transparent about their environmental impact... It's unacceptable that we don't have accurate numbers for the tools we use every day," Luccioni said. "As users, we need the information to make sustainable choices, and companies have a responsibility to provide it."
