
Have you ever wondered how much electricity is being used when you chat with an AI?
Hugging Face engineer Julien Delavande has created a tool to answer exactly that question.
The Hidden Costs Behind AI
Every AI model consumes a notable amount of electricity when it runs. These systems, built on GPUs and specialized chips, draw considerable power, especially at scale. The growing popularity of AI technologies is expected to sharply increase global electricity demand in the coming years.
Let’s not forget the environment
Many companies are turning to environmentally harmful energy sources to power AI. Delavande’s tool aims to draw attention to this issue and encourage users to make informed choices.
How does the tool work?
Delavande’s tool runs on open-source platforms like Chat UI and calculates, in real time, the energy consumption of each message sent to models such as Meta’s Llama 3.3 70B or Google’s Gemma 3. The results are shown in watt-hours or joules, and they can even be compared to household appliances:
For example, asking Llama 3.3 70B to write a simple email uses about 0.1841 watt-hours, which is roughly the same as running a microwave for 0.12 seconds!
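The unit conversion behind comparisons like this is straightforward. A minimal sketch (not Delavande’s actual code; the appliance wattage below is an illustrative assumption, since the article does not state one):

```python
def wh_to_joules(wh: float) -> float:
    """Convert watt-hours to joules (1 Wh = 3600 J)."""
    return wh * 3600.0

def equivalent_runtime_seconds(wh: float, appliance_watts: float) -> float:
    """How long an appliance of the given power rating could run
    on the same amount of energy (energy / power = time)."""
    return wh_to_joules(wh) / appliance_watts

# The article's example: a simple email prompt to Llama 3.3 70B.
email_wh = 0.1841
print(f"{email_wh} Wh = {wh_to_joules(email_wh):.2f} J")

# Assumed 1000 W microwave rating, purely for illustration.
print(f"Equivalent microwave runtime: "
      f"{equivalent_runtime_seconds(email_wh, 1000.0):.2f} s")
```

Note that the equivalent runtime depends entirely on the wattage you assume for the appliance, which is one reason such comparisons are only rough intuition aids.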
Accuracy? These are just estimates
Delavande emphasizes that these figures are estimates. Still, the tool reminds us of one truth: every chatbot response has a price.
The future is in transparency
With projects like the “AI Energy Map,” the open-source community is striving for transparency. “Perhaps one day, energy labels will become mandatory for every AI model,” Delavande and his colleagues say.