In a surprising revelation, UNESCO has found that simply asking shorter questions and using smaller AI models can cut artificial intelligence energy use by up to 90% without affecting the quality of answers. The finding comes as concerns mount over AI's massive power consumption worldwide.
The findings were revealed at the AI for Good global summit in Geneva. According to the study, massive AI tools like ChatGPT, Gemini, and Copilot consume enormous amounts of electricity, with ChatGPT alone using as much energy in a year as three million people in Ethiopia.
UNESCO highlighted how shorter prompts and smaller, more focused models reduce the need for heavy computation. In one example, trimming a 300-word prompt to 150 words and pairing it with a domain-specific AI model cut energy use by 90%, with no drop in performance.
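To see why the two changes compound, the rough sketch below treats energy per query as proportional to model size times the number of prompt tokens processed, a common simplification for transformer inference. The specific model sizes, the words-to-tokens factor, and the proportionality itself are illustrative assumptions, not figures from the UNESCO study.

```python
# Back-of-the-envelope sketch: how a shorter prompt and a smaller model
# combine to reduce per-query energy. The proportionality
# (energy ~ parameters x tokens) and all numbers are illustrative
# assumptions, not data from the UNESCO report.

def relative_energy(params_billion: float, prompt_words: int) -> float:
    """Return a unitless energy score proportional to compute per query."""
    tokens = prompt_words * 1.3        # rough words-to-tokens conversion
    return params_billion * tokens     # forward-pass cost ~ params x tokens

# Baseline: a large general-purpose model with a verbose 300-word prompt.
baseline = relative_energy(params_billion=500, prompt_words=300)

# Optimised: a smaller domain-specific model with the prompt trimmed to 150 words.
optimised = relative_energy(params_billion=100, prompt_words=150)

savings = 1 - optimised / baseline
print(f"Estimated energy savings: {savings:.0%}")   # 90% under these assumptions
```

Under these made-up numbers, a 5x smaller model and a half-length prompt multiply out to a 90% reduction, which illustrates how the savings reported at the summit could arise from two modest changes rather than one dramatic one.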
Tech giants are already adapting. OpenAI, Google, Microsoft, and others have launched mini versions of their AI models to meet rising demand while keeping energy use in check. UNESCO is urging governments, companies, and users to adopt smarter, greener AI practices moving forward.