Saying “please” and “thank you” may seem like simple courtesies, but when it comes to interacting with AI models like ChatGPT, these polite phrases carry unexpected costs. OpenAI’s CEO, Sam Altman, recently revealed that users’ polite language adds “tens of millions” of dollars to the company’s energy expenses. This insight highlights not only the hidden energy demands of AI technologies but also the evolving relationship between humans and machines. Pleasant as these exchanges may be, few users stop to consider the energy they consume.
A closer look at how much energy these conversations consume makes clear that online interactions are more resource-intensive than many realize. The electricity needed to run the servers behind these models, along with the cooling systems that keep them from overheating, can make even a short, polite query environmentally costly. Tech companies have approached politeness in different ways: Google’s Assistant once encouraged users, especially children, to mind their manners through a “Pretty Please” feature. While that effort was meant to foster courtesy, OpenAI’s figures reveal a more complicated reality in which encouraging politeness carries a substantial resource cost.
This raises a critical question: Is the processing power spent on understanding polite requests justified by the reinforcement of good communication habits? Or should users prioritize efficiency and adopt a more direct approach when interacting with AI? As discussions around AI sustainability grow, the way we engage with these digital tools will likely evolve, prompting deeper reflection on our everyday interactions in the digital realm.