OpenAI spends tens of millions of dollars processing polite phrases like "please" and "thank you" in ChatGPT conversations. According to CEO Sam Altman, that money is "well spent" because it contributes to more natural human-AI interactions. Each polite phrase still requires extra processing, increasing token usage and energy consumption.
The additional cost stems from token-based processing: every word in a conversation is split into tokens, and each token consumes computational resources, so phrases like "please" and "thank you" raise the token count and, with it, electricity consumption and operational expenses. Furthermore, when users are polite, AI models tend to mirror that politeness, generating longer, more detailed responses that increase token usage and energy consumption even further.
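A rough way to see the effect is to count tokens directly. The sketch below uses the tiktoken library with the cl100k_base encoding; the example prompts are illustrative, not a reflection of OpenAI's internal accounting.

```python
# Minimal sketch: compare token counts of a terse prompt vs. a polite one.
# The prompts and encoding choice are assumptions for illustration.
import tiktoken

encoding = tiktoken.get_encoding("cl100k_base")  # encoding used by GPT-4-era models

terse = "Summarize this article."
polite = "Could you please summarize this article for me? Thank you so much!"

for label, prompt in [("terse", terse), ("polite", polite)]:
    tokens = encoding.encode(prompt)
    print(f"{label}: {len(tokens)} tokens")

# The polite version encodes to noticeably more tokens, and every extra
# token has to be processed (and, for API users, billed).
```

Multiplied across a billion conversations, those few extra tokens per message become a meaningful compute bill.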
Every polite interaction still triggers a full computational response, and those responses consume real energy. At over 1 billion queries per day, this adds up to roughly 2.9 million kilowatt-hours of daily energy consumption. Some users consider politeness toward AI essential; others see it as unnecessary overhead.
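As a back-of-the-envelope check on those figures, dividing the reported daily energy use by the reported query volume gives the implied energy per query; the numbers below are simply the ones quoted above, not independent measurements.

```python
# Implied per-query energy from the figures cited in this article.
daily_queries = 1_000_000_000      # reported daily query volume
daily_energy_kwh = 2_900_000       # reported daily energy use, in kWh

per_query_wh = daily_energy_kwh * 1000 / daily_queries
print(f"Energy per query: {per_query_wh:.1f} Wh")   # ~2.9 Wh per query
```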
To save tokens, users can be concise and specific, ask for shorter answers when possible, and start a new conversation once the existing thread has accumulated too much context. Despite the costs, OpenAI's willingness to absorb them highlights how much the company values natural, engaging human-AI experiences.
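For API users, those habits translate into a few concrete settings. The sketch below, written against the OpenAI Python SDK, shows a concise instruction, a specific prompt, and a cap on response length; the model name and token limit are assumptions chosen for illustration.

```python
# Token-saving sketch with the OpenAI Python SDK: concise instruction,
# specific question, capped reply length. Model name and limit are assumed.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o-mini",                     # illustrative model choice
    messages=[
        {"role": "system", "content": "Answer concisely."},
        {"role": "user", "content": "List three ways to reduce token usage."},
    ],
    max_tokens=100,                          # cap the length of the reply
)

print(response.choices[0].message.content)

# Each call with a fresh `messages` list is effectively a new conversation,
# so a long, stale history is not re-sent and re-processed on every turn.
```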