Being polite to your AI assistant could cost millions of dollars.

OpenAI CEO Sam Altman revealed that showing good manners to a ChatGPT model — such as saying “please” and “thank you” — adds up to millions of dollars in operational expenses.

Altman responded to a user on X (formerly Twitter) who asked how much the company has lost in electricity costs from people being polite to their models.

“Tens of millions of dollars well spent — you never know,” the CEO wrote.

Sounds like someone saw what the HAL 9000 computer did in “2001: A Space Odyssey” and is going to be nice to their AI assistant just in case. Experts have also found that being polite to a chatbot makes the AI more likely to respond to you in kind.

Judging from Altman’s cheeky tone, that “tens of millions” figure likely isn’t a precise number. But any message to ChatGPT, no matter how trivial or inane, requires the AI to generate a full response in real time, relying on high-powered computing systems and increasing the computational load, thereby using massive amounts of electricity.

AI models rely heavily on energy-hungry global data centers, which already account for about 2% of global electricity consumption. According to Goldman Sachs (GS), each GPT-4-powered ChatGPT query uses about 10 times more electricity than a standard Google (GOOGL) search.
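For a rough sense of scale, here is a back-of-envelope sketch, not taken from the article, of how routine courtesy messages could plausibly reach that order of magnitude. Every number in it is an illustrative assumption; the only figure echoed from the reporting above is the roughly 10x-a-Google-search energy ratio.

```python
# Back-of-envelope sketch: how "please" and "thank you" messages could add up
# to millions of dollars a year in electricity. All inputs below are
# illustrative assumptions, not reported figures.

QUERIES_PER_DAY = 1_000_000_000   # assumed daily ChatGPT messages
POLITE_SHARE = 0.10               # assumed share that are pure courtesy ("thanks!")
ENERGY_PER_RESPONSE_WH = 2.9      # assumed Wh per response, ~10x a ~0.3 Wh Google search
PRICE_PER_KWH_USD = 0.10          # assumed electricity price per kWh

extra_responses_per_year = QUERIES_PER_DAY * POLITE_SHARE * 365
extra_kwh = extra_responses_per_year * ENERGY_PER_RESPONSE_WH / 1000
extra_cost_usd = extra_kwh * PRICE_PER_KWH_USD

print(f"Extra responses per year: {extra_responses_per_year:,.0f}")
print(f"Extra energy: {extra_kwh:,.0f} kWh")
print(f"Extra electricity cost: ${extra_cost_usd:,.0f}")
```

With these made-up inputs, the electricity bill alone comes out to roughly $10 million a year, the same ballpark as Altman’s offhand figure; hardware, cooling, and everything else that keeps the models running would push the true cost higher.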

