

ChatGPT serves 800–900 million weekly users, but OpenAI burns around $17 billion annually, far exceeding subscription revenue from 35 million paying users. (Image: Pexels)

If you’ve ever typed “please” or “thank you” into ChatGPT, you may have contributed, very marginally, to OpenAI’s electricity bill. Earlier this year, a social media user asked how much money the company had “lost in electricity costs” from users being polite. OpenAI chief executive Sam Altman replied that it amounted to “tens of millions of dollars well spent,” adding: “You never know.” It was partly a joke. But it also pointed to a serious question: what does it actually cost to run one of the most widely used AI systems in the world, and how is OpenAI planning to pay for it?

As of early 2026, ChatGPT serves between 800 million and 900 million weekly active users worldwide, with roughly 35 million paying for subscriptions and the vast majority accessing the platform for free. That scale is the result of one of the fastest adoption curves in consumer technology history: ChatGPT surpassed 100 million users just two months after launch, cementing its place among the most rapidly scaling digital products ever created.

But hypergrowth at that magnitude comes with enormous cost. Supporting hundreds of millions of weekly users requires vast computing infrastructure, pushing operating expenses to levels that have profoundly reshaped OpenAI’s structure, capital strategy, and long-term roadmap.

The eye-watering daily bill

Running a large language model is not like hosting a website. Every prompt triggers a fresh computation across thousands of high-performance chips. In 2023, technology research firm SemiAnalysis estimated that operating ChatGPT cost approximately $700,000 per day, with roughly $694,444 of that attributed to hardware and inference costs. At the time, the calculation was based largely on GPT-3 infrastructure and assumed more than 3,600 servers powering the system. SemiAnalysis chief analyst Dylan Patel suggested GPT-4 would cost significantly more.

Those numbers are now widely regarded as conservative. Since 2023, OpenAI has launched more powerful models (including GPT-5.x family systems as of February 2026), expanded API access for developers, rolled out image, voice and “deep research” capabilities, and scaled to hundreds of millions more users. Inference, the cost of generating each response, compounds dramatically at that scale. The Washington Post previously calculated that generating a 100-word AI email every week for a year could consume 7.5 kilowatt-hours, roughly equivalent to an hour of electricity use across nine households in Washington, D.C. Multiply that by hundreds of millions of users and constant enterprise usage, and the energy footprint grows quickly.

OpenAI’s reported annual burn rate has now reached approximately $17 billion, largely driven by computing infrastructure. The company does not expect to reach profitability until around 2030.
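Those headline figures can be combined into a very rough back-of-envelope calculation. The sketch below, written in Python, simply multiplies the estimates quoted above; the 850 million user figure (the midpoint of the reported range) and the one-AI-email-a-week usage pattern are illustrative assumptions, not OpenAI data.

    # Rough back-of-envelope sketch using the estimates quoted above.
    # User count and usage pattern are illustrative assumptions, not OpenAI figures.

    daily_cost_usd = 700_000            # SemiAnalysis 2023 estimate (GPT-3-era infrastructure)
    annual_cost_usd = daily_cost_usd * 365

    kwh_per_user_per_year = 7.5         # Washington Post: one 100-word AI email per week for a year
    assumed_weekly_users = 850_000_000  # midpoint of the reported 800-900 million range

    total_kwh = kwh_per_user_per_year * assumed_weekly_users

    print(f"Implied 2023-era annual compute bill: ${annual_cost_usd:,.0f}")           # ~$255 million
    print(f"Energy if every user sent one AI email a week: {total_kwh:,.0f} kWh/yr")  # ~6.4 billion kWh

Even under these simplified assumptions, the 2023 cost estimate alone implies a compute bill in the hundreds of millions of dollars a year, before accounting for newer, heavier models and the growth in users since then.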

From non-profit idealism to capped-profit reality

OpenAI was founded in 2015 as a non-profit with a mission to develop artificial intelligence “in the way that is most likely to benefit humanity.” By 2019, the leadership concluded that donations alone could not fund the scale of compute required to pursue advanced AI systems, particularly artificial general intelligence (AGI). The company transitioned to a “capped-profit” structure, allowing outside investment while limiting returns. Microsoft invested billions. So did SoftBank and Nvidia, among others.

By late 2025, OpenAI’s valuation had climbed to approximately $500 billion following a $6.6 billion share sale. Reports now suggest it is preparing the groundwork for a potential IPO in late 2026 or 2027, with some estimates placing possible valuations as high as $1 trillion, though such figures remain speculative. Following restructuring approved by California and Delaware regulators in October 2025, ownership was split roughly as follows:

  • 26% held by the non-profit OpenAI Foundation
  • 27% by Microsoft
  • 47% by employees and other investors

The pressure to demonstrate a credible path to profitability is intensifying.

Who pays for ChatGPT?

OpenAI’s revenue model is multi-layered. ChatGPT offers:

  • Free tier (basic access)
  • Plus at $20/month
  • Team at $25–$30 per user/month
  • Enterprise (custom pricing)
  • Pro at $200/month (or higher enterprise-grade annual tiers), with dramatically expanded usage limits

As of mid-2025, according to sources:

  • ChatGPT Plus had roughly 10 million users
  • OpenAI had 3 million paying business users across Enterprise, Team and Edu
  • Total paying subscribers were estimated at around 35 million
  • Free-to-paid conversion sits at approximately 5–6%

The company reported more than $2 billion in annual revenue in 2023. Since then, growth has accelerated dramatically: revenue rose to roughly $6 billion in 2024, and OpenAI said in 2025 that its annualized revenue run rate had surpassed $20 billion, an increase of roughly 233% over 2024.

Despite this historic growth, the company is reportedly burning more than $17 billion annually, and revenue from subscriptions alone may still fall short of covering the immense compute and infrastructure costs required to sustain its AI operations.
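A similarly rough calculation illustrates why subscriptions on their own struggle to cover those costs. The Python sketch below uses only the Plus and business subscriber counts itemized above, so the rest of the roughly 35 million paying users is left out; the Team-price midpoint and the omission of Pro, Enterprise and API revenue are simplifying assumptions.

    # Illustrative sketch: annual subscription revenue implied by the figures above.
    # Tier mix is assumed; Pro, Enterprise and API revenue are deliberately omitted.

    plus_users = 10_000_000        # reported ChatGPT Plus subscribers (mid-2025)
    plus_price = 20                # USD per month

    business_users = 3_000_000     # reported paying business seats (Enterprise, Team, Edu)
    business_price = 27.50         # assumed midpoint of the $25-$30 Team range

    annual_subscription_usd = (plus_users * plus_price + business_users * business_price) * 12

    print(f"Implied annual subscription revenue: ${annual_subscription_usd:,.0f}")  # ~$3.4 billion

Set against a reported burn rate of more than $17 billion a year, that slice of revenue covers only a fraction of the bill, which is why API sales, enterprise deals and, increasingly, advertising matter so much to the company’s path to profitability.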

API access

Developers pay per token. For advanced models, pricing can reach:

  • $1.25 per million tokens (input)
  • $10 per million tokens (output)

At enterprise scale, these costs compound quickly for both customers and OpenAI’s own infrastructure.
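For a sense of how those per-token rates translate into real invoices, the Python sketch below computes the cost of a single request at the prices quoted above; the token counts and daily request volume are hypothetical examples.

    # Hedged sketch: per-request cost at the quoted advanced-model rates.
    # Token counts and daily request volume below are hypothetical examples.

    input_price_per_million = 1.25    # USD per million input tokens
    output_price_per_million = 10.00  # USD per million output tokens

    def request_cost(input_tokens: int, output_tokens: int) -> float:
        """Dollar cost of one API request at the rates quoted above."""
        return (input_tokens * input_price_per_million
                + output_tokens * output_price_per_million) / 1_000_000

    cost = request_cost(2_000, 500)    # a 2,000-token prompt with a 500-token reply
    print(f"${cost:.4f} per request")                                         # about $0.0075
    print(f"${cost * 10_000_000:,.0f} per day at 10 million such requests")   # ~$75,000

Fractions of a cent per request look trivial, but at millions of requests a day the bill runs into tens of thousands of dollars, and OpenAI carries a matching inference cost on its own side.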

GPT Store and custom models

Within two months of launching custom GPTs, users created more than 3 million variants. Enterprise integration has accelerated, with 61% of marketers in one survey reporting their company provides ChatGPT Team or Enterprise licenses.

Ads, IPO talk and the sustainability question

For years, Sam Altman publicly expressed discomfort with advertising. He once said he “hates” ads, calling them a “last resort” and describing combining them with AI as “uniquely unsettling.” In 2025, he softened that position, saying he was not “totally against” ads but that it would “take a lot of care to get right.” As of February 2026, OpenAI is testing ads within ChatGPT for free and $8/month “Go” tier users in the United States. The company says ads are contextually relevant, clearly labelled and separate from chat responses, with user privacy preserved.

With 800–900 million weekly active users, the majority unpaid, and infrastructure costs measured in billions annually, subsidising free usage indefinitely is not financially sustainable without additional revenue streams. The IPO speculation is tied to the same reality. OpenAI reportedly hopes to debut publicly as early as late 2026, in part to access capital markets capable of funding ever-expanding compute requirements and competition with rivals such as Anthropic.


