The round includes $50 billion from Amazon, $30 billion from Nvidia and $30 billion from SoftBank Group. OpenAI said it now serves more than 900 million weekly active users, including over 50 million paying consumer subscribers and more than 9 million business users.
In a blog post announcing the new funding, OpenAI framed it as part of a broader effort to industrialize artificial intelligence, arguing that the next phase of AI progress depends on expanding compute capacity, lowering inference costs and building durable global infrastructure. The company said it is entering a period where scaling laws, data center investment and chip supply chains will shape competitive advantage as much as model architecture.
The funding is tightly linked to infrastructure commitments. As part of the deal, Amazon Web Services will become the exclusive third-party cloud provider for OpenAI’s Frontier program, and OpenAI will expand prior infrastructure agreements with AWS that could total $100 billion over eight years. OpenAI also said it will use dedicated inference and training capacity on Nvidia’s next-generation systems, deepening its reliance on specialized AI hardware.
The scale of the raise reflects how expensive frontier AI has become. According to PYMNTS, OpenAI’s compute spending could approach $600 billion by 2030 as models grow larger and usage expands across consumer and enterprise markets. That projection underscores why access to capital and long-term infrastructure agreements are increasingly strategic assets in the AI race.
OpenAI’s consumer momentum remains strong. PYMNTS previously reported that ChatGPT leads global consumer AI usage as the company rolls out higher-priced subscription tiers aimed at professionals and power users. Those tiers are designed to convert widespread usage into recurring revenue streams that can help offset infrastructure costs.
Competition, however, is intensifying. PYMNTS also reported that Anthropic’s valuation has climbed to $380 billion amid accelerating enterprise demand for AI systems, signaling that large language model providers are competing not only on research but also on distribution, pricing and ecosystem partnerships.