On Monday, OpenAI announced it had signed a seven-year, $38 billion deal to buy cloud services from Amazon Web Services to support products such as ChatGPT and Sora. It is the company's first major computing deal since a fundamental restructuring last week that gave OpenAI more operational and financial freedom from Microsoft.
The agreement gives OpenAI access to hundreds of thousands of Nvidia GPUs to train and run its artificial intelligence models. “Scaling advanced AI requires massive and reliable computing,” OpenAI CEO Sam Altman said in a statement. “Our partnership with AWS strengthens a broad computing ecosystem that will drive the next era and bring cutting-edge artificial intelligence to everyone.”
OpenAI will reportedly begin using Amazon Web Services immediately, with all planned capacity coming online by the end of 2026 and room for further expansion in 2027 and beyond. Amazon plans to deploy hundreds of thousands of chips, including Nvidia GB200 and GB300 AI accelerators, in data clusters built to power ChatGPT responses, generate AI videos, and train the next wave of OpenAI models.
Wall Street apparently liked the deal, as Amazon shares hit a record high on Monday morning. Meanwhile, shares of longtime OpenAI investor and partner Microsoft fell briefly following the announcement.
Huge AI Computing Demands
It's no secret that running generative AI models for hundreds of millions of people requires a lot of computing power, and with chip shortages over the past few years, finding sources of that computing power has proven difficult. OpenAI is reportedly working on its own GPU hardware to help ease the load.
For now, though, the company needs to find new sources of the Nvidia chips that accelerate AI calculations. Altman previously said the company plans to spend $1.4 trillion to develop 30 gigawatts of computing resources, enough to power approximately 25 million U.S. homes, according to Reuters.
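As a rough sanity check on that comparison, the sketch below works out the arithmetic, assuming average U.S. household electricity use of about 10,500 kWh per year (an approximate EIA figure not stated in the article); the numbers are ballpark only.

```python
# Rough sanity check: is 30 GW really comparable to ~25 million U.S. homes?
# Assumption (not from the article): average household use of ~10,500 kWh/year.

GIGAWATT = 1_000_000_000   # watts
HOURS_PER_YEAR = 8_760

planned_capacity_w = 30 * GIGAWATT                 # 30 GW of planned compute
avg_home_kwh_per_year = 10_500                     # assumed household usage
avg_home_draw_w = avg_home_kwh_per_year * 1_000 / HOURS_PER_YEAR  # ~1,200 W

homes_equivalent = planned_capacity_w / avg_home_draw_w
print(f"Average household draw: {avg_home_draw_w:,.0f} W")
print(f"30 GW is roughly {homes_equivalent / 1e6:.1f} million homes' worth of power")
```

Under that assumption the script prints roughly 25 million homes, consistent with the figure cited above.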