Sam Altman’s Gamble: Will OpenAI’s Aspirations Match the Industry’s Growing Expenses?

It’s a staggering $1.4 trillion (£1.1 trillion) dilemma. How can a startup like OpenAI, which is currently operating at a loss, afford such enormous expenses?

A convincing answer could go a long way toward easing investor fears that a bubble is inflating in the burgeoning artificial intelligence sector, from lofty tech company valuations to an estimated $3 trillion of global spending on data centers.

Firms such as OpenAI, the company behind ChatGPT, require vast computing resources (or “compute”) to train their models, generate responses, and build ever more advanced systems. OpenAI’s computing commitments (AI infrastructure such as the chips and servers underpinning its well-known chatbot) are projected to reach $1.4 trillion over the next eight years, dwarfing its annual revenue of $13 billion.


In recent weeks, that gap has loomed large, fueling market unease over AI spending and prompting comments from OpenAI executives that have done little to settle the matter.

OpenAI CEO Sam Altman first tried to address the question during a somewhat awkward exchange with Brad Gerstner of Altimeter Capital, an investor in the company, an exchange that ended with Altman making clear he had heard enough.

On his podcast, Gerstner said the question of how a company generating $13 billion in annual revenue can commit to more than $1 trillion in computing expenses was one “plaguing the market.”

Altman countered by stating, “First of all, we’re generating more than that. Secondly, if you want to sell your stock, I can find you a buyer; I’ve had enough.”

Last week, OpenAI’s chief financial officer, Sarah Friar, suggested that some of the cost of chips could be backstopped by the U.S. government.

“We’re exploring avenues where banks, private equity, and even governmental systems can help finance this,” she told the Wall Street Journal, noting that such guarantees could significantly lower financing costs.

Was OpenAI, which recently completed its conversion into a for-profit business valued at $500 billion, suggesting that AI companies should be treated like banks in the late 2000s: too big to fail? The comments prompted a swift clarification from Friar, who denied on LinkedIn that OpenAI was seeking a federal backstop, while Altman set out his own position on X.

“We neither have nor want government guarantees for OpenAI data centers,” Altman wrote in a lengthy post, adding that taxpayers should not be on the hook for bailing out companies that make “poor business choices.” Perhaps, he suggested, the government should build its own AI infrastructure and offer loan guarantees to bolster chip manufacturing in the U.S.

The tech analyst Benedict Evans noted that OpenAI is trying to compete with major AI rivals backed by large, already profitable businesses, including Meta, Google, and Microsoft, the last of which is itself a significant backer of OpenAI.

“OpenAI is trying to match or exceed the infrastructure of dominant platform companies that have access to tens of billions to hundreds of billions of dollars for compute. But those companies can pay for it out of the cash flow of existing businesses, which OpenAI lacks, and it is trying to work its way into that exclusive circle on its own,” he said.

Altman is confident that the projected $1.4 trillion can be offset by future demand for OpenAI products and ever-evolving models. Photo: Stephen Brashear/AP

There are also concerns about the circular nature of some of OpenAI’s computing deals. Oracle, for instance, is set to spend $300 billion building new data centers for OpenAI across Texas, New Mexico, Michigan, and Wisconsin, with OpenAI expected to pay roughly the same amount back in fees for using those centers. Under its agreement with Nvidia, the leading supplier of AI chips, OpenAI will buy chips for cash while Nvidia invests in OpenAI as a non-controlling shareholder.

Altman has also given revenue updates, saying OpenAI expects to end the year with annualized revenue above $20 billion and to reach “hundreds of billions of dollars” by 2030.

He remarked: “Based on the trends we’re observing in AI utilization and the increasing demand for it, we believe that the risk of OpenAI lacking sufficient computing power is currently more pressing than the risk of having excess capacity.”


In essence, OpenAI is confident that it can recover its $1.4 trillion investment through anticipated demand for its products and continually enhancing models.

The company boasts 800 million weekly users and 1 million business customers. It derives about 75% of its revenue from consumer ChatGPT subscriptions, and also sells enterprises a dedicated version of ChatGPT as well as access to its AI models for building their own products.

A Silicon Valley investor with no financial ties to OpenAI says that while the company has room to grow, its success hinges on factors such as continued model improvements and bringing down the cost of running them, including the price of the chips that power these systems.

“We believe OpenAI can capitalize on its strong brand and ChatGPT’s popularity among consumers and businesses to create a suite of high-value, high-margin products. The crucial question is: how far can these products and revenue models scale, and how capable will the models ultimately prove to be?”

However, OpenAI currently operates in the red. The company argues that reported loss figures, such as an $8 billion loss in the first half of the year and roughly $12 billion in the third quarter, misrepresent its finances, but it does not deny that it is losing money, nor has it offered alternative figures.

Altman is optimistic that revenue will come from multiple sources: growing interest in paid versions of ChatGPT, other organizations using its data centers, and consumers buying a hardware device being developed with the iPhone designer Sir Jony Ive. He also argues that “substantial value” will come from AI-driven scientific breakthroughs.

Ultimately, OpenAI is committing to $1.4 trillion in computing resources, a figure far beyond its current income, because it is convinced that demand and an ever-improving product lineup will generate the returns to pay for it.

Carl Benedikt Frey, author of “How Progress Ends” and an associate professor of AI and work at the University of Oxford, casts doubt on OpenAI’s ambitions, pointing to evidence of a slowdown in AI adoption in the U.S. economy. The U.S. Census Bureau recently reported a decline in AI adoption among companies with 250 or more employees.

“Multiple indicators show that AI adoption has been declining in the U.S. since the summer. While the underlying reasons remain unclear, it suggests that some users and businesses feel they are not yet getting the value they expected from AI,” Frey said, adding that reaching $100 billion in revenue by 2027, as Altman has suggested, would be impossible without groundbreaking innovations from the company.

OpenAI says its enterprise version of ChatGPT has grown ninefold year on year, a sign of accelerating adoption by businesses, with customers across sectors including banking, life sciences, and manufacturing.

Yet, Altman acknowledges that this venture might not be a guaranteed success.

“However, we could certainly be mistaken, and if that’s the case, it will be the market, not the government, that deals with it.”

Source: www.theguardian.com
