Two months ago, OpenAI and Nvidia announced a $100 billion partnership to build 10 gigawatts of AI compute. Sam Altman was talking about $1.4 trillion in total infrastructure commitments. The numbers were staggering and, it turns out, unsustainable.
This week, reality arrived. Reuters reported that OpenAI has told investors it now targets roughly $600 billion in total compute spend through 2030 -- less than half of Altman's earlier $1.4 trillion figure. Simultaneously, the Financial Times confirmed that Nvidia's investment has been restructured from $100 billion to a $30 billion pure equity stake, with no chip-purchase obligations attached.
These are not small adjustments. This is the most significant recalibration of AI spending expectations since the current boom began.
What Changed in the Nvidia Deal
The original Nvidia-OpenAI partnership, announced in September 2025, was structured as a vendor-subsidy arrangement: Nvidia would invest capital, and OpenAI would use it to buy Nvidia hardware. That made Nvidia simultaneously a supplier and an investor, creating a circular flow of money that guaranteed GPU sales while building equity value.
That structure has been completely abandoned. According to the Financial Times, Nvidia will instead take a straight equity stake in OpenAI with no deployment milestones and no hardware-purchase requirements. OpenAI retains full discretion over where it spends the money -- including on competing chip vendors like AMD and Broadcom.
For Nvidia, this is a cleaner investment but a riskier one. As a pure equity holder, Nvidia's return depends entirely on OpenAI's valuation growth rather than guaranteed hardware revenue. For OpenAI, it means freedom to shop for the best deals on compute hardware without contractual obligations to any single supplier.
Jensen Huang dismissed earlier reports of deal trouble as "complete nonsense" during a press encounter in Taipei on January 31. Yet the final terms confirm the deal was in active renegotiation even as he spoke. The $30 billion stake, while still enormous, is 70% smaller than the original commitment.
The $600 Billion Compute Recalibration
The bigger story is what is happening inside OpenAI's planning. According to Reuters, OpenAI told investors it expects roughly $600 billion in total compute spending through 2030 as the company lays groundwork for a potential IPO valued at up to $1 trillion.
That $600 billion figure replaces the $1.4 trillion infrastructure number Altman had discussed months earlier. Benzinga reported the revised target emerged alongside concerns that expansion goals had outpaced practical reality. The new projection comes with a clearer timeline and, critically, revenue projections to match: OpenAI expects $280 billion in revenue by 2030.
To put that in perspective: $600 billion in cumulative compute spend against $280 billion in projected annual revenue by 2030 means OpenAI is planning to spend more than twice its best-year revenue on infrastructure alone over the next four years. That ratio works only if the company can sustain massive external funding -- which is exactly what the current $100 billion-plus fundraising round is designed to provide.
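The back-of-envelope math is worth checking for yourself. A minimal sketch, using only the projections cited above (all figures in billions of dollars):

```python
# Figures are OpenAI's reported projections, in billions of dollars.
compute_spend_through_2030 = 600  # revised total compute target
projected_revenue_2030 = 280      # projected annual revenue by 2030

# Cumulative spend relative to the single best projected revenue year.
ratio = compute_spend_through_2030 / projected_revenue_2030
print(f"Planned compute spend is {ratio:.2f}x projected 2030 revenue")
# → Planned compute spend is 2.14x projected 2030 revenue
```

That 2.14x multiple is the gap external funding has to cover; it is why the fundraising round matters as much as the spending target itself.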
Why the Numbers Came Down
Three factors drove the recalibration.
First, efficiency gains are real. DeepSeek's emergence earlier this year demonstrated that frontier-quality AI can be trained at a fraction of the cost previously assumed. When a Chinese lab achieves competitive performance for pennies on the dollar, the argument for unlimited compute spending gets harder to make.
Second, investors demanded clarity. The gap between "$1.4 trillion in infrastructure commitments" and actual funded capacity was always uncomfortable. Investors preparing for OpenAI's IPO need credible numbers, not aspirational ones. The $600 billion target is lower but more defensible.
Third, the competitive landscape shifted. Google's Gemini 3.1 Pro achieved state-of-the-art results. Anthropic's Claude Opus 4.6 is competing head-to-head on coding and reasoning. OpenAI cannot assume it will capture enough market share to justify spending at twice the rate of all competitors combined.
What This Means for the AI Market
The scaling-back narrative is not about AI losing momentum. It is about the economics getting more disciplined.
OpenAI's revised fundraising round is still targeting over $100 billion total, with participation from Amazon, Microsoft, SoftBank, and Nvidia. The company's valuation -- estimated at $730 to $850 billion depending on the source -- would make it one of the most valuable private companies in history.
But the structure of the money is changing. Nvidia's shift from vendor-subsidy to pure equity means OpenAI can diversify its hardware supply chain. That is good news for AMD, Broadcom, and every cloud provider offering AI compute. Competition for OpenAI's hardware dollars will increase, which should push compute prices down for everyone in the ecosystem.
For businesses building on OpenAI's APIs, cheaper inference just became more likely. When the company projecting $280 billion in revenue is also shopping for the cheapest compute available, the cost reductions flow downstream to customers.
What Small Businesses Should Take Away
You are not investing $30 billion in anything. But the dynamics reshaping AI's biggest deals will directly affect the tools you use.
AI costs will keep falling. The combination of efficiency breakthroughs, hardware competition, and more disciplined spending means the cost of running AI workloads will continue to decline. If you have been waiting for AI APIs to get cheaper before expanding your usage, that trend is accelerating.
Vendor lock-in is loosening. Nvidia's loss of purchase-commitment leverage over OpenAI signals a broader shift. If the world's leading AI lab refuses to be locked into a single chip vendor, smaller businesses should adopt the same flexibility. Build your AI workflows to be portable across providers.
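One way to build in that flexibility is to route all AI calls through a thin interface your own code owns, so switching vendors is a one-line change. This is a minimal sketch of the pattern, not any vendor's real SDK -- `VendorA` and `VendorB` are hypothetical stand-ins for whichever providers you actually use:

```python
from typing import Protocol


class ChatProvider(Protocol):
    """The only interface your workflow code is allowed to depend on."""
    def complete(self, prompt: str) -> str: ...


class VendorA:
    """Stand-in adapter; a real one would wrap vendor A's API client."""
    def complete(self, prompt: str) -> str:
        return f"[vendor-a] {prompt}"


class VendorB:
    """Stand-in adapter for a second provider."""
    def complete(self, prompt: str) -> str:
        return f"[vendor-b] {prompt}"


def summarize(provider: ChatProvider, text: str) -> str:
    # Business logic touches only the shared interface, so the provider
    # can be swapped at the call site without touching this function.
    return provider.complete(f"Summarize: {text}")


print(summarize(VendorA(), "quarterly sales notes"))
print(summarize(VendorB(), "quarterly sales notes"))
```

The payoff is that a pricing change at one provider becomes a configuration decision, not a rewrite.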
The IPO timeline matters. OpenAI is positioning for a public offering at up to $1 trillion. Public companies face different pressures than private ones -- quarterly earnings, margin expectations, and shareholder scrutiny. That could mean more aggressive API pricing to drive revenue, or it could mean premium tiers get more expensive. Watch the IPO timeline as a signal for pricing changes.
Reality checks are healthy. The most important lesson from this week is that even the biggest players in AI are adjusting their plans when the numbers do not add up. Small businesses should do the same: set ambitious AI goals, but build in checkpoints to verify that spending is generating real returns.
The Bottom Line
OpenAI cutting its compute target by more than half and Nvidia restructuring its investment to remove hardware obligations tell you one thing clearly: the era of blank-check AI spending is maturing into something more calculated. The technology is not slowing down. The economics are growing up.
For your business, that is entirely good news. More competition, more discipline, and more downward pressure on costs means the AI tools you rely on are getting better and cheaper at the same time.
Need help building an AI strategy that scales without overspending? Barista Labs helps small businesses invest in AI that delivers measurable returns -- no trillion-dollar budget required.
