HSBC just dropped the most brutal AI report: OpenAI needs to raise at least $207 billion by 2030 so it can continue to lose money 😳

HSBC just published a 33-page analysis of OpenAI, and the takeaway is savage:

Even in their most optimistic scenario, OpenAI ends up with a $207 billion funding hole by 2030 🤯

Not because the business is collapsing.

But because the cost of intelligence is colliding with the laws of physics.

In essence, OpenAI isn’t a software company. It’s an industrial-scale energy and infrastructure project masquerading as one.

According to the HSBC model:

→ $250 billion committed to Microsoft

→ $38 billion committed to Amazon

→ 36 gigawatts of contracted compute (yes, gigawatts)

That’s enough power for a mid-sized country.

And the bill is up to $620 billion per year in data-center costs once everything goes online.
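Quick sanity check on those figures (my own back-of-envelope math, not HSBC's; it assumes the full 36 GW runs flat-out):

```python
# Back-of-envelope check on the "mid-sized country" claim (my arithmetic, not HSBC's).
# Assumes the full 36 GW is drawn continuously, which overstates real utilization.
contracted_power_gw = 36
hours_per_year = 24 * 365            # 8,760 hours

annual_energy_twh = contracted_power_gw * hours_per_year / 1_000   # GWh -> TWh
print(f"~{annual_energy_twh:.0f} TWh per year")                    # ~315 TWh/year

# For scale, the UK's total annual electricity consumption is on the order of 300 TWh,
# so 36 GW of round-the-clock compute really is country-sized demand.
```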

But here comes the wildest part.

HSBC assumes nearly everything goes right.

↳ 3 billion users

↳ 10% conversion to paid plans

↳ Massive enterprise adoption

↳ AI Ads

↳ AI Agents

↳ Jony Ive’s hardware moonshot

↳ 2% of the global ad market

↳ Zero competitive erosion

Even then… it’s not enough.

OpenAI still comes up short by $207B, and that’s even before adding the $10B safety buffer.
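To see why even that rosy scenario struggles, here's a toy calculation of the consumer side alone. The $20/month price and the ~$1 trillion ad-market size are my assumptions, not numbers from the report:

```python
# Toy consumer-revenue math using the post's headline assumptions (NOT HSBC's model).
# The $20/month price and the ~$1T/yr global ad market are my own rough assumptions.
users = 3_000_000_000               # 3 billion users
paid_conversion = 0.10              # 10% convert to paid plans
price_per_month = 20                # assumed ChatGPT-Plus-style price, in USD

subscriptions = users * paid_conversion * price_per_month * 12    # ~$72B/yr
ads = 0.02 * 1_000_000_000_000                                    # 2% of ~$1T/yr -> ~$20B/yr

print(f"Subscriptions: ~${subscriptions / 1e9:.0f}B/yr")
print(f"Ads:           ~${ads / 1e9:.0f}B/yr")

# Roughly $92B/yr combined, against data-center costs quoted at up to $620B/yr.
# Even with generous enterprise, agent, and hardware revenue on top, costs can
# outrun revenue by hundreds of billions a year, which is the kind of mismatch
# behind the cumulative $207B funding hole.
```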

The report even hints that OpenAI may eventually need to “walk away” from parts of its cloud commitments because “less capacity is better than a liquidity crisis.”

But this isn’t just about OpenAI. This is about the entire AI economy.

If the world's leading AI company - backed by Microsoft, NVIDIA, and Silicon Valley’s deepest VC pockets - can’t make the math work under ideal conditions, what does that say about everyone else? 🤔

It says the AI boom isn’t a software revolution.

→ It’s an infrastructure arms race.

→ It’s GPUs, power plants, cooling towers, land, water rights, grid upgrades, transformer stations, and supply chains.

→ It’s the biggest capex cycle since the industrial revolution - disguised behind a chat interface.

And suddenly the industry’s real bottlenecks become obvious:

Not talent.

Not models.

Not data.

Energy. Efficiency. Economics.

This is why the next breakthroughs won’t just be better models - they’ll be cheaper models.

Sparse architectures. TPU-scale alternatives. 20–60× efficiency gains.

AI that uses less, not more.
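Putting that in the post's own numbers (a simplification, since training costs, usage growth, and pricing all move at the same time):

```python
# Illustrative only: what the quoted 20-60x efficiency gains would do to the
# headline $620B/yr bill, assuming cost falls in direct proportion to efficiency.
annual_bill_billions = 620

for gain in (20, 60):
    print(f"{gain}x cheaper compute -> ~${annual_bill_billions / gain:.0f}B/yr")

# 20x cheaper compute -> ~$31B/yr
# 60x cheaper compute -> ~$10B/yr
```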

Because unless costs collapse, the race to AGI won’t be won by the smartest lab.

It will be won by the AI lab that can pay its power bill the longest.
