The new $10 word worth $455 billion

A new “richest man in the world” was crowned last week.

Larry Ellison briefly leapfrogged Elon Musk for the top spot.

Ellison is the world’s least-known centibillionaire. He founded tech giant Oracle Corp. (ORCL) in 1977 and still owns about 41% of the business.

That’s how Ellison’s net worth surged $100 billion in a single day after Oracle stock jumped 36% on blowout earnings:

An $800 billion company skyrocketing like a penny stock is nearly unprecedented.

Or at least it was before the artificial intelligence (AI) boom...

  • Ellison stunned Wall Street on the earnings call.

Oracle builds the giant databases where companies keep their records and runs them in its own global cloud. Because those systems already live in huge data centers, Oracle had a head start when AI showed up.

Now, it rents out stadium-sized GPU farms—picture 100,000+ Nvidia (NVDA) chips wired together—so that AI companies can train and run their models.

Oracle said its AI compute backlog exploded from $138 billion to $455 billion in just three months, representing future compute it essentially pre-sold.

On the call, Larry said:

We have signed significant cloud contracts with the who’s who of AI… The company that called us said, “We’ll take all the capacity you have that’s currently not being used anywhere in the world. We don’t care.” And I’ve never gotten a call like that.

That’s the sound of big tech pounding on Oracle’s door.

Microsoft (MSFT). Google (GOOGL). OpenAI. They’re all essentially saying the same thing: “Give us every spare server you have.”

  • Big tech will spend over $300 billion building out their AI data centers this year.

The race to build a “digital god” is the largest infrastructure buildout of our lifetimes.

Measured in 2025 dollars, a single year of AI infrastructure spending is now on par with the entire Apollo program or the Interstate Highway System.

The AI buildout is so big, it’s acting as a kind of private stimulus project for the US economy.


Source: The Wall Street Journal

Oracle is surging, and big tech is spending all this money, because usage of AI models like ChatGPT and Gemini is off the charts.

Over 700 million people now use ChatGPT weekly. That’s two Americas! It’s the fastest-growing product in history.

Google’s Gemini is keeping pace.

In May, it was processing 480 trillion tokens (chunks of text or images) per month. By June, that had already doubled to nearly 1 quadrillion. And this week, it became the #1 most downloaded app on the App Store.

  • Big tech’s favorite $10 word for AI usage is…

“Inference.”

In plain English, when you ask ChatGPT a question and it answers, that’s inference.

When people talk about AI infrastructure, they usually mean thousands of GPUs crunching away for weeks to train a giant model. That’s the “training” phase. 

We spent years training great AI models, and now you, I, and everyone else are using them all day long. We’re entering the “inference” phase.
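
To make that concrete, here’s a toy sketch in Python (my own illustration, not how ChatGPT, Gemini, or Oracle’s systems actually work). Training means looping over example data again and again to tune a model’s numbers, which is the slow, expensive part. Inference means applying the finished model once per question, over and over, for every user.

    # Toy example: a one-number "model" that learns to double its input.
    def train(examples, epochs=200, lr=0.01):
        """Training: many passes over the data, nudging the weight each time (the costly part)."""
        w = 0.0
        for _ in range(epochs):
            for x, y in examples:
                pred = w * x                    # forward pass
                w -= lr * 2 * (pred - y) * x    # gradient step toward the right answer
        return w

    def infer(w, x):
        """Inference: one cheap forward pass, i.e. answering a single question."""
        return w * x

    weight = train([(1, 2), (2, 4), (3, 6)])    # done once, up front
    print(round(infer(weight, 10), 2))          # used again and again afterward: ~20.0

Real models have billions of weights instead of one, but the split is the same: train once at enormous cost, then serve inference to hundreds of millions of users.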

When asked what was driving Oracle’s $455 billion backlog, Larry Ellison said:

There is a huge amount of demand for inferencing. And if you think about it, in the end, all this money we’re spending on training is going to have to be translated into products that are sold, which is all inferencing.

  • Welcome to Phase 2 of the AI boom.

Phase 1 was all about chips. Nvidia supplied the “picks and shovels” that fueled the first rush.

That alone created a $300+ billion annual spending surge. Big tech will spend more on AI this year than Switzerland spends running its government.

We thank them for the profits they’ve handed us.


Phase 2 is about the plumbing of AI. The nuts and bolts that let this technology actually scale.

Networking gear to move oceans of data at lightning speed…

Cooling systems to stop giant data centers from melting down…

And power infrastructure. Because AI data centers chew through 10X–50X more energy than traditional ones…

  • Oracle’s 36% single-day surge showed us the kind of money that can be made owning Phase 2 winners.

But there are many smaller players that will benefit even more.

Take Credo Technology Group Holding Ltd. (CRDO), for example.

Credo builds ultra-fast networking gear that links GPUs together. It helps them talk with one another at lightning speed. Without companies like Credo, all those Nvidia chips would be little more than expensive paperweights.

Credo’s revenues nearly quadrupled over the last year thanks to strong AI demand. Its stock has been on a tear, too. Congratulations to Disruption X members who just took a “Free Ride” after CRDO soared 108%.

There will be many more stocks like it.

  • Keep this quote in mind when investing in AI.

The great economist John Maynard Keynes once said, “The market can remain irrational longer than you can stay solvent.”

We’re three years into the AI boom, and I see a lot of folks already suffering from investing fatigue. They act as if the AI boom is already behind us.

I would remind them that disruptive megatrends last decades. The hardest thing about making money from megatrends isn’t the buying, it’s the holding on and staying interested.

My spin on Keynes’ quote: “Winners can keep on winning longer than you can stay curious.”

Invest accordingly.

Stephen McBride
Chief Analyst, RiskHedge

PS: Earlier this month, my co-author Chris Wood and I released our own AI “ETF” in our flagship Disruption Investor advisory. The three stocks in this ETF specifically participate in Phase 2 of the AI boom by creating the “nuts and bolts” that will keep this tech humming along for years.

If you’re interested in learning more about these three companies, you can join us in Disruption Investor here.