Editor’s note: Before we kick off today’s Jolt, a quick reminder: This is your last chance to save up to 50% on a subscription to Stephen’s flagship Disruption Investor service. After midnight tonight, this special sale is officially over. Go here to see the best deal pricing before it’s gone.
Now, here’s Stephen...
***
Can you name the best-performing “Magnificent 7” stock this year?
It’s not Nvidia (NVDA), Apple (AAPL), or Tesla (TSLA).
It’s internet search king Google (GOOGL).
The stock’s climbed 67% this year—more than 4X the S&P 500’s return.
Google is now a $3.8 trillion company!
I used to spend hours each day clicking on Google’s 10 blue links. For me and billions of others, it was the homepage of the internet.
Now, I barely use Google. Instead of clicking around to find my answer, I just ask an AI chatbot like ChatGPT.
RIP Google, right?
Wrong.
Search is going the way of the dodo, but…
Google managed to get its act together and launch its own ChatGPT competitor, Gemini.
The early versions of Gemini were atrocious. Remember this scandal from early 2024?
Google has since shed (most of) the politically correct nonsense and now makes some of the best AI models in the world.
Its new model, Gemini 3, shot to the top of the AI leaderboards, blowing past OpenAI's GPT-5 and the rest of the field. I've been using it over the past few weeks and can confirm it's very good.
But Gemini's quality is only half the story. Google's real edge comes down to one word: cost.
We interact with ChatGPT through a screen. But behind the scenes, there is a lot being built to make that possible.
The AI buildout is the largest infrastructure project in history.
Amazon (AMZN), Microsoft (MSFT), Oracle Corp. (ORCL), Meta Platforms (META), and Google will spend over $400 billion building out their AI data centers this year.
In one year, AI infrastructure spending has grown to be on par with the entire Apollo program or the Interstate Highway System.
So far, most of that capital has gone into building AI models like ChatGPT, Gemini, or Perplexity. Make them smarter. Make them bigger. Make them better at reasoning.
This was the “training” phase.
Now we're entering the inference era.
When you ask ChatGPT a question and it answers, that’s inference. That’s the model “doing stuff.”
You, me, and just about every company, student, government, and developer are now using AI models constantly—all day long. Training happens once. Inference happens trillions of times every day, forever.
And the better AI gets, the more complex tasks it can solve… the more thinking it has to do… and the more chips, compute, and electricity it will burn to make it all happen.
In the SaaS era, software had zero marginal cost of replication. In the AI era, every single answer has a real, physical cost in electricity and compute.
Cost is king. He who can produce the lowest-cost AI “tokens” has a huge advantage.
This is where Google shines: it's one of the very few companies designing both its own AI models and its own chips, which boosts efficiency.
Google has been making its own chips, TPUs (Tensor Processing Units), for over a decade.
TPUs power almost everything Google does in AI—from Gemini to YouTube recommendations to real-time translation.
Think of them like the “Formula 1” racecars of the chip world. Purpose-built to do one thing (run AI) and do it really well.
The company owns the entire AI stack, top to bottom.
It has its own data centers, which are some of the most energy-efficient on the planet.
It has its own models. Gemini is now one of the strongest AIs on the market.
Google also owns the apps where AI gets deployed to billions of people: Google Search, YouTube, Android, Maps, Gmail, Workspace, and Waymo. Every one of these products can embed AI instantly and monetize it.
No other company comes close to this level of vertical integration. Everyone else either rents chips, rents data centers, rents distribution, or rents content.
In an era where each tiny improvement in AI’s intelligence dramatically increases inference costs, owning the whole stack is a massive advantage. It’s like Walmart (WMT) being able to sell the same groceries for 50% less than competitors.
Google was the textbook example of how incumbents get blindsided by disruption.
It had a working chatbot two years before ChatGPT and was too afraid to release it. It allowed OpenAI to waltz in uncontested and siphon off users at scale.
But the tide is shifting.
Google finally woke up and now looks set to be a major player in the AI race, a contest that will dominate the next 5–10 years.
But as much as I admire Google’s turnaround, there are far better profit opportunities out there.
Google is already a $3.8 trillion giant. Could it nearly triple and go to $10 trillion? Sure.
But there’s far more upside in lesser-known AI winners.
From Day 1, I’ve said you want to own the companies on the receiving end of the AI spending boom—not the ones doing the spending.
In other words, own the winners from the AI infrastructure boom.
That’s worked beautifully for Disruption Investor members who doubled their money (or more) on several AI winners.
If you’re not a Disruption Investor member, this is a great time to get onboard. We’re still running our special Black Friday sale, which grants you up to 50% off my flagship service. But the doors on this deal shut at midnight tonight, so go here to make your final decision. Thank you.
Stephen McBride
Chief Analyst, RiskHedge