
The next “ChatGPT moment” is here

Stephen McBride


May 11, 2026

Intel Corp. (INTC) has been a dog for decades.

 

Now, its stock is taking off like a rocket… up nearly 500% in the last year.

 

 

On its latest earnings beat, its stock spiked 24% in a single day as data center builders "snap up any chips they can get."

 

The company that's been the punchline of tech investing for two decades is now cool again!

 

I must confess, I didn’t see that coming.

 

Intel is surging because plain ol’ central processing units (CPUs) are suddenly the new artificial intelligence (AI) bottleneck.

 

And that’s setting up a great investing opportunity… but not in Intel.

 

  • CPUs are the most boring, mature, commoditized part of the chip industry.

 

So why are data center builders snapping up any CPUs they can get their hands on?

 

Because for the first three years of the AI boom, almost all the action was in training—teaching an AI model how to think.

 

Training an AI involves feeding it the entire internet, giving it billions of examples, and running it through enormous, power-hungry GPU clusters for months at a time. GPT-4 reportedly cost roughly $100 million worth of training compute. GPT-5 cost several times that.

 

This is the world the GPU was built for. Training is one massive, parallel mathematical workout. And Nvidia's (NVDA) GPUs, which can do thousands of those calculations at once, are perfect for it.

 

But in 2026, training is no longer the main event. Inference is.

 

Inference is what happens after a model is trained. It's every time someone asks ChatGPT a question. Every time Claude writes code. Every time an AI agent books a meeting. It’s AI putting its brain to work.

 

According to Nvidia's chief scientist, inference now consumes 90% of all data center power for AI. That’s a complete inversion of where we were just two years ago.

 

And that’s creating massive demand for boring ol’ CPUs.

 

Microsoft (MSFT) has allocated every single one of its spare CPUs to its AI partners, OpenAI and Anthropic. The shortage is so severe, it's been causing outages on developer tools like GitHub.

 

  • You've likely heard of the "ChatGPT moment"…

 

In November 2022, ChatGPT launched and became the fastest product to ever hit 100 million users.

 

That kicked off the epic AI-driven bull market we’re still whirling through today. Nvidia has surged more than 1,200% since ChatGPT came out. Anthropic ballooned from a small $4 billion startup to receiving a $900 billion valuation in just three years.

 

On February 5, 2026, an even bigger moment happened: Anthropic released Claude Opus 4.6.

 

Older AI systems were mostly designed for short interactions. You asked a question. They gave you an answer.

 

Claude Opus 4.6 is different. It was built specifically for long, complex work. Instead of handling one small task at a time, it can work through large, complicated problems for hours and research information on its own.

 

AI just went from "helps you with your work" to "does your work."

 

I’m a self-confessed Claude addict. I have my little AI assistant running tasks for me all day, every day. I happily pay $200/month for it and would pay 10X more.

 

This is the second “ChatGPT moment.”

 

Anthropic's annual revenue run rate exploded from $9 billion to $30 billion in just four months. Amazon (AMZN) revealed last week that its AI cloud business has hit a $15 billion run rate, making it 260X bigger than AWS itself was at the same age.

 

  • How do you profit from this?

 

With this new generation of AI—like Claude Opus 4.6—a single user request can spin off hundreds of AIs simultaneously.

 

One researching. One writing. One fact-checking. One planning the next step.

 

It’s less like asking one employee a question and more like managing an entire company in real time. But coordinating all those moving pieces requires massive amounts of CPU power.

 

Arm Holdings PLC's (ARM) CEO recently said each gigawatt of AI data center capacity will require 120 million CPU cores. That translates to a 4X surge in CPU demand per data center by 2030.

 

If the world suddenly needs millions more CPUs, ask yourself: “What do you need to make those CPUs? What are the picks and shovels of CPUs?”

 

You need chipmaking machines.

 

Specifically, you need incredibly complex industrial machines that deposit, etch, pattern, clean, and inspect the silicon that goes into chips powering your phone… your car… and every AI data center on Earth.

 

There are only a handful of companies on the planet capable of building these machines. The biggest names investors know are ASML Holding NV (ASML) in the Netherlands, along with Applied Materials (AMAT), Lam Research (LRCX), and California-based KLA Corp. (KLAC).

 

In Disruption Investor, we also own some smaller, higher-upside semi-cap stocks. These are the companies making the machines that make the chips (to see the full portfolio, upgrade here).

 

  • In the next 12 to 18 months, these companies will see the strongest demand in the history of the chip industry.

 

For the last two years, Intel and Samsung had stopped expanding their chip factories.

 

Both had fallen behind Taiwan Semiconductor (TSM) in chipmaking technology. Customers had abandoned them. And there was no point spending billions of dollars on machines they didn't need.

 

That changed in the last 60 days.

 

Taiwan Semi is booked solid through 2026—and beyond—on the back of AI demand. The company's CEO just raised its full-year revenue growth forecast and explicitly named "agentic AI demand" as the cause.

 

For the first time in at least five years, Taiwan Semi, Intel, and Samsung are all expanding their factories. They’re simultaneously placing orders for the same chipmaking machines from the same handful of suppliers.

 

That’s why ASML—the first stock we ever wrote about at RiskHedge—just told investors there will be a shortage of its EUV machines in 2027.

 

ASML won't be the last supplier to start warning that demand is outrunning supply. Expect more chip equipment companies to say the same thing.

 

Bottom line: The CPU surge is proof AI demand is bigger than even the bulls have projected.

 

The companies building the machines that make all the chips are where I'm putting my money for 2026 and 2027. I think it'll be one of the best AI trades of the entire boom.

 

If you want to position ahead of this semi-cap spending wave, our four-stock “ETF” in Disruption Investor is still a good buy at today’s prices. For details on how to join, go here.


Stephen McBride

Chief Analyst, RiskHedge
