“My basic model of the world is cost of intelligence, cost of energy”
– OpenAI founder Sam Altman
Dear Reader,
The “intelligence curve” has already led to the creation of the third-most-valuable company in the world: Nvidia.
The company achieved that position through decades of hard work—and by accidentally being the only solution for the current AI bottleneck.
It started a quarter-century ago, when Nvidia invented the GPU, or graphics processing unit.
The company’s name hints at what its product does: “n” (the mathematical variable) + “vidia” (from the Latin root for “see”).
GPUs use complex math to render images and video that can be seen on a screen.
For two decades, Nvidia churned out ever-more-advanced GPUs—mostly for video gamers.
Then along came Bitcoin. Mining the cryptocurrency requires a powerful processor to efficiently perform difficult calculations. It was a perfect fit for GPUs.
And Nvidia started to take off.
Then Came A.I.
Training and running AI’s trillion-parameter models required powerful chips, and GPUs, with their 100 billion transistors, were once again a perfect fit.
Sometime in 2023, we reached a tipping point for AI.
Demand for Nvidia’s GPUs skyrocketed.
And seemingly overnight, Nvidia transformed into an extremely powerful, extremely profitable company.
- AI GPU demand more than doubled Nvidia’s 2023 income.
- Revenue in Q1 2024 tripled year over year. Per-share earnings rose more than 700%.
- It’s added $1.2 trillion to its market cap—as much as Canada’s entire GDP—in the last twelve months.
Even Nvidia did not anticipate this level of success—and its unpreparedness is jeopardizing the future of AI.
GPU Wars: Harder to Get Than Drugs
In 2023, Sam Altman sat for an interview with Humanloop.
- Humanloop published a synopsis of the conversation, and almost immediately, Sam made them take it down. (Note: We can’t even link to it because it no longer exists online!)
The first item in the synopsis was this dead giveaway: “OpenAI is heavily GPU-limited.”
Not even OpenAI, with the power of almighty Microsoft behind it, can get enough GPUs to deploy its new models.
And that’s severely delaying OpenAI’s AI master plan:
- No multimodal models (e.g., images and sound)
- No longer-sequence models
- No finetuning of results
Just how many GPUs does OpenAI need? A single one of its next training supercomputers is expected to require more than 75,000 Nvidia GPUs.
And they’re just not available.
“GPUs are harder to get than drugs.”
– Elon Musk
That shortage has triggered a massive run on GPUs…
ByteDance, the company behind TikTok, ordered $1 billion worth of Nvidia GPUs to begin building its own AI.
Meta didn’t want to be left behind. So they decided to order “enough GPUs,” according to Zuckerberg, and then “double that.”
This year alone, they’re buying the equivalent of 600,000 more Nvidia H100 chips, which can run $30,000 each.
Microsoft is buying more than 400,000 GPUs for AI training and inference in 2024.
- And Nvidia can’t manufacture enough GPUs to let these companies pursue their AI ambitions.
Its highest-end GPU, the H100, remained sold out for most of 2023, even while Nvidia worked feverishly to increase production.
Nvidia is ramping up to ship more than 400,000 of its top GPUs per quarter. Yet current demand is estimated to exceed supply by at least 50%.
Nvidia’s Plan to Save the World
Fortunately, Nvidia’s efforts have started to alleviate the GPU shortage.
Jensen Huang plans to quadruple production by the end of 2025, which should enable the company to fully meet demand.
GPU delivery lead times are shrinking, from 8-11 months to just 3-4 months.
And the major tech companies are beginning to make their own:
- Microsoft’s Stargate project, for example, will use custom-designed computing chips.
- Meta has produced its own advanced AI chip, Artemis, to reduce its reliance on Nvidia.
- Even Tesla is making its own chip, which is far more powerful than Nvidia’s.
The intelligence curve may finally be slowing.
But the power curve is just getting started. And mark my words: it will create the most powerful company in the world.
The One World Currency: The Electrodollar
This is the dawn of what I call the Electrodollar™.
The sheer scale of AI GPU growth is staggering:
- The computing power coming online is now increasing 10x every nine months.
And every one of those GPUs requires power. But power generation is not growing 10x every nine months.
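To see how quickly those two curves diverge, here’s a minimal sketch in Python, using the 10x-every-nine-months figure above and assuming, purely for illustration, that power generation grows about 3% per year (my assumption, not a figure from this letter).

```python
# Illustrative only: compare AI compute growth (10x every nine months, per
# the figure above) with power generation growing at an assumed ~3% per year.
COMPUTE_GROWTH_PER_9_MONTHS = 10.0
GRID_GROWTH_PER_YEAR = 1.03  # assumed for illustration only

def growth_after_years(years: float) -> tuple[float, float]:
    """Return (compute multiple, grid multiple) after `years` years."""
    compute = COMPUTE_GROWTH_PER_9_MONTHS ** (years * 12 / 9)
    grid = GRID_GROWTH_PER_YEAR ** years
    return compute, grid

for yrs in (1, 2, 3):
    compute, grid = growth_after_years(yrs)
    print(f"after {yrs} year(s): compute x{compute:,.0f} vs. grid x{grid:.2f}")
# after 1 year(s): compute x22 vs. grid x1.03
# after 2 year(s): compute x464 vs. grid x1.06
# after 3 year(s): compute x10,000 vs. grid x1.09
```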
Even the semiconductor manufacturers, Nvidia CEO Jensen Huang among them, are sounding the alarm.
Sales numbers make total AI GPU energy consumption simple to calculate.
Nvidia shipped 100,000 AI server units in 2023. Those units are estimated to consume 7.3 TWh annually.
By 2027, Nvidia is expected to ship 1.5 million AI server units—consuming more than 85 TWh annually.
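Here’s a minimal sketch of that arithmetic, assuming roughly 8.3 kW of continuous draw per AI server unit. That per-unit figure is my own back-of-the-envelope assumption, chosen so the 2023 result lands on the 7.3 TWh estimate above; it is not a number from Nvidia.

```python
# Back-of-the-envelope fleet energy: units shipped x assumed continuous draw
# per AI server unit x hours per year. The 8.3 kW per-unit draw is an
# assumption, picked so the 2023 result matches the 7.3 TWh estimate above.
HOURS_PER_YEAR = 24 * 365          # 8,760 hours
KW_PER_SERVER_UNIT = 8.3           # assumed continuous draw per unit

def fleet_twh(units: int, kw_per_unit: float = KW_PER_SERVER_UNIT) -> float:
    """Annual energy use of `units` AI servers, in terawatt-hours."""
    kwh_per_year = units * kw_per_unit * HOURS_PER_YEAR
    return kwh_per_year / 1e9      # 1 TWh = 1 billion kWh

print(f"2023 fleet: {fleet_twh(100_000):.1f} TWh per year")    # ~7.3 TWh
print(f"2027 fleet: {fleet_twh(1_500_000):.0f} TWh per year")  # ~109 TWh at the same per-unit draw
```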
But that’s only if the power draw per unit stays the same. In reality, each successive GPU generation has drawn more and more power.
In just two years, Nvidia has quadrupled the power consumed by a single chip.
Its newest chip – Blackwell – uses 1,200 watts of power. That’s roughly the average continuous power draw of an entire U.S. household.
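As a quick sanity check on that comparison, here’s the arithmetic, assuming an average U.S. household uses roughly 10,500 kWh of electricity per year. That household figure is a ballpark assumption of mine, not a number from this letter.

```python
# Rough check of the household comparison: convert an assumed ~10,500 kWh of
# annual household electricity use into an average continuous draw in watts,
# and put it next to a single 1,200 W Blackwell chip.
HOUSEHOLD_KWH_PER_YEAR = 10_500   # ballpark assumption, not from the letter
HOURS_PER_YEAR = 24 * 365         # 8,760 hours
BLACKWELL_WATTS = 1_200           # per the letter

avg_household_watts = HOUSEHOLD_KWH_PER_YEAR / HOURS_PER_YEAR * 1_000
print(f"average household draw: ~{avg_household_watts:,.0f} W")  # ~1,199 W
print(f"one Blackwell chip:      {BLACKWELL_WATTS:,} W")
```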
And developing the next generation of AI models requires millions of ever-more powerful chips, all taking ever-more power.
The primary roadblock to the future of AI is no longer technology.
It’s power.
But don’t take my word for it. Mark Zuckerberg was asked if there’s anything Meta doesn’t have the resources for, even something 10x bigger than Meta’s budget.
His response: “Energy.”
In other words, the same fervor and funds that were poured into GPUs are now coming for power.
Intelligence made Nvidia a multitrillion-dollar company—seemingly overnight.
- Energy will make another multitrillion-dollar company, equally quickly.
And it’s coming sooner than you think.
Musk, Altman, Bezos, and a handful of other powerful figures have quietly been working on the technology that will break AI free from its power constraints.
It’s the A.I. Kill Switch.
And it’s time for you to learn all about it – and how to profit.
Regards,
Marin Katusa
Chairman, Katusa Research
Details and Disclosures
Investing can have large potential rewards, but it can also have large potential risks. You must be aware of the risks and be willing to accept them in order to invest in financial instruments, including stocks, options, and futures. Katusa Research makes every best effort in adhering to publishing exemptions and securities laws. By reading this, you agree to all of the following: You understand this to be an expression of opinions and NOT professional advice. You are solely responsible for the use of any content and hold Katusa Research, and all partners, members, and affiliates harmless in any event or claim. If you purchase anything through a link in this email, you should assume that we have an affiliate relationship with the company providing the product or service that you purchase, and that we will be paid in some way. We recommend that you do your own independent research before purchasing anything.