Vince Carter of the US leaps over Frederic Weis of France to dunk (Darren McNamara/Getty Images)

Nvidia dunks on the doubters

CEO Jensen Huang and CFO Colette Kress dismantled most of the recent arguments and bear cases put forward by their naysayers.

Nvidia’s Q3 results and Q4 outlook provided an emphatic statement that speaks for itself: it’s still boom times for the company at the heart of AI.

And while actions (and numbers!) may speak louder than words, the Q3 conference call offered plenty to chew on. During both prepared remarks and the Q&A, CEO Jensen Huang and CFO Colette Kress systematically addressed and dissected most of the recent arguments and bear cases put forward by their naysayers, whether or not those points came up in a question.

There were flexes galore.

Huang spoke not only as the CEO of the world’s largest company, but also as an ambassador for AI, justifying the immense spending that benefits his firm by pointing to the rewards he believes his customers will reap.

This is the kind of conference call that will either have people revisiting some of these quotes and going, “I should have known this would be the first $6 trillion company,” or, “All this hubris was such a big tell that the AI trade was doomed.”

Or, depending on how much of a sense of humor the market gods have, both!

AI bubble?

“There’s been a lot of talk about an AI bubble. From our vantage point, we see something very different… The transition to accelerated computing is foundational and necessary, essential in a post-Moore’s Law era. The transition to generative AI is transformational and necessary, supercharging existing applications and business models. And the transition to agentic and physical AI will be revolutionary, giving rise to new applications, companies, products and services.” –Jensen Huang

Nvidia putting up massive sales numbers is not, in and of itself, proof in favor of or against an AI bubble.

A bubble needs irrationality, whether that be in valuations or earnings. Nvidia came into this report trading at its lowest valuation relative to the S&P 500 since June (a forward price-to-earnings premium of less than 13%). So, the valuation bubble argument isn’t of particular relevance to Nvidia at this juncture. Of greater concern is the potential for an “earnings bubble” — that is, Nvidia is benefiting from spending that ultimately won’t make much sense from the perspective of its customers, and is poised to retrench sharply once they figure that out.
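For readers who want the mechanics, the "premium" here is just the ratio of forward price-to-earnings multiples. A minimal Python sketch, using made-up multiples chosen purely for illustration (the article only tells us the premium was under roughly 13%):

```python
# Forward P/E premium: how much richer one stock's multiple is than a benchmark's.
# Both multiples below are hypothetical placeholders, not actual figures.
nvda_forward_pe = 28.0  # price divided by next-12-months earnings estimate (assumed)
spx_forward_pe = 25.0   # S&P 500 forward P/E (assumed)

premium = nvda_forward_pe / spx_forward_pe - 1
print(f"Forward P/E premium vs. the index: {premium:.1%}")  # 12.0%
```

A premium this thin is unusual for a company growing revenue at Nvidia's pace, which is why the "valuation bubble" framing doesn't bite here.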

Huang makes the argument that there’s no irrationality here because of the applications AI already has as well as the fresh opportunities it unlocks. In short, he’s saying the big spenders have many reasons to spend big.

Success stories

And to justify that, management talked up their customers’ wins.

“RBC is leveraging agentic AI to drive significant analyst productivity, slashing report generation time from hours to minutes. AI and digital twins are helping Unilever accelerate content creation by 2x and cut costs by 50%. And Salesforce’s engineering team has seen at least 30% productivity increase in new code development after adopting Cursor.” –Colette Kress

What really jumped out, however, was the CEO’s shout-outs to Meta. Mark Zuckerberg’s company has been a major laggard in the AI space as of late. Like other hyperscalers, it has a massive capital expenditure budget, but unlike that group, it doesn’t have a cloud business. That is, its spending is more “downstream” in nature than its peers; it relies on AI to make money, not on someone else wanting AI compute to make money.

“Meta’s GEM, a foundation model for ad recommendations trained on large-scale GPU clusters, exemplifies this shift. In Q2, Meta reported over a 5% increase in ad conversions on Instagram and 3% gain on Facebook feed, driven by generative AI-based GEM.” –Jensen Huang

His underlying message: everyone’s AI spending pays dividends, even if the market isn’t rewarding it at this moment.

Burry buried

Michael Burry of “The Big Short” fame recently raised concerns about whether Nvidia’s customers are understating depreciation, arguing that the GPUs they’ve bought should be losing value faster than the balance sheets of these buyers suggest.
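Burry's point rests on simple straight-line accounting: the longer the assumed useful life, the smaller the annual depreciation expense, and the higher reported earnings look. A quick sketch with entirely hypothetical numbers:

```python
# Straight-line depreciation: annual expense = asset cost / assumed useful life.
# The fleet cost and the two useful-life scenarios are hypothetical, for illustration.
gpu_fleet_cost = 10_000_000_000  # $10B of GPUs (assumed)

for useful_life_years in (3, 6):
    annual_expense = gpu_fleet_cost / useful_life_years
    print(f"{useful_life_years}-year life: ${annual_expense / 1e9:.2f}B annual expense")
```

Doubling the assumed life from three years to six cuts the yearly expense in half, which is exactly the accounting lever Burry argues is being pulled too hard.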

Nvidia’s CFO came not to praise Burry, but to bury him:

“The long useful life of Nvidia’s CUDA GPUs is a significant TCO [total cost of ownership] advantage over accelerators. CUDA’s compatibility and our massive installed base extend the life of NVIDIA systems well beyond their original estimated useful life.

For more than two decades, we have optimized the CUDA ecosystem, improving existing workloads, accelerating new ones, and increasing throughput with every software release. Most accelerators without CUDA and Nvidia’s time-tested and versatile architecture became obsolete within a few years as model technologies evolve. Thanks to CUDA, the A100 GPUs we shipped six years ago are still running at full utilization today, powered by vastly improved software stack.” –Colette Kress

As an aside, the idea that Nvidia's chips remain useful for a long time seems, on the surface, better for Nvidia's customers than for its own sales outlook. But it's hard to nitpick that given how much demand is in the pipeline.

Growth runway

One nagging fear about this reporting period was that Nvidia had already given investors the good news: in late October, Huang said the company already had more than $500 billion in orders for its flagship chips through 2026.

When a company is growing this much, this fast, it’s reasonable to ask questions about how much more of an appetite there is out there to be sated.

And Nvidia has answers, both on how big it expects the AI market to get and how much demand it thinks it’ll realize in the near term.

“We believe Nvidia will be the superior choice for the $3 trillion to $4 trillion in annual AI infrastructure build we estimate by the end of the decade.” –Colette Kress

“For example, just even today, our announcements with KSA, and that agreement in itself is 400,000 to 600,000 more GPUs over three years. Anthropic is also net new. So there’s definitely an opportunity for us to have more on top of the $500 billion that we announced.” –Colette Kress

Off the chain

Having all this demand is one thing; meeting it is another. Such worries have been on the rise amid memory chip price hikes and Huang's recent request that TSMC boost production.

Its Blackwell ramp was not necessarily seamless. But...

“Our ecosystem will be ready for a fast Rubin ramp.” –Colette Kress

Nvidia’s answer, in short, is that its supply chains are decades in the making, and everyone wants to work with the leader in the space.

“Our supply chain has been working with us for a very long time. And so in many cases, we’ve secured a lot of supply for ourselves, because obviously, they’re working with the largest company in the world in doing so.” –Jensen Huang

“The supply chain, we have much better visibility and control over it, because obviously we’re incredibly good at managing our supply chain. We have great partners that we’ve worked with for 33 years. And so the supply chain part of it, we’re quite confident. Now looking down our supply chain, we’ve now established partnerships with so many players in land and power and shell, and of course financing. These things — none of these things are easy, but they’re all tractable and they’re all solvable things. And the most important thing that we have to do is do a good job planning. We plan up the supply chain, down the supply chain. We’ve established a whole lot of partners. And so we have a lot of routes to market.” –Jensen Huang

Of marginal concern

Meeting high demand at a time of increasing pressures up and down the supply chain had some analysts worried about the outlook for Nvidia’s profitability.

Expect more of the same, management said.

“Earlier this year, we indicated that through cost improvements and mix that we would exit the year in our gross margins in the mid-70s. We achieved that and getting ready to also execute that in Q4. So now it’s time for us to communicate where are we working right now in terms of next year. Next year, there are input prices that are well known in industries that we need to work through... So we’re taking all of that into account, but we do believe if we look at working again on cost improvements, cycle time, and mix, that we will work to try and hold at our gross margins in the mid-70s.” –Colette Kress

First among unequals

The last question Nvidia faced on the conference call related to the competitive threat posed by custom chips (or ASICs). Google's recently released Gemini 3 model, for instance, was trained using its in-house TPU chips, which offer some cost advantages, particularly on the price of input tokens, as well as power efficiencies.

While Huang was serving as an ambassador for AI, he’s of course first and foremost an advocate for Nvidia’s AI solutions.

He didn’t take on the question of GPUs vs. ASICs directly. Instead, he offered the following arguments in favor of Nvidia-centric systems.

“Back in the Hopper day and the Ampere days, we would build one GPU. That’s the definition of an accelerated AI system. But today, we’ve got to build entire racks, entire three different types of switches, a scale-up, a scale-out, and a scale-across switch. And it takes a lot more than one chip to build a compute node anymore.” –Jensen Huang

Translation: we’re not in Kansas anymore. We’re not just focused on making the best chip, but also the best total package.

And when wrapping his list of the five things that make Nvidia special, Huang said:

“The most important thing, the fifth thing, is if you are a cloud service provider, if you’re a new company like Humain, if you’re a new company like CoreWeave or Nscale or Nebius, or OCI for that matter, the reason why Nvidia is the best platform for you is because our offtake is so diverse. We can help you with offtake. It’s not about just putting a random ASIC into a data center.”

Simply, it’s easier to sell capacity using Nvidia’s architecture because its CUDA software is ubiquitous in high-performance computing.


Insurance against Oracle default becomes favorite AI-bust hedge, Bloomberg reports

Volume in the market for credit default swaps — essentially a kind of insurance against a company defaulting on its debts — on Oracle is surging as the company has supercharged its borrowing to finance its AI ambitions, Bloomberg’s Caleb Mutua reports:

“The price to protect against the company defaulting on its debt for five years tripled in recent months to as high as about 1.11 percentage point a year on Wednesday, or around $111,000 for every $10 million of principal protected, according to ICE Data Services.

As AI skeptics rushed in, trading volume on the company’s CDS ballooned to about $5 billion over the seven weeks ended Nov. 14, according to Barclays Plc credit strategist Jigar Patel. That’s up from a little more than $200 million in the same period last year.”

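The CDS pricing in the quote is a direct conversion from a running spread to an annual dollar cost: cost per year equals the spread times the notional amount protected. A quick sketch of that arithmetic:

```python
# CDS protection cost: annual cost = running spread * notional protected.
# Figures taken from the Bloomberg quote above.
spread = 0.0111          # 1.11 percentage points per year
notional = 10_000_000    # $10 million of principal protected

annual_cost = spread * notional
print(f"${annual_cost:,.0f} per year")  # $111,000, matching the quote
```

The tripling of that spread is the market's way of saying the perceived odds of an Oracle default, while still low, have risen meaningfully.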


Cipher Mining surges on additional AI hosting deal

Bitcoin miner turned AI compute power provider Cipher Mining jumped early Thursday after announcing a deal that fully leases its Barber Lake data center in Colorado City, Texas.

The deal — which is also giving a lift to IREN, another miner turned compute provider — is an expansion of a previous agreement with Fluidstack, a UK-based provider of GPU-based cloud networks. The new deal amounts to roughly $830 million in additional revenue over 10 years, Cipher says.

The market clearly loves it. But it’s worth pointing out that this agreement is a pretty good example of the byzantine financial structures that are increasingly accompanying plans for many billions of dollars of spending on the AI boom.

For example, Cipher also announced Thursday that it would be borrowing $333 million to finance an expansion of that Barber Lake data center through a private placement of debt.

That offering will be secured, in part, by the warrants Google received to purchase Cipher common stock worth roughly 5.4% of the company. (Those warrants, by the way, look a lot more valuable today, with Cipher Mining up double digits.) Google is also backstopping Fluidstack's borrowing plans to finance its build-out to the tune of $1.4 billion.

For now, this makes financial sense. Alphabet — one of the most successful companies on the planet — needs the computing power to compete in the AI race. And the quickest way to get that capacity is to essentially cosign leases for the smaller companies taking the lead in that build-out, thereby lowering development costs and helping to bring projects into existence.

But in this deal alone, things get awfully complicated awfully quickly, as Alphabet is essentially the prime customer of, an important debt guarantor for, and potentially a significant owner in Cipher Mining, once it exercises the warrants into an ownership stake of more than 5%.

This isn’t, on its face, a terrible thing. There are precedents for circular funding relationships in industries like aerospace, as it developed from the 1920s to the 1950s.

But financial complexity does have a history of essentially hiding the level and locus of financial risks a system is building up, essentially during periods of heady optimism.



Odds of December Fed cut creep higher after unemployment rate unexpectedly rises in September

The September jobs report was a mixed bag: much better job growth than anticipated, but the unemployment rate unexpectedly edged higher.

The release of this data, which was delayed by the government shutdown, showed that nonfarm payrolls grew 119,000 (compared to the expected 51,000), but the unemployment rate crept up to 4.4%, while economists thought it would remain steady at 4.3%.

Event contracts show that the likelihood of the US central bank standing pat in December moderated to about 65% from around 75% prior to the release.

(Event contracts are offered through Robinhood Derivatives, LLC — probabilities referenced or sourced from KalshiEx LLC or ForecastEx LLC.)

Job growth for the prior two months was revised lower by 33,000.

The market-implied odds of a Fed cut in December tanked on Wednesday after the Bureau of Labor Statistics said that the updated employment statistics through November wouldn’t be published until December 16 — that is, the week after the US central bank’s last meeting of the year.

During the press conference that followed the October decision to lower rates by 25 basis points, Fed Chair Jerome Powell said that a dearth of fresh data “could be an argument in favor of caution about moving,” adding that a rate cut in December was “far from” a foregone conclusion.

Fedspeak since that October rate cut has generally tilted hawkish. Some voting members like Boston Fed President Susan Collins and Kansas City President Jeffrey Schmid (who dissented from the last cut) have signaled that they are unlikely to support an interest rate cut in December. Fed Governors Chris Waller and Stephen Miran have publicly endorsed another rate reduction, while other officials have yet to take a definitive stance. The minutes from the October meeting said that “many participants” thought “it would likely be appropriate to keep the target rate unchanged for the rest of the year.”

Walmart cart

Walmart beats Wall Street estimates, hikes sales forecast

The retail giant beat on earnings and revenue while also raising its sales forecast.


Sherwood Media, LLC produces fresh and unique perspectives on topical financial news and is a fully owned subsidiary of Robinhood Markets, Inc., and any views expressed here do not necessarily reflect the views of any other Robinhood affiliate, including Robinhood Markets, Inc., Robinhood Financial LLC, Robinhood Securities, LLC, Robinhood Crypto, LLC, or Robinhood Money, LLC.