Vince Carter of the US leaps over Frederic Weis of France to dunk (Darren McNamara/Getty Images)

Nvidia dunks on the doubters

CEO Jensen Huang and CFO Colette Kress dismantled most of the recent arguments and bear cases put forward by their naysayers.

Nvidia’s Q3 results and Q4 outlook provided an emphatic statement that speaks for itself: it’s still boom times for the company at the heart of AI.

And while actions (and numbers!) may speak louder than words, the Q3 conference call offered plenty to chew on. During both the prepared remarks and the Q&A, CEO Jensen Huang and CFO Colette Kress systematically addressed and dissected most of the recent arguments and bear cases put forward by their naysayers — whether those arguments were brought up in a question or not.

There were flexes galore.

Huang spoke not only as the CEO of the world’s largest company, but also as an ambassador for AI, justifying the immense spending that benefits his firm by pointing to the rewards he believes his customers will reap.

This is the kind of conference call that will either have people revisiting some of these quotes and going, “I should have known this would be the first $6 trillion company,” or, “All this hubris was such a big tell that the AI trade was doomed.”

Or, depending on how much of a sense of humor the market gods have, both!

AI bubble?

“There’s been a lot of talk about an AI bubble. From our vantage point, we see something very different… The transition to accelerated computing is foundational and necessary, essential in a post-Moore’s Law era. The transition to generative AI is transformational and necessary, supercharging existing applications and business models. And the transition to agentic and physical AI will be revolutionary, giving rise to new applications, companies, products and services.” –Jensen Huang

Nvidia putting up massive sales numbers is not, in and of itself, evidence for or against an AI bubble.

A bubble needs irrationality, whether in valuations or in earnings. Nvidia came into this report trading at its lowest valuation relative to the S&P 500 since June (a forward price-to-earnings premium of less than 13%). So the valuation-bubble argument isn’t of particular relevance to Nvidia at this juncture. Of greater concern is the potential for an “earnings bubble” — that is, the risk that Nvidia is benefiting from spending that ultimately won’t make much sense from its customers’ perspective, and that this spending is poised to retrench sharply once they figure that out.
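For readers who want the mechanics behind that premium figure, here’s a minimal sketch of the relative-valuation math; the numbers below are illustrative placeholders, not Nvidia’s or the index’s actual multiples.

```python
# Minimal sketch of the forward P/E premium calculation referenced above.
# All inputs are illustrative placeholders, not actual market data.

def forward_pe(price: float, next_12_month_eps: float) -> float:
    """Forward price-to-earnings ratio: price divided by expected earnings per share."""
    return price / next_12_month_eps

def premium_vs_index(stock_fwd_pe: float, index_fwd_pe: float) -> float:
    """How much richer the stock trades versus the index, expressed as a fraction."""
    return stock_fwd_pe / index_fwd_pe - 1

# Hypothetical example: a stock at 25.0x forward earnings vs. an index at 22.3x
stock_pe = forward_pe(price=180.0, next_12_month_eps=7.2)  # 25.0x
index_pe = 22.3
print(f"Premium vs. index: {premium_vs_index(stock_pe, index_pe):.1%}")  # ~12.1%, i.e. under 13%
```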

Huang makes the argument that there’s no irrationality here because of the applications AI already has as well as the fresh opportunities it unlocks. In short, he’s saying the big spenders have many reasons to spend big.

Success stories

And to justify that, management talked up their customers’ wins.

“RBC is leveraging agentic AI to drive significant analyst productivity, slashing report generation time from hours to minutes. AI and digital twins are helping Unilever accelerate content creation by 2x and cut costs by 50%. And Salesforce’s engineering team has seen at least 30% productivity increase in new code development after adopting Cursor.” –Colette Kress

What really jumped out, however, was the CEO’s shout-outs to Meta. Mark Zuckerberg’s company has been a major laggard in the AI space of late. Like other hyperscalers, it has a massive capital expenditure budget, but unlike that group, it doesn’t have a cloud business. That is, its spending is more “downstream” in nature than its peers’; it relies on AI to make money, not on someone else wanting AI compute to make money.

“Meta’s GEM, a foundation model for ad recommendations trained on large-scale GPU clusters, exemplifies this shift. In Q2, Meta reported over a 5% increase in ad conversions on Instagram and 3% gain on Facebook feed, driven by generative AI-based GEM.” –Jensen Huang

His underlying message: everyone’s AI spending pays dividends, even if the market isn’t rewarding it at this moment.

Burry buried

Michael Burry of “The Big Short” fame recently raised concerns about whether Nvidia’s customers are understating depreciation, arguing that the GPUs they’ve bought should be losing value faster than the balance sheets of these buyers suggest.
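For context, the accounting question largely comes down to straight-line depreciation: the longer the assumed useful life, the smaller the annual expense hitting the income statement. Here’s a rough sketch with hypothetical numbers, not any company’s actual figures.

```python
# Illustrative only: how the assumed useful life changes annual depreciation
# under straight-line accounting. Numbers are hypothetical, not any company's figures.
gpu_fleet_cost = 3_000_000_000  # hypothetical GPU fleet purchase, in US dollars

for useful_life_years in (3, 6):
    annual_depreciation = gpu_fleet_cost / useful_life_years
    print(f"{useful_life_years}-year life: ${annual_depreciation / 1e9:.1f}B expensed per year")

# Stretching the assumed life from 3 to 6 years halves the annual expense,
# which flatters near-term earnings. That gap is the crux of the understatement charge.
```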

Nvidia’s CFO came not to praise Burry, but to bury him:

“The long useful life of Nvidia’s CUDA GPUs is a significant TCO [total cost of ownership] advantage over accelerators. CUDA’s compatibility and our massive installed base extend the life of NVIDIA systems well beyond their original estimated useful life.

For more than two decades, we have optimized the CUDA ecosystem, improving existing workloads, accelerating new ones, and increasing throughput with every software release. Most accelerators without CUDA and Nvidia’s time-tested and versatile architecture became obsolete within a few years as model technologies evolve. Thanks to CUDA, the A100 GPUs we shipped six years ago are still running at full utilization today, powered by vastly improved software stack.” –Colette Kress

As an aside, the idea that Nvidia’s chips remain very useful for a long time seems, on the surface, like better news for Nvidia’s customers than for its own sales outlook, since longer-lasting chips need replacing less often. But it’s really hard to nitpick that given how much demand is in the pipeline.

Growth runway

One nagging fear heading into this report was that Nvidia had already given investors all the good news: in late October, Huang said the company had more than $500 billion in orders for its flagship chips through 2026.

When a company is growing this much, this fast, it’s reasonable to ask questions about how much more of an appetite there is out there to be sated.

And Nvidia has answers, both on how big it expects the AI market to get and how much demand it thinks it’ll realize in the near term.

“We believe Nvidia will be the superior choice for the $3 trillion to $4 trillion in annual AI infrastructure build we estimate by the end of the decade.” –Colette Kress

“For example, just even today, our announcements with KSA, and that agreement in itself is 400,000 to 600,000 more GPUs over three years. Anthropic is also net new. So there’s definitely an opportunity for us to have more on top of the $500 billion that we announced.” –Colette Kress

Off the chain

Having all this demand is one thing; meeting it is another. Such worries have been on the rise amid memory chip price hikes and Huang’s recent request that TSMC boost production.

Nvidia’s Blackwell ramp was not exactly seamless. But...

“Our ecosystem will be ready for a fast Rubin ramp.” –Colette Kress

Nvidia’s answer, in short, is that its supply chains are decades in the making, and everyone wants to work with the leader in the space.

“Our supply chain has been working with us for a very long time. And so in many cases, we’ve secured a lot of supply for ourselves, because obviously, they’re working with the largest company in the world in doing so.” –Jensen Huang

“The supply chain, we have much better visibility and control over it, because obviously we’re incredibly good at managing our supply chain. We have great partners that we’ve worked with for 33 years. And so the supply chain part of it, we’re quite confident. Now looking down our supply chain, we’ve now established partnerships with so many players in land and power and shell, and of course financing. These things — none of these things are easy, but they’re all tractable and they’re all solvable things. And the most important thing that we have to do is do a good job planning. We plan up the supply chain, down the supply chain. We’ve established a whole lot of partners. And so we have a lot of routes to market.” –Jensen Huang

Of marginal concern

Meeting sky-high demand at a time of rising cost pressures up and down the supply chain had some analysts worried about the outlook for Nvidia’s profitability.

Expect more of the same, management said.

“Earlier this year, we indicated that through cost improvements and mix that we would exit the year in our gross margins in the mid-70s. We achieved that and getting ready to also execute that in Q4. So now it’s time for us to communicate where are we working right now in terms of next year. Next year, there are input prices that are well known in industries that we need to work through... So we’re taking all of that into account, but we do believe if we look at working again on cost improvements, cycle time, and mix, that we will work to try and hold at our gross margins in the mid-70s.” –Colette Kress

First among unequals

The last question Nvidia faced on the conference call related to the competitive threat posed by custom chips (ASICs). Google’s recently released Gemini 3 model, for instance, was trained using its in-house TPU chips, and offers some cost advantages, particularly on the price of input tokens, as well as power efficiencies.

While Huang may serve as an ambassador for AI, he’s of course first and foremost an advocate for Nvidia’s AI solutions.

He didn’t take on the question of GPUs vs. ASICs directly. Instead, he offered the following arguments in favor of Nvidia-centric systems.

“Back in the Hopper day and the Ampere days, we would build one GPU. That’s the definition of an accelerated AI system. But today, we’ve got to build entire racks, entire three different types of switches, a scale-up, a scale-out, and a scale-across switch. And it takes a lot more than one chip to build a compute node anymore.” –Jensen Huang

Translation: we’re not in Kansas anymore. We’re not just focused on making the best chip, but also the best total package.

And when wrapping up his list of the five things that make Nvidia special, Huang said:

“The most important thing, the fifth thing, is if you are a cloud service provider, if you’re a new company like Humain, if you’re a new company like CoreWeave or Nscale or Nebius, or OCI for that matter, the reason why Nvidia is the best platform for you is because our offtake is so diverse. We can help you with offtake. It’s not about just putting a random ASIC into a data center.”

Put simply: it’s easier to sell capacity built on Nvidia’s architecture because its CUDA software is ubiquitous in high-performance computing.

More Markets

Luke Kawa

Trump Media jumps after announcing plans to distribute digital tokens to shareholders

Trump Media & Technology Group is jumping in premarket trading after the owner of Truth Social announced plans to distribute a digital token to shareholders in partnership with Crypto.com (which is also its partner in the event contracts space).

Shareholders will receive one token per share owned, according to the press release. The token can give holders access to “various rewards” that “may include benefits or discounts tied to Trump Media products.”

This move is a little closer to home for Trump Media, which has effectively been operating as a digital asset treasury, than its recent merger with fusion energy company TAE Technologies, a deal that will radically transform the entity.

Luke Kawa

Nvidia, TSMC rise as the world’s most valuable company reportedly asks for more chips to meet Chinese demand

Nvidia and TSMC are modestly higher in premarket trading Wednesday after Reuters reported that the chip designer asked the Taiwanese chip manufacturing giant to boost production of its H200 AI chips.

Earlier this month, US President Donald Trump said that Nvidia would be able to ship the best-performing processors from its Hopper generation to China, with 25% of the proceeds going to the US government. Per the report, Chinese companies have already placed orders for more than 2 million of these chips in 2026, roughly triple the 700,000 chips Nvidia has in reserve. Reuters added that Nvidia is planning on selling these chips at around $27,000 apiece, which would amount to a more than $54 billion boost in revenues if it’s able to realize all this reported demand. The ability to do so will also depend on Chinese regulators green-lighting purchases. The chip designer’s success in 2025 has come despite being effectively shut out of the Chinese AI market for the year.
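The back-of-the-envelope math behind those headline figures, using only the numbers cited from the Reuters report, looks roughly like this:

```python
# Back-of-the-envelope math using only the figures cited from the Reuters report above.
reported_orders = 2_000_000    # "more than 2 million" H200 orders reportedly placed for 2026
reported_inventory = 700_000   # H200 chips Nvidia reportedly has in reserve
reported_price = 27_000        # reported selling price per chip, in US dollars

potential_revenue = reported_orders * reported_price
print(f"Potential revenue: ${potential_revenue / 1e9:.0f}B+")                # $54B+
print(f"Orders vs. inventory: {reported_orders / reported_inventory:.1f}x")  # ~2.9x, roughly triple
```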

The outlet previously reported that Nvidia plans to begin sending these GPUs to China before the Lunar New Year holiday (which starts on February 17, 2026), and that Chinese companies are eagerly awaiting the opportunity to get their hands on these powerful chips.

During Nvidia’s Q3 conference call, which came prior to the Trump announcement, CEO Jensen Huang expressed confidence in his ability to meet demand for the company’s GPUs going forward, saying, “In many cases, we’ve secured a lot of supply for ourselves, because obviously, they’re working with the largest company in the world in doing so.”

Huang’s relationship with critical supply chain partner TSMC appears to benefit from a personal touch: during his November visit to Taiwan, he met with the chipmaker’s CEO, CC Wei, as well as other execs over hot pot, and called TSMC “the pride of the world” the next day.

Luke Kawa

Nike rises after CEO Elliott Hill purchases $1 million in company stock

Nike is sprinting to the finish line in 2025, up more than 2% in premarket trading after a filing released following Tuesday’s close showed that CEO Elliott Hill purchased a little over $1 million in company stock on December 29.

The news comes on the heels of last week’s revelation that Apple CEO and board member Tim Cook bought nearly $3 million in Nike stock.

Hill returned to the company to replace former CEO John Donahoe in October 2024. This is the first open market purchase of Nike stock he has made during his tenure atop the company.

Shares of the sports apparel maker are still down about 17% year to date.

