Vince Carter of the US leaps over Frederic Weis of France to dunk (Darren McNamara/Getty Images)

Nvidia dunks on the doubters

CEO Jensen Huang and CFO Colette Kress dismantled most of the recent arguments and bear cases put forward by their naysayers.

Nvidia’s Q3 results and Q4 outlook provided an emphatic statement that speaks for itself: it’s still boom times for the company at the heart of AI.

And while actions (and numbers!) may speak louder than words, the Q3 conference call offered plenty to chew on. During both prepared remarks and the Q&A, CEO Jensen Huang and CFO Colette Kress systematically addressed and dissected most of the recent arguments and bear cases put forward by their naysayers — whether they were brought up in a question or not.

There were flexes galore.

Huang spoke not only as the CEO of the world’s largest company, but also as an ambassador for AI, justifying the immense spending that benefits his firm by pointing to the rewards he believes his customers will reap.

This is the kind of conference call that will either have people revisiting some of these quotes and going, “I should have known this would be the first $6 trillion company,” or, “All this hubris was such a big tell that the AI trade was doomed.”

Or, depending on how much of a sense of humor the market gods have, both!

AI bubble?

“There’s been a lot of talk about an AI bubble. From our vantage point, we see something very different… The transition to accelerated computing is foundational and necessary, essential in a post-Moore’s Law era. The transition to generative AI is transformational and necessary, supercharging existing applications and business models. And the transition to agentic and physical AI will be revolutionary, giving rise to new applications, companies, products and services.” –Jensen Huang

Nvidia putting up massive sales numbers is not, in and of itself, proof in favor of or against an AI bubble.

A bubble needs irrationality, whether that be in valuations or earnings. Nvidia came into this report trading at its lowest valuation relative to the S&P 500 since June (a forward price-to-earnings premium of less than 13%). So, the valuation bubble argument isn’t of particular relevance to Nvidia at this juncture. Of greater concern is the potential for an “earnings bubble” — that is, that Nvidia is benefiting from spending that ultimately won’t make much sense from the perspective of its customers, spending that is poised to retrench sharply once they figure that out.

Huang makes the argument that there’s no irrationality here because of the applications AI already has as well as the fresh opportunities it unlocks. In short, he’s saying the big spenders have many reasons to spend big.

Success stories

And to justify that, management talked up their customers’ wins.

“RBC is leveraging agentic AI to drive significant analyst productivity, slashing report generation time from hours to minutes. AI and digital twins are helping Unilever accelerate content creation by 2x and cut costs by 50%. And Salesforce’s engineering team has seen at least 30% productivity increase in new code development after adopting Cursor.” –Colette Kress

What really jumped out, however, was the CEO’s shout-outs to Meta. Mark Zuckerberg’s company has been a major laggard in the AI space as of late. Like other hyperscalers, it has a massive capital expenditure budget, but unlike that group, it doesn’t have a cloud business. That is, its spending is more “downstream” in nature than its peers; it relies on AI to make money, not on someone else wanting AI compute to make money.

“Meta’s GEM, a foundation model for ad recommendations trained on large-scale GPU clusters, exemplifies this shift. In Q2, Meta reported over a 5% increase in ad conversions on Instagram and 3% gain on Facebook feed, driven by generative AI-based GEM.” –Jensen Huang

His underlying message: everyone’s AI spending pays dividends, even if the market isn’t rewarding it at this moment.

Burry buried

Michael Burry of “The Big Short” fame recently raised concerns about whether Nvidia’s customers are understating depreciation, arguing that the GPUs they’ve bought should be losing value faster than the balance sheets of these buyers suggest.

Nvidia’s CFO came not to praise Burry, but to bury him:

“The long useful life of Nvidia’s CUDA GPUs is a significant TCO [total cost of ownership] advantage over accelerators. CUDA’s compatibility and our massive installed base extend the life of NVIDIA systems well beyond their original estimated useful life.

For more than two decades, we have optimized the CUDA ecosystem, improving existing workloads, accelerating new ones, and increasing throughput with every software release. Most accelerators without CUDA and Nvidia’s time-tested and versatile architecture became obsolete within a few years as model technologies evolve. Thanks to CUDA, the A100 GPUs we shipped six years ago are still running at full utilization today, powered by vastly improved software stack.” –Colette Kress

As an aside, the idea that Nvidia’s chips remain very useful for a long time is something that, on the surface, seems much better for Nvidia’s customers than for its own sales outlook — but it’s really hard to nitpick that given how much demand is in the pipeline.

Growth runway

One nagging fear about this reporting period was that Nvidia had already given investors the good news: in late October, Huang said the company already had more than $500 billion in orders for its flagship chips through 2026.

When a company is growing this much, this fast, it’s reasonable to ask questions about how much more of an appetite there is out there to be sated.

And Nvidia has answers, both on how big it expects the AI market to get and how much demand it thinks it’ll realize in the near term.

“We believe Nvidia will be the superior choice for the $3 trillion to $4 trillion in annual AI infrastructure build we estimate by the end of the decade.” –Colette Kress

“For example, just even today, our announcements with KSA, and that agreement in itself is 400,000 to 600,000 more GPUs over three years. Anthropic is also net new. So there’s definitely an opportunity for us to have more on top of the $500 billion that we announced.” –Colette Kress

Off the chain

Having all this demand is one thing; meeting it is another. Such worries have been in ascendance amid memory chip price hikes and Huang recently asking TSMC to boost production.

Nvidia’s Blackwell ramp was not necessarily seamless. But...

“Our ecosystem will be ready for a fast Rubin ramp.” –Colette Kress

Nvidia’s answer, in short, is that its supply chains are decades in the making, and everyone wants to work with the leader in the space.

“Our supply chain has been working with us for a very long time. And so in many cases, we’ve secured a lot of supply for ourselves, because obviously, they’re working with the largest company in the world in doing so.” –Jensen Huang

“The supply chain, we have much better visibility and control over it, because obviously we’re incredibly good at managing our supply chain. We have great partners that we’ve worked with for 33 years. And so the supply chain part of it, we’re quite confident. Now looking down our supply chain, we’ve now established partnerships with so many players in land and power and shell, and of course financing. These things — none of these things are easy, but they’re all tractable and they’re all solvable things. And the most important thing that we have to do is do a good job planning. We plan up the supply chain, down the supply chain. We’ve established a whole lot of partners. And so we have a lot of routes to market.” –Jensen Huang

Of marginal concern

Meeting high demand at a time of increasing pressures up and down the supply chain had some analysts worried about the outlook for Nvidia’s profitability.

Expect more of the same, management said.

“Earlier this year, we indicated that through cost improvements and mix that we would exit the year in our gross margins in the mid-70s. We achieved that and getting ready to also execute that in Q4. So now it’s time for us to communicate where are we working right now in terms of next year. Next year, there are input prices that are well known in industries that we need to work through... So we’re taking all of that into account, but we do believe if we look at working again on cost improvements, cycle time, and mix, that we will work to try and hold at our gross margins in the mid-70s.” –Colette Kress

First among unequals

The last question Nvidia faced on the conference call related to the competitive threat posed by custom chips (or ASICs). Google’s recently released Gemini 3 model, for instance, was trained using its in-house TPU chips, and offers some cost advantages, particularly when it comes to the price of inputting tokens, as well as providing power efficiencies.

While Huang was serving as an ambassador for AI, he’s of course first and foremost an advocate for Nvidia’s AI solutions.

He didn’t take on the question of GPUs vs. ASICs directly. Instead, he offered the following arguments in favor of Nvidia-centric systems.

“Back in the Hopper day and the Ampere days, we would build one GPU. That’s the definition of an accelerated AI system. But today, we’ve got to build entire racks, entire three different types of switches, a scale-up, a scale-out, and a scale-across switch. And it takes a lot more than one chip to build a compute node anymore.” –Jensen Huang

Translation: we’re not in Kansas anymore. We’re not just focused on making the best chip, but also the best total package.

And when wrapping his list of the five things that make Nvidia special, Huang said:

“The most important thing, the fifth thing, is if you are a cloud service provider, if you’re a new company like Humain, if you’re a new company like CoreWeave or Nscale or Nebius, or OCI for that matter, the reason why Nvidia is the best platform for you is because our offtake is so diverse. We can help you with offtake. It’s not about just putting a random ASIC into a data center.”

Simply, it’s easier to sell capacity using Nvidia’s architecture because its CUDA software is ubiquitous in high-performance computing.

markets

Broadcom jumps after locking down Google as a customer for future generations of TPUs

Shares of Broadcom rose more than 3% in postmarket trading on Monday after its most important customer doubled down on the custom chip specialist’s ability to produce its most valuable commodity.

In a filing, Broadcom said that it entered into a long-term agreement with Google to supply future generations of TPUs (custom AI accelerator chips) as well as a supply assurance agreement for networking and other equipment “through up to 2031.”

Bernstein analyst Stacy Rasgon indicated that Broadcom’s investor relations team told him that Google’s long-term agreement “has revenue commitments that go along with it through the timeline.”

Gemini 3 launched to rave reviews in November. The model was trained on TPUs co-developed by Broadcom and Google.

The same Monday filing showed that Broadcom, Google, and Anthropic expanded a partnership that will see the Claude developer access 3.5 gigawatts of AI compute capacity beginning in 2027, powered by the TPUs co-designed by the custom chip specialist and the search giant.

Bernstein’s Rasgon added that Broadcom’s team suggested these 3.5 gigawatts are “only part of a larger partnership over time.” He thinks Broadcom’s fiscal year 2027 guidance for AI revenues of $100 billion “is looking increasingly light” thanks to this news.

For what it’s worth, the enhanced pact with Anthropic hinges upon the firm’s ability to afford AI compute. But based on the insane trajectory of its run-rate revenue, that may not be a big hurdle to clear.

“Broadcom’s expanded agreements with Google and Anthropic add rare multi-year visibility, reinforcing a $40-$50 billion AI revenue opportunity tied to Anthropic’s 3.5 gigawatt deployment starting in 2027, while building on the previously disclosed 1GW ($10 billion) starting in 2H,” wrote Bloomberg Intelligence analysts Kunjan Sobhani and Oscar Hernandez Tejada.

Health insurers surge after Medicare agrees to pay 2.48% more in 2027, a bigger-than-expected boost

Health insurance stocks are surging after the Centers for Medicare & Medicaid Services said it plans to boost Medicare Advantage and Part D payments by 2.48% in calendar year 2027.

The likes of CVS, Humana, UnitedHealth, Molina Healthcare, Oscar Health, and Elevance Health are gaining in postmarket trading.

Wall Street analysts had anticipated that rates for 2027 would go up between roughly 1% and 1.5%.

These stocks had gotten crushed in late January when the Trump administration proposed relatively flat federal payment rates.

Insurance companies that provide government-sponsored plans, like Medicare Advantage, faced headwinds from higher-than-expected costs in 2025.

Iran war winners Dow, LyondellBasell downgraded by Bank of America

Dow, Inc. and LyondellBasell — two petrochemicals stocks that surged as markets priced in shortages due to the closure of the Strait of Hormuz — should decline as investors focus on the long-term outlook for normalized petrochemical prices once the war resolves, Bank of America analysts wrote in a note downgrading the two stocks Monday.

BofA moved its rating on the shares from “neutral” to “underperform,” writing:

“Over time, as chemical markets normalize, we expect 1) investor focus to shift back to ‘normal’ or ‘sustainable’ earnings profiles and 2) the conflict to resolve without material asset rationalization, both of which likely bias shares lower over the next twelve months.”

Analysts also lowered their stance on another petrochemicals and building materials stock, Westlake, to “neutral” from “buy.”

While cutting those ratings, BofA actually raised its more near-term price targets for the shares. It upped LyondellBasell to $68 from $55, and Dow to $35 from $31.

But those price targets still imply declines of more than 10% compared to where both shares were trading late Monday morning. Both stocks are up roughly 30% since the start of the Iran war.

Sherwood Media, LLC produces fresh and unique perspectives on topical financial news and is a fully owned subsidiary of Robinhood Markets, Inc., and any views expressed here do not necessarily reflect the views of any other Robinhood affiliate, including Robinhood Markets, Inc., Robinhood Financial LLC, Robinhood Securities, LLC, Robinhood Crypto, LLC, Robinhood Derivatives, LLC, or Robinhood Money, LLC. Futures and event contracts are offered through Robinhood Derivatives, LLC.