Markets
Vince Carter of the US leaps over Frederic Weis of France to dunk (Darren McNamara/Getty Images)

Nvidia dunks on the doubters

CEO Jensen Huang and CFO Colette Kress dismantled most of the recent arguments and bear cases put forward by their naysayers.

Nvidia’s Q3 results and Q4 outlook provided an emphatic statement that speaks for itself: it’s still boom times for the company at the heart of AI.

And while actions (and numbers!) may speak louder than words, the Q3 conference call offered plenty to chew on. In both the prepared remarks and the Q&A, CEO Jensen Huang and CFO Colette Kress systematically addressed and dissected most of the recent arguments and bear cases put forward by their naysayers, whether or not they were brought up in a question.

There were flexes galore.

Huang spoke not only as the CEO of the world’s largest company, but also as an ambassador for AI, justifying the immense spending that benefits his firm by pointing to the rewards he believes his customers will reap.

This is the kind of conference call that will either have people revisiting some of these quotes and going, “I should have known this would be the first $6 trillion company,” or, “All this hubris was such a big tell that the AI trade was doomed.”

Or, depending on how much of a sense of humor the market gods have, both!

AI bubble?

“There’s been a lot of talk about an AI bubble. From our vantage point, we see something very different… The transition to accelerated computing is foundational and necessary, essential in a post-Moore’s Law era. The transition to generative AI is transformational and necessary, supercharging existing applications and business models. And the transition to agentic and physical AI will be revolutionary, giving rise to new applications, companies, products and services.” –Jensen Huang

Nvidia putting up massive sales numbers is not, in and of itself, proof in favor of or against an AI bubble.

A bubble needs irrationality, whether in valuations or in earnings. Nvidia came into this report trading at its lowest valuation relative to the S&P 500 since June (a forward price-to-earnings premium of less than 13%), so the valuation bubble argument isn't of particular relevance to Nvidia at this juncture. The greater concern is the potential for an “earnings bubble”: the possibility that Nvidia is benefiting from spending that ultimately won't make much sense from its customers' perspective, spending that is poised to retrench sharply once they figure that out.
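For readers who want the mechanics, here is a minimal sketch of how that relative-valuation premium is computed; the forward multiples below are hypothetical placeholders, not Nvidia's or the index's actual figures.

```python
# A minimal sketch of the relative-valuation math, using hypothetical
# forward P/E multiples (illustrative placeholders, not reported figures).
nvda_forward_pe = 28.0  # hypothetical Nvidia forward price-to-earnings ratio
spx_forward_pe = 25.0   # hypothetical S&P 500 forward price-to-earnings ratio

# The "premium" is how much more investors pay per dollar of expected earnings.
premium = nvda_forward_pe / spx_forward_pe - 1
print(f"Forward P/E premium vs. the S&P 500: {premium:.1%}")  # 12.0% here, i.e. "less than 13%"
```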

Huang makes the argument that there’s no irrationality here because of the applications AI already has as well as the fresh opportunities it unlocks. In short, he’s saying the big spenders have many reasons to spend big.

Success stories

And to justify that, management talked up their customers’ wins.

“RBC is leveraging agentic AI to drive significant analyst productivity, slashing report generation time from hours to minutes. AI and digital twins are helping Unilever accelerate content creation by 2x and cut costs by 50%. And Salesforce’s engineering team has seen at least 30% productivity increase in new code development after adopting Cursor.” –Colette Kress

What really jumped out, however, was the CEO's shout-outs to Meta. Mark Zuckerberg's company has been a major laggard in the AI space as of late. Like other hyperscalers, it has a massive capital expenditure budget, but unlike that group, it doesn't have a cloud business. That is, its spending is more “downstream” in nature than that of its peers: it relies on AI to make money, not on someone else wanting AI compute to make money.

“Meta’s GEM, a foundation model for ad recommendations trained on large-scale GPU clusters, exemplifies this shift. In Q2, Meta reported over a 5% increase in ad conversions on Instagram and 3% gain on Facebook feed, driven by generative AI-based GEM.” –Jensen Huang

His underlying message: everyone’s AI spending pays dividends, even if the market isn’t rewarding it at this moment.

Burry buried

Michael Burry of “The Big Short” fame recently raised concerns about whether Nvidia’s customers are understating depreciation, arguing that the GPUs they’ve bought should be losing value faster than the balance sheets of these buyers suggest.

Nvidia’s CFO came not to praise Burry, but to bury him:

“The long useful life of Nvidia’s CUDA GPUs is a significant TCO [total cost of ownership] advantage over accelerators. CUDA’s compatibility and our massive installed base extend the life of NVIDIA systems well beyond their original estimated useful life.

For more than two decades, we have optimized the CUDA ecosystem, improving existing workloads, accelerating new ones, and increasing throughput with every software release. Most accelerators without CUDA and Nvidia’s time-tested and versatile architecture became obsolete within a few years as model technologies evolve. Thanks to CUDA, the A100 GPUs we shipped six years ago are still running at full utilization today, powered by vastly improved software stack.” –Colette Kress

As an aside, the idea that Nvidia's chips remain very useful for a long time is something that, on the surface, seems much better for Nvidia's customers than for Nvidia's own sales outlook (long-lived chips need replacing less often), but it's really hard to nitpick that given how much demand is in the pipeline.
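To make the accounting stakes concrete, here is a minimal sketch of straight-line depreciation under two assumed useful lives; the dollar figure and life spans are hypothetical, not any hyperscaler's actual disclosures. The longer the assumed life, the smaller the annual expense hitting reported earnings, which is why Burry argues profits could be overstated and why Kress's longevity rebuttal matters.

```python
# A minimal sketch of the depreciation math behind the debate, using
# straight-line depreciation and hypothetical figures (not any company's actuals).

def annual_straight_line_depreciation(cost: float, useful_life_years: int) -> float:
    """Spread an asset's cost evenly over its assumed useful life."""
    return cost / useful_life_years

gpu_spend = 10_000_000_000  # hypothetical $10 billion of GPU purchases

short_life = annual_straight_line_depreciation(gpu_spend, 3)  # faster write-down, per Burry's concern
long_life = annual_straight_line_depreciation(gpu_spend, 6)   # longer life, per the CUDA-longevity argument

# A longer assumed life halves the annual expense in this example.
print(f"3-year useful life: ${short_life:,.0f} of depreciation per year")
print(f"6-year useful life: ${long_life:,.0f} of depreciation per year")
```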

Growth runway

One nagging fear about this reporting period was that Nvidia had already given investors the good news: in late October, Huang said the company already had more than $500 billion in orders for its flagship chips through 2026.

When a company is growing this much, this fast, it’s reasonable to ask questions about how much more of an appetite there is out there to be sated.

And Nvidia has answers, both on how big it expects the AI market to get and how much demand it thinks it’ll realize in the near term.

“We believe Nvidia will be the superior choice for the $3 trillion to $4 trillion in annual AI infrastructure build we estimate by the end of the decade.” –Colette Kress

“For example, just even today, our announcements with KSA, and that agreement in itself is 400,000 to 600,000 more GPUs over three years. Anthropic is also net new. So there’s definitely an opportunity for us to have more on top of the $500 billion that we announced.” –Colette Kress

Off the chain

Having all this demand is one thing; meeting it is another. Such worries have been on the rise lately, with memory chip price hikes abounding and Huang recently asking TSMC to boost production.

Nvidia's Blackwell ramp was not necessarily seamless. But...

“Our ecosystem will be ready for a fast Rubin ramp.” –Colette Kress

Nvidia’s answer, in short, is that its supply chains are decades in the making, and everyone wants to work with the leader in the space.

“Our supply chain has been working with us for a very long time. And so in many cases, we’ve secured a lot of supply for ourselves, because obviously, they’re working with the largest company in the world in doing so.” –Jensen Huang

“The supply chain, we have much better visibility and control over it, because obviously we’re incredibly good at managing our supply chain. We have great partners that we’ve worked with for 33 years. And so the supply chain part of it, we’re quite confident. Now looking down our supply chain, we’ve now established partnerships with so many players in land and power and shell, and of course financing. These things — none of these things are easy, but they’re all tractable and they’re all solvable things. And the most important thing that we have to do is do a good job planning. We plan up the supply chain, down the supply chain. We’ve established a whole lot of partners. And so we have a lot of routes to market.” –Jensen Huang

Of marginal concern

Meeting high demand at a time of increasing pressures up and down the supply chain had some analysts worried about the outlook for Nvidia’s profitability.

Expect more of the same, management said.

“Earlier this year, we indicated that through cost improvements and mix that we would exit the year in our gross margins in the mid-70s. We achieved that and getting ready to also execute that in Q4. So now it’s time for us to communicate where are we working right now in terms of next year. Next year, there are input prices that are well known in industries that we need to work through... So we’re taking all of that into account, but we do believe if we look at working again on cost improvements, cycle time, and mix, that we will work to try and hold at our gross margins in the mid-70s.” –Colette Kress

First among unequals

The last question Nvidia faced on the conference call related to the competitive threat posed by custom chips (or ASICs). Google's recently released Gemini 3 model, for instance, was trained using its in-house TPU chips, and offers some cost advantages, particularly when it comes to the price of input tokens, as well as power efficiencies.

While Huang spent much of the call serving as an ambassador for AI, he is of course first and foremost an advocate for Nvidia's AI solutions.

He didn’t take on the question of GPUs vs. ASICs directly. Instead, he offered the following arguments in favor of Nvidia-centric systems.

“Back in the Hopper day and the Ampere days, we would build one GPU. That’s the definition of an accelerated AI system. But today, we’ve got to build entire racks, entire three different types of switches, a scale-up, a scale-out, and a scale-across switch. And it takes a lot more than one chip to build a compute node anymore.” –Jensen Huang

Translation: we’re not in Kansas anymore. We’re not just focused on making the best chip, but also the best total package.

And wrapping up his list of the five things that make Nvidia special, Huang said:

“The most important thing, the fifth thing, is if you are a cloud service provider, if you’re a new company like Humain, if you’re a new company like CoreWeave or Nscale or Nebius, or OCI for that matter, the reason why Nvidia is the best platform for you is because our offtake is so diverse. We can help you with offtake. It’s not about just putting a random ASIC into a data center.”

Put simply, it's easier to sell capacity built on Nvidia's architecture because its CUDA software is ubiquitous in high-performance computing.


Carvana craters after Q4 earnings miss estimates

Used car retailer Carvana plummeted after fourth-quarter profits came in shy of estimates.

Adjusted EBITDA of $511 million came in below the consensus call for $535.7 million, more than offsetting better-than-expected sales of $5.6 billion (estimate: $5.27 billion).

Carvana sold 163,522 used vehicles to retail customers in the quarter, up 43% from last year and ahead of expectations. With that result, Carvana further narrowed its retail sales gap with rival CarMax, which sold 169,557 vehicles in its most recent quarter.

Carvana posted a retail gross profit per vehicle of $3,076, down 7.7% from the same period last year. In a letter to shareholders, Carvana said its reconditioning costs came in higher than expected in Q4, further weighing on retail gross profit per unit. Lower shipping fee revenue, higher non-vehicle costs, and higher industrywide retail depreciation rates also drove the decline, the company said.

Carvana said it expects elevated reconditioning costs again in the first quarter, but expects a sequential increase in retail gross profit per unit. The company also said it expects “significant growth in both retail units sold and Adjusted EBITDA” in the first quarter and the full year ahead.

As of Wednesday's close, Carvana shares were down about 24% from their all-time closing high in January, after a report from short seller Gotham City questioning its accounting practices sent the stock reeling. A Carvana spokesperson told Sherwood News that the report was “inaccurate and intentionally misleading.”


DoorDash reports earnings miss, underwhelming earnings guidance

DoorDash reported earnings results that missed Wall Street expectations and provided underwhelming earnings guidance Wednesday after the bell, which it attributed to harsh weather and increased spending.

For the final three months of 2025, DoorDash reported:

  • Earnings per share of $0.48, compared to the $0.59 analysts polled by FactSet were expecting.

  • Revenue of $3.9 billion, in line with the $3.9 billion analysts were penciling in.

  • Gross order value (the total amount spent on the platform) of $29.7 billion, compared to the $29.2 billion analysts were expecting.

For the current quarter, the company expects:

  • GOV between $31.0 billion and $31.8 billion, versus the $30.7 billion analysts are expecting.

  • Adjusted EBITDA between $675 million and $775 million, far below the $801.9 million analysts are expecting. The company said spending on Deliveroo, its recent UK acquisition, as well as extreme winter weather in the US are weighing on its profit guidance.

Shares fell as much as 11% in after-hours trading. The stock is down more than 20% so far this year.

DoorDash’s costs have gone up as it ramps up investment in autonomous delivery and international expansion, among other things. “This is a massive and expensive undertaking and honestly one you shouldn’t do if you thought your best days were behind you,” CEO Tony Xu said in a letter to shareholders.

Ethan Feller, a strategist at Zacks Investment Research, said the underlying business remains strong even if the stock faces pressure in the near term.

“None of these are structural issues, but soft guidance is soft guidance — and the market rarely gives credit for context when a stock is already under pressure,” he said.


Figma spikes after reporting better-than-expected Q4 results, blowout Q1 and full-year sales guidance

Figma reported Q4 results that exceeded Wall Street's expectations, along with robust sales guidance for the current quarter and the full year.

Shares are spiking in after-hours trading.

For the final three months of 2025, the digital design and development platform company reported:

  • Revenue of $303.8 million, compared to the $293.1 million analysts were penciling in.

  • Adjusted earnings per share of $0.08, compared to the $0.07 analysts polled by Bloomberg expected.

For sales, management expects:

  • Q1 revenue between $315 million and $317 million (estimate: $293.6 million).

  • Full-year revenue between $1.366 billion and $1.374 billion (estimate: $1.29 billion).

The lower ends of these ranges are above the highest analyst sales estimates for both Q1 and 2026 as a whole.

This marks the company’s second earnings report since going public over the summer. Its share price has taken a hit this year alongside many of its software peers, and management will be looking to show that AI can be an accelerant, rather than a threat, to its business. On Tuesday, Figma announced a partnership with Anthropic to integrate AI coding tools.

“Our healthy balance sheet and positive free cash flow gives us the flexibility to continue investing in AI and the platform while maintaining financial discipline for sustainable, long-term growth,” CFO Praveer Melwani said in the press release.

As of the close on Wednesday, the stock was down 35% for the year and roughly 80% below its closing level at the time of its July IPO.


Record labels dip as Google adds AI music generation to its Gemini app

Google on Wednesday said it’s rolling out the ability for Gemini app users aged 18 and up to generate 30-second AI music tracks.

The tool is available globally, as Google launches beta access to its Lyria 3 generative-AI music model.

Addressing the potential for skirting the lines of copyright law (as seen in other recent DeepMind AI tools), Google said:

“If your prompt names a specific artist, Gemini will take this as broad creative inspiration and create a track that shares a similar style or mood. We also have filters in place to check outputs against existing content. We recognize that our approach might not be foolproof, so you can report content that may violate your rights or the rights of others.”

Shares of record labels including Universal Music Group and Warner Music dropped 2% on the news. Spotify briefly dipped before rebounding, and Sony shares also saw a slight decline.

Last month, Morgan Stanley published a survey that found up to 60% of Gen Z respondents listen to AI music, for an average of three hours per week. Earlier this year, Bandcamp banned all music wholly or substantially generated using AI.

