
What the reaction to Google’s potential entry into selling chips tells us about the AI trade

Margins are meant to be attacked, customer quality matters, and focus on the size of the pie.

Luke Kawa

A report from The Information suggesting that Google is in talks to begin selling its custom TPU chips to Meta has spurred huge market moves and a ton of commentary about what this means for the major players in the AI trade.

Here’s my contribution:

Huge margins are a “kick me” sign

Posting persistently massive profit margins is akin to holding up a big sign to the world saying, “You should be in this business, too!”

New entrants into a market generally reduce pricing power for dominant incumbents.

That’s easier said than done when it comes to developing high-powered chips to bolster the growth of a technology that was more science fiction than consumer-facing as recently as three years ago.

The AI boom is marked by both its scale and urgency. Hyperscalers will give you a litany of reasons to invest, from the pursuit of artificial general intelligence (though you don’t hear as much about that these days!) to the fear of having their existing leading market positions diminished by competitors that are willing to make these large outlays. 

Urgency means you’re willing to pay up for inputs, so if there were a time to be selling AI chips, now would be a good time. That’s the signal from Nvidia, whose adjusted gross margin has mostly been around the mid-70s over the past two years, with the exception of its fiscal Q1 2026.

Perhaps counterintuitively, given the positive market reaction to the reported talks with Meta, selling TPUs would be margin negative for Google compared to renting out access to the computing power provided by those same chips.

But a deal to sell TPUs to Meta would have benefits that could potentially outweigh the drag on margins. It would let the company gain a foothold among customers who would otherwise not use these TPUs and instead run AI tasks on a competitor’s cloud. Striking while the iron is hot would also likely help Google ensure that its software increases in prominence among the developer community — a key contributor to Nvidia’s moat.

Big endorsements matter

If these reports bear fruit, what will have been important is not just the fact that Google is selling its TPUs, but who it’s selling them to. That Meta is in talks to buy TPUs provides the second strong validation point for their quality in just the last week: Gemini 3 was already a testament to their capabilities, and now they are reportedly closing in on an additional stamp of approval from Mark Zuckerberg.

This is a pattern we’ve seen before: after AMD went parabolic on the heels of its megadeal with OpenAI, Wedbush Securities analyst Dan Ives said that this was a “huge vote of confidence” and that “any lingering fears around AMD should now be thrown out the window.”

...but customer quality matters, too

But alas, what appears to be getting thrown out the window now is Advanced Micro Devices. Per the market’s knee-jerk reaction, it is the single biggest loser of Google’s potential foray into selling AI chips.

Loosely, Nvidia is still presumed to get the lion’s share of the pie, but this raises the risk in traders’ eyes that AMD’s slice looks more like scraps.

A vote of confidence from OpenAI simply does not carry the same weight as validation from a trillion-dollar hyperscaler. At this point, if OpenAI hasn’t made a multibillion-dollar commitment to you, are you really an AI company?

Remember, Oracle peaked and rolled over once it was reported that its massive sales backlog was largely being fueled by OpenAI. Its stock price has headed lower since then, while its credit default swap spreads have moved in the opposite direction.

Mini-DeepSeek

In some ways, this negative shock for some parts of the AI trade is an echo of the DeepSeek-induced freak-out.

Back in January, the emergence of this Chinese AI model raised the idea that you can do AI on the cheap, casting doubts over the wisdom of hundreds of billions in capex (and counting). Jevons Paradox — in this case, the idea that AI becoming cheaper would ultimately increase overall demand for compute — won the day.

Right now, a potential Google entry is being treated as a zero to negative sum event for AI chip designers.

Unless the cost savings of Google’s TPUs, net of any performance sacrifices relative to GPUs, are a game changer for the economics of AI training, inference, and beyond, this probably isn’t what matters for investors in any of these companies, or even for people who just hold index funds.

As we all soon gather to eat copious amounts of pie, I’ll remind you that the size of the pie is what really matters. And that will be driven by whether AI is or becomes sufficiently cheap to deploy at scale that it generates a sufficient return on investment for the companies making these major outlays — and whether their customers also see enough of a benefit from making use of this computing power.

That’s still a largely unanswered question. Time and again throughout this boom, we’ve seen different regimes dominate: the promise of tomorrow versus the realities of the quarterly corporate reporting cycle today.

Think beyond chips

What’s interesting to me is that while AI chips are clearly high in demand, they don’t appear to be the most binding constraint on the boom. Microsoft CEO Satya Nadella recently said his biggest problem today is “not a supply issue of chips; it’s actually the fact that I don’t have warm shelves to plug into.” Nvidia CEO Jensen Huang warned that “China is going to win the AI race” in part because of better access to power. And CoreWeave CEO Michael Intrator told analysts that “across the space,” the issue is a shortage of other physical infrastructure to support data center build-outs.

And in a supply-constrained AI world, it’s also fascinating that Google must feel it has the ability to get its hands on enough chips to satisfy not only its own computing needs but those of third parties as well.



Western Digital jumps ahead of Nvidia earnings report later today

Hard disk drive maker Western Digital is on track for one of its best days of the month Wednesday, on relatively little news.

Traders may be trying to get ahead of any expected share bump related to Nvidia’s earnings extravaganza after the close of trading today.

Western Digital saw its best day — up almost 17% — in almost six years in early January, after Nvidia CEO Jensen Huang’s keynote speech at the Consumer Electronics Show in Las Vegas underscored the surge in demand for memory that AI is producing.

Western Digital executives have previously talked up the fact that some of the company’s products are qualified for use in Nvidia server stacks.

Fellow hard disk drive maker Seagate Technology Holdings is also having one of its best days in February.



Nvidia and AMD’s different deals show that while AI chatbots may be commoditized, the chips aren’t

One enigma I’m noticing in the AI boom?

The publicly available chatbots, effectively the best universal manifestation of artificial intelligence we have, feel more or less the same to me. That is, commoditized.

Maybe this is a skill issue; I’m not the most high-tech person. That said, I have experienced substantial performance gaps between paid and free versions, and I’m aware that more specialized tools offer better-tailored results for certain tasks (e.g., Claude Code). Still, I’m Gemini-first but polyAImorous when it comes to chatbot usage.

But GPUs, the inputs used to train and run many chatbots, seem to be anything but commoditized, judging by how Big Tech companies treat them.

Two of the AI chip deals reached by Advanced Micro Devices, the No. 2 in GPUs, have involved the company handing over rights to potentially massive equity stakes in itself in exchange for securing these buyers. First was OpenAI, then Tuesday’s pact with Meta.

Lisa Su and co. seemingly can’t get customers on normal terms the way Jensen Huang and co. can.

Nvidia, which reports earnings Wednesday after the close, enjoys a dominant market position. Sure, it subsidizes its customers’ acquisitions of chips, but it could be argued that this is just a way of investing in its own success by trying to make sure the company has as many viable future clients as possible. Nvidia and Meta’s “multi-year, multi-generational strategic partnership” that will see the social media giant buy millions of GPUs from the former didn’t involve the chip designer needing to give Mark Zuckerberg any potential equity exposure.

Nvidia’s offerings are able to command a significant premium because its hardware not only comes with a track record, but it’s also attached to the CUDA software system that AI developers are comfortable with.

In a sense, some of the best industry comps here are found in energy (something AI data centers chock-full of GPUs need a lot of!).

Different forms of crude can be refined into the same kind of gasoline; your car won’t know the difference. Similarly, hydropower, solar power, or natural gas can all be used to generate electricity, and as long as the lights are on, people won’t be able to tell which one it was.

1%

The vast majority of S&P 500 companies are talking about AI this earnings season, at least in broad strokes.

But precious few are disclosing hard numbers on how AI makes them more profitable, according to a review of Q4 earnings calls conducted by Goldman Sachs analysts.

In a note published late Tuesday, analysts with the bank found that just 1% of the members of the S&P 500 have “quantified the impact of AI on earnings.” That’s despite 70% of the blue-chip index’s members “broadly discussing AI” on earnings calls.

They wrote:

“In earnings calls, many companies have grouped AI with broader automation and productivity initiatives, making it difficult to disentangle the impact of AI specifically. 10% of S&P 500 companies quantified the productivity boost from an AI on a specific use case during their 4Q earnings calls, particularly among developers. Our economists recently highlighted softness in tech employment in recent months.”

Only two new companies quantified an AI productivity impact on their current earnings, Goldman found. One was financial analytics and ratings powerhouse S&P Global. The other was water treatment and commercial cleaning products manufacturer Ecolab.

The gap between the share of executives yapping about AI and the dearth of detail on the technology’s bottom-line impact may be part of the reason investors have gotten slightly jittery about AI during the recent flurry of earnings reports, with volatility on baskets of AI-related stocks picking up over the last month.

Investors know that tech giants are boosting the amount they plan to spend on AI investments in the coming year to over $600 billion. They also know that customers who will buy AI services from Amazon, Google, and Microsoft — to name a few — will have to see real benefits from using it.

If they don’t, the customers won’t keep paying. And there goes the hyperscalers’ ROI.

Anyway, we’re still waiting to see those benefits.


Albemarle and fellow lithium miners jump on Zimbabwe export ban

Lithium miners Albemarle Corp., Lithium Argentina, Lithium Americas, Mineral Resources, and Sociedad Quimica y Minera are all rising in early trading on Wednesday after Zimbabwe suspended its export of all raw minerals and lithium concentrates.

The latest move accelerates a ban that was reportedly expected to come into effect starting January 2027, and will remain in place until further notice. As the top African producer of the metal, Zimbabwe exported some 1.13 million metric tons, or $514 million worth, of lithium-bearing spodumene concentrate in 2025.

Zimbabwe’s mines minister said the ban would remain in effect until companies comply with government requirements, per Bloomberg.

The ban adds further supply pressure into a market that’s already seen prices squeeze higher, with benchmark lithium futures roughly doubling since the end of October 2025. That’s spurred the shares of companies like Albemarle, which has gained more than 90% over the same time frame — a rare bright spot in an EV supply chain that’s generally been pretty depressed recently.

Some Wall Street analysts have gotten more bullish on the sector, too. Albemarle scored an upgrade earlier this month from Bank of America analysts, who cited the lithium market’s improving structural fundamentals, as Chinese supply restrictions combined with growing demand for utility-scale battery storage applications provide further support for lithium prices. Futures prices for lithium carbonate remain above $18 per kilogram, having been as high as $21 per kilogram in January.


Sherwood Media, LLC produces fresh and unique perspectives on topical financial news and is a fully owned subsidiary of Robinhood Markets, Inc., and any views expressed here do not necessarily reflect the views of any other Robinhood affiliate, including Robinhood Markets, Inc., Robinhood Financial LLC, Robinhood Securities, LLC, Robinhood Crypto, LLC, or Robinhood Money, LLC.