Meta’s chief AI scientist, Yann LeCun (Julien De Rosa/Getty Images)
GP-who?

Just four companies are hoarding tens of billions of dollars’ worth of Nvidia GPUs

Each Nvidia H100 can cost up to $40,000, and one big tech company is aiming to stockpile 350,000 of them.

Jon Keegan

Meta just announced the release of Llama 3.1, the latest iteration of its open-source large language model. The long-awaited, jumbo-sized model posts high scores on the same benchmarks everyone else uses, and the company said it beats OpenAI’s GPT-4o on some tests.

According to the research paper that accompanies the model release, the 405-billion-parameter version of the model (the largest flavor) was trained using up to 16,000 of Nvidia’s popular H100 GPUs. The Nvidia H100 is one of the most expensive and most coveted pieces of technology powering the current AI boom, and Meta appears to have one of the largest hoards of the powerful GPUs.

Of course, the list of companies seeking such powerful chips for AI training is long, and likely includes most large technology companies today, but only a few have publicly crowed about how many H100s they have.

The H100 is estimated to cost between $20,000 and $40,000, meaning Meta used up to $640 million worth of hardware to train the model. And that’s just a small slice of the Nvidia hardware Meta has been stockpiling. Earlier this year, Meta said it was aiming to have a stash of 350,000 H100s in its AI training infrastructure – which adds up to over $10 billion worth of the specialized Nvidia chips.
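For a sense of the math, here is a minimal back-of-envelope sketch in Python, assuming the reported $20,000–$40,000 per-unit price range and the chip counts cited above; these are rough estimates, not actual spending figures.

```python
# Back-of-envelope estimate of H100 spending, using the publicly
# reported figures cited in this story. Unit prices are estimates;
# actual contract pricing with Nvidia is not public.

H100_PRICE_LOW = 20_000   # low-end estimated price per H100, in USD
H100_PRICE_HIGH = 40_000  # high-end estimated price per H100, in USD

def spend_range(gpu_count: int) -> tuple[int, int]:
    """Return (low, high) estimated spend in USD for a given number of H100s."""
    return gpu_count * H100_PRICE_LOW, gpu_count * H100_PRICE_HIGH

# Up to 16,000 H100s reportedly used to train Llama 3.1 405B:
print(spend_range(16_000))   # (320000000, 640000000) -> up to ~$640 million

# Meta's stated target of 350,000 H100s:
print(spend_range(350_000))  # (7000000000, 14000000000) -> over $10 billion
```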

Venture capital firm Andreessen Horowitz is reportedly hoarding more than 20,000 of the pricey GPUs, which it rents out to AI startups in exchange for equity, according to The Information.

Tesla has also been collecting H100s. CEO Elon Musk said on an earnings call in April that Tesla wants to have between 35,000 and 85,000 H100s by the end of the year.

But Musk also needs H100s for X and his AI company, xAI. This week, Musk boasted on X that xAI’s training cluster is made up of 100,000 H100s.

A tweet from Elon Musk stating that xAI has 100,000 H100 GPUs.
Source: X @elonmusk https://x.com/elonmusk/status/1815325410667749760


Musk was recently sued by Tesla shareholders for allegedly redirecting 12,000 of the H100s intended for the carmaker’s AI training infrastructure to xAI instead. When asked about the diversion on yesterday’s Tesla Q2 earnings call, Musk said the GPUs were sent to xAI because “the Tesla data centers were full. There was no place to actually put them.”

The H100s are in such demand that people are being paid to sneak them into China to bypass U.S. export controls. You can watch unboxing videos of the graphics cards, and there are even a few for sale on Amazon – including one for $34,749.95 (with free delivery).

OpenAI hasn’t said how many H100s it is sitting on, but The Information reports that the company rents a cluster of processors dedicated to training from Microsoft at a steep discount as part of Microsoft’s $10 billion investment in OpenAI. The training cluster reportedly has the power of 120,000 of Nvidia’s previous-generation A100 GPUs, and OpenAI will spend $5 billion to rent more training clusters from Oracle over the next two years, according to The Information’s report. OpenAI does appear to have a special relationship with Nvidia: in April, Nvidia CEO Jensen Huang “hand-delivered” the first cluster of the company’s next-generation H200 GPUs to co-founders Sam Altman and Greg Brockman.

A tweet by OpenAI’s Greg Brockman with a photo featuring Brockman, OpenAI CEO Sam Altman, and Nvidia CEO Jensen Huang.
Source: X @gdb https://x.com/gdb/status/1783234941842518414

Nvidia declined to comment for this story, and Meta, X, OpenAI, Tesla, and Andreessen Horowitz did not respond to requests for comment. 

More Tech

AI agent fatigue may be hitting enterprise customers

You may have noticed that recently, every piece of business or productivity software seems to have an “AI agent” feature that keeps getting pushed in front of you, whether you want it or not.

That’s leading to AI agent fatigue among enterprise customers, according to The Information.

Companies like Salesforce, Microsoft, and Oracle have been pushing their AI agent features to help with tasks such as customer service, IT support, and hiring. But many of those features are powered by the same AI services from OpenAI and Anthropic, leading to a similar set of functions, according to the report.

As companies race to tack on AI agents to their legacy products, it remains to be seen which functions will become the “killer app” for enterprise AI.

Google’s Waymo has started letting passengers take the freeway

Waymo’s approach to robotaxi expansion has been slow and steady, which is why the Google-owned autonomous ride-hailing service, launched to the public in 2020, is only just now taking riders on freeways.

On Wednesday, Waymo announced that “a growing number of public riders” in the San Francisco Bay Area, Phoenix, and Los Angeles can take the highway and are no longer confined to local routes. The company said it will soon expand freeway capabilities to Austin and Atlanta. It also noted that its service in San Jose is now available, meaning Waymos can traverse the entire San Francisco Peninsula.

Waymo’s main competitor, Tesla, so far operates an autonomous service in Austin as well as a more traditional ride-hailing service across the Bay Area, where a driver uses Full Self-Driving (Supervised). On the company’s last earnings call, CEO Elon Musk said Tesla would expand its robotaxi service to 8 to 10 markets this year.
