markets

CoreWeave guidance disappoints as delays weigh on data center ramp despite blowout top- and bottom-line beats

CoreWeave reported a strong sales beat in Q3, with bottom-line results to match.

  • Revenue: $1.36 billion (compared to analyst estimates of $1.23 billion and guidance for $1.26 billion to $1.30 billion)

  • Adjusted operating income: $217.15 million (estimate: $177.2 million, guidance: $160 million to $190 million)

Those figures exceeded every estimate among analysts polled by Bloomberg.

More strong sales seem to be in the pipeline: CoreWeave’s revenue backlog swelled to $55.6 billion at the end of the quarter, nearly double the $30.1 billion at the end of Q2.

But they’re not imminent: in fact, despite this revenue beat, CoreWeave reduced its 2025 annual sales forecast to a range of $5.05 billion to $5.15 billion from its prior outlook for $5.15 billion to $5.35 billion.

That guidance cut has shares deep in the red in Tuesday’s premarket trading.

CoreWeave seems to be having a little trouble getting as much compute up and running as Wall Street had hoped for, with active power of 590 megawatts at the end of the quarter versus analyst expectations of nearly 625 megawatts.

On the earnings call, the company’s executives discussed a delay at one of its data centers in more detail, a problem that is weighing on its Q4 and FY25 guidance. To be clear, CoreWeave isn’t flagging access to power itself as a critical bottleneck right now (unlike Microsoft’s and Nvidia’s leaders). Rather, it’s the other physical infrastructure supporting the data center that’s the issue.

Michael Intrator, CoreWeave’s CEO, said:

So you're going to be hearing this theme repeated again and again as you talk to not just CoreWeave, but across the space. And it is a real challenge at the powered shell level. It's not a challenge for power, right? There's plenty of power right now, and we believe that there will be ample power for the next couple of years. But really where the challenge is, is the powered shell.

Accordingly, CoreWeave’s guidance for over 850 megawatts of active power at year end implies the company will fall well short of the current consensus estimate of nearly 900 megawatts.

It’s going to take a lot of supply chain unfurling to realize its revenue backlog on schedule.

[Chart: CoreWeave revenue backlog (source: CoreWeave Q3 earnings presentation)]

The neocloud company had a busy quarter, reaching a $14 billion pact with Meta for AI compute, expanding its agreement with OpenAI, and signing a $6.3 billion deal with Nvidia for any unused cloud computing capacity, among others. CoreWeave’s recent attempt at vertical integration failed, as Core Scientific shareholders voted overwhelmingly against its proposed acquisition on October 30.

However, there’s a little less drama around this quarter’s results than there was for the last one. That’s because CoreWeave’s lock-up period expired shortly after its impressive Q2 results, catalyzing a wave of profit-taking in the AI darling.

markets

Nvidia strikes licensing agreement with AI inference specialist Groq

Nvidia reached an agreement to work with AI chip startup Groq to enhance its inference capabilities.

CNBC is calling this a $20 billion all-cash acquisition, citing the top investor in Groq’s latest financing round (which valued the startup at roughly $6.9 billion in September). Groq’s press release on the matter, however, refers to it only as a “non-exclusive licensing agreement,” states that “Groq will continue to operate as an independent company,” and provides no financial details. Stopping short of an official acquisition may be a bid to duck any potential antitrust concerns.

However, this has all the hallmarks of an acqui-hire, as Groq founder Jonathan Ross and president Sunny Madra, as well as other members of their team, will be joining the chip designer “to help advance and scale the licensed technology.”

Inference is the “thinking” part of AI models (as opposed to training, which is more of the “learning”). Groq’s AI chips are LPUs (language processing units), distinct from GPUs (graphics processing units) or TPUs (tensor processing units). The company boasts that these chips “run Large Language Models (LLMs) and other leading models at substantially faster speeds and, on an architectural level, up to 10x more efficiently from an energy perspective compared to GPUs.” These products don’t need external high-bandwidth memory chips (which are facing a supply crunch), relying instead on on-chip memory (SRAM, or static random-access memory).

Through this deal, Nvidia is likely looking to boost the efficiency of its AI solutions in a power-hungry (and scarce) world. It may also be viewed as a response to the success of Google’s Gemini 3 model, which utilizes TPUs that are also cheaper to operate than Nvidia’s GPUs. (In a fun twist, Ross, the Groq founder, was one of the architects of what would become Google’s first TPU during his time with the search giant.)

“We plan to integrate Groq’s low-latency processors into the NVIDIA AI factory architecture, extending the platform to serve an even broader range of AI inference and real-time workloads,” wrote Nvidia CEO Jensen Huang in an email to employees, as reported by CNBC.

Good news for Groq is also good news for one of America’s most controversial and outspoken VCs: Chamath Palihapitiya, whose Social Capital fund was an early investor in the company. Chamath’s SPACs have generally tended to go over like a lead zeppelin, but this investment is already a massive winner.

markets
Luke Kawa

Micron jumps amid report of memory chip price hikes

Shares of Micron are catching a bid on Wednesday after South Korean media reported that its biggest competitors are raising selling prices for a line of high-bandwidth memory chips, even though these will soon no longer be the most cutting-edge offerings available.

“According to industry sources on the 24th, memory semiconductor companies such as Samsung Electronics and SK Hynix have reportedly raised HBM3E supply prices by nearly 20%,” per the report from Chosun Biz. “This is unusual, considering that prices typically drop ahead of next-generation HBM launches. The prevailing view is that this is due to upward adjustments in HBM3E orders for next year from companies like Google and Amazon, which design their own AI accelerators, as well as NVIDIA, the largest HBM3E customer.”

Micron, along with those two companies, makes up the triumvirate of high-bandwidth memory chip suppliers. All three are moving toward ramping next-gen HBM4 production next year.

Meanwhile, appetite for HBM3E is being reinforced in part by President Trump’s move to allow Nvidia to sell its H200 chips to China.

markets
Luke Kawa

Opendoor acquires HomeBuyer.com in bid to boost home flipping and mortgage opportunities

Opendoor Technologies has acquired mortgage services platform HomeBuyer.com, according to a post on X from Chief Growth Officer Morgan Brown. Brown did not disclose financial terms of the deal in the post.

There’s an element of an acqui-hire here too, as HomeBuyer.com founder Dan Green will serve as Director of Mortgage Growth for Opendoor.

HomeBuyer.com offers tools for potential home buyers to assess their financing options, and mortgages are a logical avenue for Opendoor to pursue as the online real estate company looks to transform the home buying and selling process in the US. At the very least, streamlining the financing process for potential buyers under its own roof should help Opendoor’s quest to pursue higher volumes of home flips.

Shares of Opendoor are little changed in premarket trading.

Many Opendoor bulls, including EMJ Capital’s Eric Jackson, have pointed to Opendoor’s potential to bolster its presence in mortgage, title, and other housing services as part of their optimistic view on the stock. In November, alongside the release of Q3 earnings, CEO Kaz Nejatian announced a new partnership with Roam pertaining to assumable mortgages.

Opendoor certainly hasn’t been idle during the holiday season. Earlier this week, the CEO touted the expansion of the company’s home-buying footprint to all of the lower 48 US states, and management also announced that Coinbase Canada CEO Lucas Matheson was coming aboard to serve as its president.

markets
Luke Kawa

Intel drops on report that Nvidia stopped testing the 18A chip production process used by the chip manufacturer

Early on Christmas Eve, shares of Intel are tumbling like Santa off a rooftop after one too many spiked egg nogs.

Reuters reports that Nvidia “recently tested out whether it would manufacture its chips using Intel’s production process known as 18A but stopped moving forward, two people familiar with the matter said.”

Intel, for its part, told Reuters that its 18A processes are “progressing well” while it “continues to see strong interest” for its more advanced 14A production process. Previous reporting from the outlet indicated that in CEO Lip-Bu Tan’s early days leading Intel, he considered shelving the 18A manufacturing process entirely in favor of 14A in a bid to be more competitive with the likes of TSMC.

The $4 trillion chip designer announced a $5 billion investment in the chipmaker back in September as part of a collaboration that would see the two parties co-develop data center and PC products. That news sent shares of Intel up 23% in a single session, their biggest one-day gain since 1987.

