Markets
The D-Wave 2X quantum system at the NASA Advanced Supercomputing facility’s Quantum Artificial Intelligence Laboratory, NASA Ames Research Center, Mountain View, Calif. (Michael Macor/Getty Images)

Unpacking the science behind the “quantum supremacy” breakthrough

It’s like the “Bourne Supremacy,” but for fancy computers.

When assessing the commercial viability of quantum computing, one of the most basic questions to answer is, “What can you do that a classical computer can’t?”

Most attempts to establish this so-called “quantum supremacy” have revolved around simply trying to out-compute classical computers, without much regard for whether the end product of that computation has any utility.

Which leads to the second question: “Well, can you do anything a classical computer can’t that could make or save me money right now?”

On March 12, D-Wave Quantum claimed it had answered both of these questions in the affirmative, based on a peer-reviewed paper published in the journal Science.

That announcement, along with a very encouraging set of quarterly results, caused the stock to double in just three sessions.

But with all due respect to the authors, the report is completely inscrutable to those of us whose science education never included physics to begin with and ended with chemistry in 11th grade. Even the journalist failsafe of “read the abstract, read the conclusion, and you’ll kind of know what’s going on” is rendered completely useless when the abstract contains such phrases as “area-law scaling of entanglement in the model quench dynamics” and “stretched-exponential scaling of effort.”

When we recently spoke with D-Wave Quantum CEO Dr. Alan Baratz, one of the first things we asked was what the heck all this actually meant. Basically, D-Wave’s quantum computer was able to identify what types of materials can make a good sensor and how to make them the most sensitive sensors they can be. Here’s his longer explanation (emphasis added):

“Essentially what we’ve done is we have computed several different properties of magnetic materials. But to put a little bit finer point on that, what we are looking at is how these materials behave when they get close to what’s known as a phase transition.

OK, so what’s a phase transition? That’s like water freezing. That’s a phase transition. Or water boiling, and a gas is created. Well, magnetic materials also go through a phase transition, but that phase transition occurs not as a result of temperature changes necessarily, but as a result of putting them inside a magnetic field. You’ve got a magnetic material that you put inside a magnetic field, and depending on the actual structure and strength of that magnetic field, that magnetic material may go through a quantum phase transition.

Now the reason why the phase transitions are so important in magnetic materials is because a lot of times magnetic materials are used as sensors, like in an MRI. And what we know is that if the magnetic material is close to its phase transition point, it becomes a much more sensitive sensor. It can detect more and smaller properties, or more faint properties. So what you want to do for any magnetic material, you’d like to understand where its phase transition point is and you’d like to understand its sensitivity as it gets close to that point, because that will help you identify materials that are good sensors and help you determine how you should operate those materials, what kind of a magnetic environment you should place them in as you’re using them as a sensor.

So that’s essentially what we’ve done. We’ve demonstrated that you can take a variety of different types of magnetic materials, you can put them in a magnetic field, to get them right to their phase transition point. You can find out what that phase transition point is, and you can find out their sensitivity at that phase transition point. And that’s a really important set of properties to understand as you’re thinking about using these materials as sensors. Now, the upshot of all of this is that you can investigate new kinds of materials that have never been created before and determine if they make good sensors before you actually go try to fabricate them. So you can identify new types of materials much faster.”
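For the technically curious, a textbook stand-in for the magnets Baratz is describing is the transverse-field Ising model, and on a tiny system you can see his whole point with nothing fancier than brute-force linear algebra. Below is a toy sketch in Python (our illustration, not D-Wave’s method or code): build the Hamiltonian for an 8-spin chain, sweep the magnetic field, and watch the sensitivity spike near the phase transition.

```python
# A toy classical illustration (not D-Wave's approach): exact diagonalization
# of a small transverse-field Ising chain, locating the phase transition by
# where the material's response to the field is most sensitive.
import numpy as np

N = 8          # number of spins; the statevector has 2^N = 256 entries
J = 1.0        # coupling between neighboring spins

sx = np.array([[0.0, 1.0], [1.0, 0.0]])   # Pauli x
sz = np.array([[1.0, 0.0], [0.0, -1.0]])  # Pauli z
I2 = np.eye(2)

def site_op(op, i):
    """Embed a single-spin operator at site i of the N-spin chain."""
    out = np.array([[1.0]])
    for j in range(N):
        out = np.kron(out, op if j == i else I2)
    return out

def hamiltonian(h):
    """H = -J * sum_i sz_i sz_{i+1} - h * sum_i sx_i (open chain)."""
    H = np.zeros((2**N, 2**N))
    for i in range(N - 1):
        H -= J * site_op(sz, i) @ site_op(sz, i + 1)
    for i in range(N):
        H -= h * site_op(sx, i)
    return H

Mx = sum(site_op(sx, i) for i in range(N)) / N  # magnetization along the field

def ground_state_mx(h):
    vals, vecs = np.linalg.eigh(hamiltonian(h))
    g = vecs[:, 0]                               # lowest-energy state
    return g @ Mx @ g

# Sweep the field strength and estimate the sensitivity dm/dh numerically.
hs = np.linspace(0.2, 2.0, 37)
mx = np.array([ground_state_mx(h) for h in hs])
chi = np.gradient(mx, hs)

print(f"sensitivity peaks near h/J = {hs[np.argmax(chi)]:.2f}")
# For this model the quantum critical point sits at h = J; the peak in chi
# is the "much more sensitive sensor" regime Baratz describes.
```

At 8 spins this runs in seconds, but the cost doubles with every spin you add, which is exactly why D-Wave argues you want quantum hardware for realistic materials.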

A little more, from the company’s press release on this breakthrough:

“Magnetic materials simulations, like those conducted in this work, use computer models to study how tiny particles not visible to the human eye react to external factors. Magnetic materials are widely used in medical imaging, electronics, superconductors, electrical networks, sensors, and motors...

Materials discovery is a computationally complex, energy-intensive and expensive task. Today’s supercomputers and high-performance computing (HPC) centers, which are built with tens of thousands of GPUs, do not always have the computational processing power to conduct complex materials simulations in a timely or energy-efficient manner.”

When asked how this was different from what Alphabet was able to pull off last December with its Willow chip, Baratz replied (emphasis added):

“The problem that they address with Willow is called random circuit sampling. So basically what you do is you take a quantum computer and you have it perform a random set of computations that have no value whatsoever. Nobody can do anything useful with this random sequence of computations, but you have it perform a random sequence of computations. And then you see if a classical computer could do the same thing. And what you find is that because these random computations are quantum mechanical computations, it’s very hard for classical computers to simulate them.

Right. But that’s all they’ve done. They’ve built a quantum system. They’ve had it perform a random sequence of quantum computations, and then they ask, how hard would it be for a classical computer to simulate that? And the answer is, it will be very hard. Now, what is important about Willow — because it was an important breakthrough — is that Google tried to do this in 2019 and they claimed quantum supremacy back then on this totally worthless problem. Interesting, but worthless. OK. The problem is shortly after that, it was shown that you actually could perform that computation classically.

Why? Because the Google system was so error-prone that you could only do relatively few of these computations before you got errors. So I think the circuit depth, or the number of computations you could do, is like 22 or 23, something like that. What Willow did was it added some partial error correction to the system. And what they showed is that with partial error correction, they could do a longer sequence of these random computations and that longer sequence could not be simulated classically. So there were two important things that came out of Willow. One: it is a demonstration that actually, you can do some partial error correction. Namely, there’s a first demonstration of error correction on a quantum computer. It’s small, it’s partial, but it’s a step forward. Two is that when you do that partial error correction, you can run longer computations before you get errors, and long enough that classical probably cannot simulate it.

So that’s what Willow did. What we did is something very different. We’re not doing random anything. We are taking a real-world problem and basically performing the computation for that problem, which would be effectively impossible for a classical computer to perform. And those two are very different.”
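And here’s the same kind of toy sketch for Google’s side of the story: a minimal statevector simulation of random circuit sampling (again our illustration, not Google’s setup). The pain for classical machines is visible right in the code: you have to drag around 2^n complex amplitudes, which is manageable at 12 qubits and hopeless at Willow’s scale.

```python
# A toy statevector simulation of random circuit sampling (our illustration):
# apply random two-qubit gates to n qubits, then sample bitstrings. The
# classical cost is the 2^n-amplitude statevector; every added qubit
# doubles the memory and the work.
import numpy as np

rng = np.random.default_rng(0)

def random_unitary(dim):
    """A random unitary via QR decomposition of a complex Gaussian matrix."""
    z = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
    q, r = np.linalg.qr(z)
    d = np.diagonal(r)
    return q * (d / np.abs(d))   # fix column phases

def apply_gate(state, gate, q0, q1, n):
    """Apply a 4x4 gate to qubits (q0, q1) of an n-qubit statevector."""
    t = state.reshape([2] * n)
    t = np.moveaxis(t, (q0, q1), (0, 1)).reshape(4, -1)
    t = (gate @ t).reshape([2, 2] + [2] * (n - 2))
    return np.moveaxis(t, (0, 1), (q0, q1)).reshape(-1)

n, depth = 12, 20              # 12 qubits -> 4,096 amplitudes; ~50 -> petabytes
state = np.zeros(2**n, dtype=complex)
state[0] = 1.0                 # start in |00...0>

for _ in range(depth):         # the "random sequence of computations"
    q0, q1 = rng.choice(n, size=2, replace=False)
    state = apply_gate(state, random_unitary(4), q0, q1, n)

probs = np.abs(state) ** 2
probs /= probs.sum()           # guard against floating-point drift
samples = rng.choice(2**n, size=5, p=probs)
print([format(int(s), f"0{n}b") for s in samples])  # the "worthless" bitstrings
```

The quantum chip produces those bitstrings natively; the classical machine has to track every amplitude, which is the entire basis of the supremacy claim.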


If that didn’t help, maybe this will:

Beyond-classical computation in quantum simulation
Source: Science

Yeah, totally clears it up.

markets

Diverse partnership’s $40 billion data center deal may be the future of funding for AI

Another day, another multibillion-dollar data center deal.

The announced $40 billion buyout — including debt — of Texas-based Aligned Data Centers on Wednesday was the first for a consortium established last year by a diverse base of investors including giant money management firm BlackRock, Abu Dhabi-based technology investment fund MGX, and Microsoft.

Some analysts suggest the variety of investors in such a deal — including tech giants, sovereign investment funds, and the private pools of capital controlled by entities like BlackRock — will be an increasingly common sight as the enormously expensive buildout of AI infrastructure continues over the coming years.

Analysts at Morgan Stanley recently estimated that there will be some $2.9 trillion of spending on data centers globally by 2028. Some $1.4 trillion of that will be covered by the cash flows produced by giant hyperscalers like Microsoft, leaving a need for some $1.5 trillion from other sources. The analysts wrote that their “broad takeaway was bullishness on the availability of those sources of capital.”


markets

Rigetti Computing tanks amid souring retail sentiment, bearish options bets

Rigetti Computing is getting taken to the woodshed on Wednesday amid souring retail trader sentiment and options bets on near-term downside.

In particular, one post on Reddit’s wallstreetbets forum from user bespoketrancheop, which shows the Google Street View (circa March 2025) of Rigetti’s listed headquarters, is generating a lot of attention. It’s the most popular Rigetti-centric post on the subreddit in the past seven months.

Rigetti HQ
r/wallstreetbets via bespoketrancheop

Per our executive editor, it’s giving this:

Clinton meme
Source: imgflip

But as one commenter notes, this isn’t exactly new news: “People been posting this since it was $11,” with another pointing out that “making an assessment on a google street view is lazy dd [editor’s note: due diligence].”

For what it’s worth, Rigetti’s Quantum Fab manufacturing facility in Fremont looks a lot more like a place where next-gen technology is being developed and a lot less like the middle school one of my colleagues went to.

Of course, it’s impossible to single this out as the specific catalyst for the price action in Rigetti today. But since there’ve been dozens of days in the past couple of months where quantum computing stocks went up on no news whatsoever, it stands to reason there are also going to be days when they go down for no (good) reason whatsoever.

More important, perhaps, is the flurry of major options bets positioning for downside in the quantum computing company this week. Put options with a strike price of $50 that expire this Friday are in demand. That contract had open interest of under 7,000 heading into today but has already seen volumes of more than 30,000, suggesting fresh wagers made on a pullback in the formerly high-flying stock.
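For the back-of-the-envelope crowd, here’s why volume blowing past prior open interest reads as fresh positioning. This is a trader heuristic, not an exact decomposition of the day’s order flow (intraday open-and-close churn muddies it), run on the figures cited above:

```python
# Heuristic, not an exact accounting of order flow: when daily volume far
# exceeds the open interest the contract started with, most of that volume
# cannot simply be existing positions closing out.
prior_open_interest = 7_000   # contracts outstanding heading into the day
todays_volume = 30_000        # contracts traded so far today

ratio = todays_volume / prior_open_interest
# At most prior_open_interest contracts could be pure close-outs, so roughly
# this many traded contracts involve a newly opened position (ignoring
# same-day churn, which this simple bound does not capture):
implied_fresh = max(0, todays_volume - prior_open_interest)

print(f"volume is {ratio:.1f}x prior open interest; "
      f"~{implied_fresh:,}+ contracts imply fresh positioning")
```

Tomorrow’s open interest print is the real tell: if it jumps, the wagers were indeed new.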

markets

AMD soars as HSBC hikes price target to a Street high of $310

Shares of Advanced Micro Devices are soaring after HSBC analyst Frank Lee strapped his price target for the chip designer to a rocket ship, hiking it to $310 from $185. The new price target ties that of Arete Research’s Brett Simpson for the highest on Wall Street, per data from Bloomberg.

The recently announced deal with OpenAI, which was followed by news that AMD will deploy 50,000 AI chips in Oracle’s data centers, catalyzed a massive wave of Wall Street love for AMD.

But that’s still nowhere near what the stock deserves, per Lee, who sees AMD’s MI450 series of AI chips as sufficiently competitive with Nvidia’s offerings. Through 2030, he pegs the revenue opportunity of the OpenAI deal at $80 billion.

“We believe the Street has underestimated the AI GPU revenue with our estimates 50% and 45% above consensus for 2026e and 2027e, respectively,” he wrote. “We believe there could be further upside driven by pricing premium as well as additional AI GPU volume.”

HSBC on AMD revisions
Source: HSBC


Sherwood Media, LLC produces fresh and unique perspectives on topical financial news and is a fully owned subsidiary of Robinhood Markets, Inc., and any views expressed here do not necessarily reflect the views of any other Robinhood affiliate, including Robinhood Markets, Inc., Robinhood Financial LLC, Robinhood Securities, LLC, Robinhood Crypto, LLC, or Robinhood Money, LLC.