FINAL BOSS
Exciting Atoms
Quantum computing reaches its Infleqtion point
Infleqtion filed its S-4, paving the way for it to go public via a SPAC merger with Churchill Capital Corp X this quarter.
A new quantum computing company will soon be arriving in public markets.
On Tuesday, Infleqtion and Churchill Capital Corp X filed their S-4 with the US Securities and Exchange Commission, a prerequisite for the former to combine with the latter and go public via a SPAC merger. The two companies expect this deal to close in Q1, at which point CCCX would begin trading under the ticker INFQ.
Infleqtion leverages neutral atom technology for quantum, unlike its publicly traded peers that use trapped-ion (IonQ) or superconducting tech (Rigetti Computing, D-Wave Quantum).
The company already caught some attention back in October, ahead of the completion of this combination, when Citron Research recommended that investors go long CCCX and short Rigetti, calling it “the MOST compelling trade in quantum” and saying that the former was “far ahead” of the latter in developing quantum technology.
Infleqtion’s pre-money equity valuation of $1.8 billion, announced on September 8, is far, far below that of its publicly traded peers.
We spoke with Infleqtion CEO Matthew Kinsella in early October, while publicly traded quantum computing stocks were going parabolic, to learn more about the company and his post-SPAC plans.
Some highlights include:
Details on its product line. Infleqtion doesn’t just make computers, but generates the majority of its sales through advanced sensors (like clocks and RF antennas) that have military applications.
“You can think of all those products as kind of like a complexity continuum, all based on our core neutral atom technology. It kind of follows the Nvidia monetization model, which was, ‘Take this engine, point it at gaming first,’ right? Then crypto mining, and then ultimately the large language model crown jewel came about. We’ve pointed this at clocks, and now our antennas, sensors, and then also eventually these quantum computers will be the crown jewel once they become commercially useful.”
How Infleqtion is using sales to early adopters in the public sector to bolster its private sector opportunities.
“My strategy for the company is to, for lack of a better term, feed the dogs who are eating the dog food now in the national security world: ramp up volumes, use those volumes to drive down costs, and then address the broader commercial market opportunities.”
The first real-world problems that quantum computers will be able to solve.
“And early applications will likely be in the development of new materials, and that’ll be a massive opportunity, right? New jet fuels. Or what is Elon Musk struggling with right now on Starship? It’s a materials science problem, right? He’s got to figure out what materials to actually build Starship out of that can withstand the blastoff and going through orbit and coming back down multiple times. We just don’t have materials that can do that now. Those are the types of things that the earlier-stage quantum computers will be able to help with, and that’s when you’ll start to see the customer base expand outside of that core kind of public domain that you talked about into more private companies.”
How Infleqtion is already integrating AI and quantum computing.
“Turns out, if you apply our quantum software with this re-architected memory on GPUs, it shows some pretty amazing expansion of the context window. And so you’re almost pulling forward, onto GPUs via software, some of the quantum advantage you’d otherwise need to wait for a quantum computer to show. And there are a lot of use cases for that. But the ones we’re starting to monetize are with the Army and the Navy, ingesting huge amounts of streaming data, whether that’s from sensors, multimodal sensors, and actually running that and making sense of it on a deployed-edge Nvidia Jetson GPU, where normally the large language model would’ve been totally overwhelmed by that context window.”
On whether he left too much money on the table with his valuation.
“I had to thread the needle between being fair to our existing investors from a dilution perspective, and I think this $1.8 billion pre-money valuation is a fair value at which to take dilution to effectively capitalize the business to succeed in many, many states of the world. But most importantly, I wanted to set it up such that it kind of looked like a no-brainer for new investors, because what’s the most important thing to do? Get the deal done, to get that cash on our balance sheet. I didn’t want to be near-term greedy with the valuation. I wanted to be long-term greedy to play to win with the cash on the balance sheet. So that was really how I thought about it. And yeah, if you look at our valuation relative to the publicly traded comps, I think it’s set up to look very, very attractive, especially, as you pointed out, given our financial profile relative to theirs.”
This transcript has been lightly edited for clarity.
Sherwood News: For starters, tell me more about Infleqtion’s product line. From my understanding, quantum computers and precision sensors are the two primary areas of focus, with software tied into both. To the extent you can, explain the differences across the product set.
Matthew Kinsella: First of all, when we talk about quantum, we’re talking about the world of the very small and some of those really bonkers phenomena that happen down there, like entanglement and superposition. And when someone says quantum, a lot of times people think the next word that follows is just computing.
But the reality is that those bonkers phenomena can be used to create other types of products that have real improvement in performance relative to classical standards and are based upon the underlying quantum modality. And when I say modality, I mean, how are you taking advantage of those really weird phenomena and trying to turn them into products?
The various modalities have varying levels of flexibility on what you can do with them. Our modality, neutral atoms, is an incredibly flexible quantum modality, largely because all of it takes place at room temperature, and therefore we can actually build this suite of products that you are referencing.
But it’s important to remember that they’re all effectively the same. At the most basic level, we’re trapping atoms inside ultra-high-vacuum cells. We call that our neutral atom core. And then we turn those atoms into a suite of products based upon how we interact with them with lasers. You can think of the suite of products as sort of a continuum of complexity on what you can build with this neutral atom technology.
The least complex, but still highly complex, product is our clock. With that clock, we effectively turn the energy transition of atoms, based upon exciting them with lasers, into a very stable frequency reference: the ticking of a clock that is incredibly stable and ticks about 1,000x more precisely than anything else in a form factor like that.
We are taking advantage of the superposition of those atoms, for lack of a better term, and turning that zero-to-one energy transition, which happens very stably and very quickly, into the ticking of the clock. And there are huge market opportunities for precision clocks, increasingly so because of the susceptibility of GPS to denial and spoofing.
GPS is, at its core, a time distribution system. It’s a PNT — position, navigation, and timing — system, but timing’s the key service, and so much of our critical infrastructure is based upon the ability to access GPS from a synchronization perspective. So think of that as the first sensing product that we’ve built, in a 3U, or sort of three-pizza-boxes-big, form factor, or the size of an Xbox. Ultimately that will be 1U, or one pizza box, and then eventually chip-scale, because inside of that pizza box is really just a bunch of lasers and one of these quantum cores, and that can all get integrated down to chip scale on a photonic integrated circuit.
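To make the synchronization stakes concrete, here is a toy back-of-the-envelope sketch. The stability figures are illustrative assumptions, not Infleqtion's published specs; the point is only that a clock with 1,000x better fractional frequency stability accumulates timing error 1,000x more slowly when GPS is denied.

```python
# Toy holdover sketch: a free-running clock with fractional frequency
# stability `stability` drifts by roughly stability * t seconds after
# t seconds without a GPS resync. The numbers below are assumptions
# chosen only to illustrate the 1,000x ratio discussed in the interview.
def holdover_error_seconds(stability: float, hours: float) -> float:
    """Approximate accumulated timing error after `hours` without GPS."""
    return stability * hours * 3600.0

legacy = 1e-11   # assumed conventional oscillator stability
quantum = 1e-14  # assumed 1,000x more stable clock

for label, s in [("legacy oscillator", legacy), ("1,000x more stable", quantum)]:
    print(f"{label}: ~{holdover_error_seconds(s, 24):.1e} s drift per day")
```

With these assumed numbers, the legacy oscillator drifts on the order of a microsecond per day while the more stable clock stays in the nanosecond range, which is the margin that matters for synchronization-dependent infrastructure.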
That’s the path for all of our products over time. So there’s product number one. Product number two is our quantum RF antenna. In the same way we can turn those atoms into the ticking of a clock, we can excite those atoms into what’s called their Rydberg state, which is as excited as they can be before they become an ion and the electron goes away entirely. Think of the electron way out in orbit: you can think of the atom almost as a massive dipole, and it is now incredibly sensitive to the entire electromagnetic spectrum. So you’re taking advantage of this quantum state where the atom becomes an RF receiver, but an almost magical RF receiver, because normally you’d need a different antenna for different slugs of the frequency spectrum.
This atom antenna can be tuned to receive the entire frequency spectrum from hertz to terahertz. That’s game-changing for a lot of reasons, because now you can collapse the multiple antennas that are in your car or on a battleship, or maybe at some point in your phone — you probably have 10 antennas in your iPhone, because you’re receiving 10 different slugs of the spectrum — down to one.
Or the really crazy thing: at very low frequencies, with long wavelengths, you need an antenna that’s up to a kilometer long, because with classical RF, the antenna has to approximate the size of the wavelength. You maybe put a giant antenna on your roof at one time, right? Or at least you’ve seen them.
That’s because it’s receiving a really long wavelength, and the antenna needs to be big enough to capture that. We could collapse all that down to something the size of a sugar cube. And that Rydberg state is the same state we have to use to entangle atoms, so in some ways, that RF antenna is taking advantage of entanglement, or at least of the state that lets us entangle the atoms.
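The kilometer-scale figure follows from the classical rule of thumb that an antenna must be sized on the order of the wavelength, λ = c/f. A quick sketch, with illustrative frequencies rather than any claim about a specific product:

```python
# Classical antennas scale with wavelength: lambda = c / f.
# At very low frequencies the wavelength (and hence a classical antenna)
# grows to kilometers, which is the size problem described above.
C = 299_792_458.0  # speed of light in m/s

def wavelength_m(freq_hz: float) -> float:
    """Free-space wavelength in meters for a frequency in hertz."""
    return C / freq_hz

for label, f in [("VLF, 10 kHz", 1e4), ("FM radio, 100 MHz", 1e8),
                 ("Wi-Fi, 2.4 GHz", 2.4e9), ("1 THz", 1e12)]:
    print(f"{label}: wavelength ~ {wavelength_m(f):,.3g} m")
```

At 10 kHz the wavelength works out to roughly 30 km, which is why the classical alternative to a sugar-cube-sized Rydberg receiver at those frequencies is an antenna measured in kilometers.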
Then we can build gravitational sensors, basically, that are incredibly sensitive to the Earth’s gravity. We sent those up into space back in 2018 with NASA and are working with them to send a lot more up. Those can track the melting of the polar ice caps and the depletion of aquifers, but probably more interestingly, the burrowing of underground tunnels and the movement of nuclear materials. Those all carry gravity stamps, and those stamps change as things move or happen.
Finally, those are effectively the three core building blocks for a neutral atom quantum computer.
I did mention that you have to make your atoms very cold to do the inertial sensing, but we don’t actually put them in a freezer. We hold them in place so they’re moving so little that they become the coldest place in the known universe, because what cold is, is the lack of motion of atoms. So you take superposition, entanglement, and cold, you put those into a blender, and those are the building blocks for a quantum computer. We also build quantum computers, and we’ve sold three of them.
The most important thing to remember is that this underlying technology is all very interrelated. And you can think of all those products as kind of like a complexity continuum, all based on our core neutral atom technology. It kind of follows the Nvidia monetization model, which was, “Take this engine, point it at gaming first,” right? Then crypto mining, and then ultimately the large language model crown jewel came about.
We’ve pointed this at clocks, and now our antennas, sensors, and then also eventually these quantum computers will be the crown jewel once they become commercially useful.
Sherwood: You hinted at it there about the neutral atom technologies, which I think is the defining feature of Infleqtion. What would you say some of the biggest advantages are in terms of the development, and what are some of the complexities or peculiar challenges to this type of development? Because if it were the easiest and the best, everyone would be doing it versus trapped ions or versus superconducting circuits.
Kinsella: I would say there are reasons why superconducting and trapped ions got early leads. If you look at when the first research was done on all the different modalities — and this will answer your question by way of history — ions carry a charge, right? They’re like the Rydberg state I was mentioning, taken a little further: an electron has gone away, so these ions carry a charge, and because they carry that charge, they repel each other. They’re actually a lot easier to work with than neutral atoms, and because they can be controlled and trapped using microwave technology, they got their start back in the mid-’90s, largely because the fundamental technology was there to explore them as a quantum modality.
So they’ve been the most invested in and explored modality over the course of the last, at this point, 30 years or so. But that charge also ends up being their worst enemy in some ways, because that means you can’t pack that many of them into one of their cells. And that’s why they need to figure out how to photonically integrate and network those cells together such that you have many cells of a small number of qubits that ultimately will be enough qubits to do something useful.
Now, that integration technology hasn’t been invented yet; it remains to be seen whether they’ll be able to photonically integrate those cells. Superconducting got its start maybe a decade later, call it the mid-2000s, and it got its start because it maps better to our mental model of what a computer is than any of the other quantum modalities: you actually build physical little superconducting things, and those become your gates. So it feels conceptually similar to how you build a classical computer, with actual bits made of little pieces of silicon; in this case, they’re made of superconducting materials. The issue, though, is that you have to build each of those, and that is challenging to scale.
In addition, they’ve got to reside in these huge freezers, which are also quite power-hungry and difficult to scale. So why didn’t neutral atoms get their start as early as those other modalities? The issue is the photonics. You actually need to control these atoms very precisely with lasers, and that’s much harder to do than with ions, where you can use microwaves. What has happened, though, is that the photonics technology has caught up, which is why you’ve seen neutral atoms make this huge leap forward over the course of the last five years, from being kind of a dark horse quantum computing candidate to now leading in both the quantity and quality metrics, which are really the two that matter in quantum computing. You can scale to hundreds of thousands, if not millions, of physical qubits inside one of our quantum cores, whereas you probably top out at about a hundred in some of the trapped-ion cores, and then you need to figure out how to photonically integrate them. Those are some of the key differences.
The neutral atom modality is very strong at its base because it’s highly flexible, like we explained before. You can actually build these other types of quantum products based upon it. And then specifically from a compute perspective, it’s highly scalable. The big question historically had been, could you actually have the photonics technology to control those atoms such that their quality, as measured by a metric called gate fidelities, would be high enough such that you could actually do useful things?
What we’ve proven over the course of the last several years is that the quality has gotten significantly better on neutral atoms, and quantity had never really been a problem. There’s a publicly available dataset, which we have a snapshot of in our deck, but it’s up on GitHub: someone tracks all the different data points for both quality and quantity across all the quantum computing modalities. If you look at neutral atoms, it’s truly an order of magnitude higher than trapped ions or superconductors, both of which tapped out at about 100 physical qubits.
We at Infleqtion have shown 1,600 physical qubits, and Caltech actually just showed 6,100 physical qubits with one of our pieces of technology, all based on the neutral atom modality. And from a quality perspective: when I first invested in this company, before I was the CEO (I was the first investor), our gate fidelities were atrocious. They were just barely better than a coin flip. But the rate of increase has been astonishing. We’re now at 99.73% gate fidelities, almost at that three-nines mark, but importantly above the 99.5% threshold level where you can actually start to solve quality with more quantity.
Because quantity is such a feature for neutral atoms, you can actually drive up the quality of the system by having more physical qubits. And that’s why neutral atoms are the leaders in logical qubits, which is really what matters most in quantum computing.
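Kinsella's "solve quality with more quantity" point is the standard quantum error correction argument: below a threshold error rate, encoding one logical qubit across more physical qubits (a larger code distance d) suppresses logical errors exponentially. A toy sketch of that scaling, using a generic surface-code-style formula with invented constants, not anything specific to Infleqtion's hardware:

```python
# Toy error-correction scaling (assumed constants, not Infleqtion-specific):
# below the threshold p_th, a distance-d code suppresses the logical error
# rate roughly as p_L ~ A * (p / p_th) ** ((d + 1) // 2), so spending more
# physical qubits (larger d) buys exponentially better logical quality.
def logical_error_rate(p: float, d: int, p_th: float = 0.01, a: float = 0.1) -> float:
    """Approximate logical error rate for physical error rate p, distance d."""
    return a * (p / p_th) ** ((d + 1) // 2)

p = 1 - 0.9973  # physical error rate implied by 99.73% gate fidelity
for d in (3, 5, 7):
    print(f"distance {d}: logical error ~ {logical_error_rate(p, d):.1e}")
```

The constants here are illustrative; the takeaway is only the direction of the effect. Once the physical error rate sits below the threshold, each increment of code distance multiplies the logical error rate down, which is why a modality that can scale physical qubit counts can "buy" quality with quantity.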
Sherwood: To change tack a little: obviously you have customers both on the public and research side and on the private side. Looking at the quantum space, you’d be remiss not to notice that so far a lot of the action is on the public side. Is this something unique to quantum computing and its applications, or is this just how nascent, emergent technologies initially ramp up the commercialization curve?
Kinsella: The latter. I mean, it reminds me of semiconductors — not that I was around for this, but having read about semiconductors in the ’60s, right? They got their start calculating trajectories of projectiles, and then it kind of grew from there. And the Minuteman missiles were where the first Intel products were embedded.
That has historically been the start of many of these nascent but growing technologies. You see the need inside the broader national security realm before you see it in other places.
I’d say it’s going to play out a little differently for both the computing and the sensing sides of our business. The biggest customer for our sensors today is indeed the government, and that’s largely because the precision levels we can provide are actually beneficial for the DOD or some of the national security-focused players or the intelligence community. And they’re willing to pay the higher prices that we have to ask for now, based upon the form factor as well as just the volumes.
My strategy for the company is to, for lack of a better term, feed the dogs who are eating the dog food now in the national security world: ramp up volumes, use those volumes to drive down costs, and then address the broader commercial market opportunities. If you just look at clocks, we sold an $11 million contract to the DOD for our clocks, largely, I think, for use in GPS-denied environments to maintain synchronization.
Those clocks cost about $225,000 a pop. But the cost will be driven down relatively rapidly as we both integrate the photonics inside the clock and ramp up volumes. And then we can buy our components in bigger volumes and drive down the cost.
As we drive down that cost, then the data center market or the RF network market, the cellular network market, et cetera, all become available and we’ll address the commercial opportunities as they arise.
On the computing side, you know, this isn’t an Infleqtion statement, this is an every-quantum-computer statement: they don’t yet do interesting things commercially, right? You couldn’t go say, “All right, I’m going to go buy a quantum computer for $20 million and it’s going to be a useful tool for me.” There are some businesses that can do that, but there’s just not a commercial market for it yet.
When people point to quantum advantage in computing, it’s very different from our quantum advantage in clocks, where I can definitively say our clocks are literally 1,000x more precise, and that’s a useful tool. With computing, there are things that quantum computers can do that classical computers can’t do, but they’re mostly confined to the physics world today, and it won’t be until we get to that 100-logical-qubit level or so that you start to see real commercial advantage. And early applications will likely be in the development of new materials, and that’ll be a massive opportunity, right? New jet fuels. Or what is Elon Musk struggling with right now on Starship? It’s a materials science problem, right? He’s got to figure out what materials to actually build Starship out of that can withstand the blastoff and going through orbit and coming back down multiple times. We just don’t have materials that can do that now.
Those are the types of things that the earlier-stage quantum computers will be able to help with, and that’s when you’ll start to see the customer base expand outside of that core kind of public domain that you talked about into more private companies that can actually say, “OK, I can buy that $20, $30, $40, $50, $100 million quantum computer and then actually see a real return on that investment.”
And that’ll happen once we get past that 100-logical qubit threshold. That’s kind of the magic number where we start to see commercial advantage.
Sherwood: And how would you say, whether in relation to last 12 months’ revenue or the kind of bookings you have, that’s divided between public versus private?
Kinsella: So I’ll cut the revenue and the bookings a couple different ways for you. We did about $29 million in last 12 months’ revenue [as of June 30, 2025], and the majority of that is sensing. Call it 75-25 sensing versus compute. And then if you look at bookings, which is a little bit more of a near-term metric, it’s closer to 50-50, and that’s largely because we’ve won some large contracts in the compute side of things recently. As we go forward, I think it’ll probably hover around that 50-50 until we get to commercial advantage on computing. And then you’ll see the mix shift probably to computing pretty rapidly.
But the majority of the revenue is very much in the public domain today. The DOD is the biggest customer, the Ministry of Defense is a customer, a number of other government agencies are customers, and then we’ve sold our fair share of equipment into other types of entities like data centers, et cetera, but they just aren’t a huge portion of the demand now, largely for the reason I mentioned before, which is the precision level that we can provide.
Yes, a 1,000x-better clock is awesome, but it’s still 2.5x more expensive than the clocks they’re buying now. What we need to do is get it to price parity with those existing atomic clocks. Then it’s a no-brainer, because if you can get 1,000x the performance for something close in price, you’re going to do that all day long. At 2.5x the cost, unless you have very specific use cases for that increase in performance, you might stick with what you have now. So on computing, it’s getting the machines to the level where they’re commercially useful; on our sensing products, it’s getting down to price parity with the classical technologies.
Sherwood: For a more philosophical question, how do you view AI? Because I’ve seen surveys about AI effectively crowding out research budgets that could in theory be going to quantum. We’re all in a world where we’re competing for finite attention and cash. On the other hand, there are very clear opportunities for the two emergent technologies to work together symbiotically. So what’s your approach? What’s your take?
Kinsella: I’m very much in the camp that they’re going to be highly symbiotic, and I think you can look to what we did with Nvidia in December of 2024 as maybe a preview of what the data center will look like in the not-too-distant future. Just as GPUs layered into the data center, sat on top of CPUs, and enabled applications that weren’t possible with just CPUs, like large language models, QPUs will largely reside inside the data center, be layered on top of GPUs, and enable different types of applications that weren’t possible with just CPUs and GPUs as part of the tool kit. So what you’ll see is workloads sent into the data center, hived off to the appropriate portion of the stack, with the really weird, sticky stuff going down to the QPU layer.
What we did with Nvidia was showcase a materials science application called the Anderson Impurity Model running on both their GPUs and our QPUs. Neither of them could have done it alone, but working together, they were able to run this application. It doesn’t do anything commercially useful today, but it’s the precursor: as we scale the power of the quantum computers, this photovoltaic application will ultimately be what we use to build better batteries.
That’ll truly, I believe, be a cross-collaboration between GPUs and QPUs doing things that GPUs just couldn’t do on their own. That might mean developing a battery that lasts for 10 years instead of a day. That’s an absolutely massive improvement in performance and a huge benefit to humanity.
I think those are the types of things you’ll see the data center playing out in, and so I view AI and quantum as very deeply integrated going forward. In fact, the math underlying them isn’t that different. It’s matrix math. They’re just good at different parts of the matrix math equation.
If you ask me why, or which part, that gets below my level of mathematics. But I do know they’re deeply, deeply intertwined. They also kind of self-reinforce each other, in that you can actually create artificial synthetic data using quantum systems to help train AI models.
Our quantum sensors sense orders of magnitude more information than classical sensors, which can be fed into models to train them. So I think quantum is a source of training for AI datasets for AI models. And then those AI models actually can be run to help you figure out how to better design your computers and set up your experiments to accelerate the path forward for quantum. So I think they’re highly, highly integrated.
The last thing I’ll say is — and this is kind of bonkers, but you know — when you write software for quantum computers, you have to think about the software entirely differently than you do for classical computers. That’s largely because of some of the strange properties of quantum mechanics, like the no-cloning theorem, which says you can’t copy and paste quantum data; it’s a fundamental law. So you have to code around that fundamental law. It turns out copying and pasting is a very big part of classical software writing, and much of memory is based upon that core operation, but it’s also one of the things that totally slows large language models down and creates the context-window issue that is kind of the core scaling bottleneck for large language models.
It turns out, if you apply our quantum software with this re-architected memory on GPUs, it shows some pretty amazing expansion of the context window. So you’re almost pulling forward, onto GPUs via software, some of the quantum advantage you’d otherwise need to wait for a quantum computer to show. And there are a lot of use cases for that.
But the ones we’re starting to monetize are with the Army and the Navy ingesting huge amounts of streaming data, whether that’s from sensors, multimodal sensors, and actually running that and making sense of it on a deployed-edge Nvidia Jetson GPU, where normally the large language model would’ve been totally overwhelmed by that context window. This actually expands it and allows it to increase its performance pretty considerably. So there are a lot of different ways that AI and quantum are tied together.
Sherwood: I had one question about the upcoming SPAC and the process of going public. You said yourself about $29 million in trailing revenues — of the four pure-play quantum companies I track, that’s more than the closest made in its last quarter combined. So a big question for me is, at a $1.8 billion valuation, why are you leaving so much money on the table?
Kinsella: Well, fair question. I’ll answer it this way. Let me back up: we were well-capitalized post our Series C. We had $100 million on our balance sheet. We have a much lower burn rate than many of the other quantum companies. We’ve burned about, call it $30 million in the last 12 months or so. And that’s largely because we have gross profit dollars coming in and also a pretty capital-efficient modality in and of itself.
Luckily, we don’t have to build anything. Well, we have to build systems, but the atoms are our qubits and we get those effectively for free. They’re all the same, and we just have to trap them, which is pretty nice. It helps on the bill of materials side of things. But I thought, from that position of strength, what could we do to really change the game? To set us up and capitalize ourselves to really play to win? And I looked at a couple of different options to accomplish that goal.
Actually, this is a side note, a good friend of mine took his company public via a SPAC, and I was a seed investor in that company. It was a company called Hims. And it was wonderful, right? It worked out incredibly well for them. So I was talking to the founder, who was one of my closest friends, and he said, “You know, you really should look at SPACs, because it can be a very effective way to raise capital for certain types of businesses.”
All the other publicly traded quantum companies came public via SPAC, right? So there was a lot of precedent to do it. I kind of went in with four principles. Number one, could we find a great partner? Because that’s, I think, the most important thing. And Churchill, I don’t know if you’re familiar with them, but they’ve been unbelievable, especially at delivering the cash in their trust onto the balance sheets of their partner companies. That was very attractive, not to mention they’re just insanely smart people. So principle one, can you find a great partner? Check.
Principle two: Do they have an amount of capital that would actually set you up for success? They had a $416 million trust, so check, and then we also raised a $125 million PIPE on top of that. Could you — and this is the answer to your question, this third one — make sure you had a very high probability of that deal getting done? That’s why I priced it as we did. I had to thread the needle between being fair to our existing investors from a dilution perspective, and I think this $1.8 billion pre-money valuation is a fair value at which to take dilution to effectively capitalize the business to succeed in many, many states of the world. But most importantly, I wanted to set it up such that it kind of looked like a no-brainer for new investors, because what’s the most important thing to do? Get the deal done, to get that cash on our balance sheet.
I didn’t want to be near-term greedy with the valuation. I wanted to be long-term greedy to play to win with the cash on the balance sheet. So that was really how I thought about it. And yeah, if you look at our valuation relative to the publicly traded comps, I think it’s set up to look very, very attractive, especially, as you pointed out, given our financial profile relative to theirs. There’s also a state of the world where those companies could have all been cut in half, and I still wanted us to look really attractive. Instead, they all doubled in the last month, and so we looked really attractive.
But basically, I wanted to set this up for success. That’s why I did it. And I think this is one of the things, honestly, that is kind of misunderstood about IPOs. People always say, “Oh, there’s this IPO pop.” Well, if you’re using the IPO as fundamentally a primary capital raise, you need to incent new investors to come in, right? And you need to make sure they have a good experience and can earn a return. Otherwise, why would they invest in you? So I’ve thought about this as a fair dilution, because this isn’t an exit for our existing investors. This is a capital raise, fair dilution, and then let’s make sure it looks really good for new investors, so they want to invest in the company. That’s the way I thought about it.
Sherwood: And you have $540 million, I think, coming in the door from this. What are the priorities for that money: building on the existing product set versus developing more quantum computing technology? How are you going to balance that? You get a $540 million check, where do you go shopping first?
Kinsella: Number one, I view us taking our burn up a bit, because we have some very high ROI investments to make and I will make those. But one of the things, having been an investor for 18 years before coming to Infleqtion, that I can trust myself on is I’m not going to turn into a drunken sailor and go spend all this very, very quickly. I’m going to use it for very high ROI uses of cash. Those fall into three buckets, and then there’s a fourth bucket that I’ll mention, too.
So number one is go-to-market. There are huge opportunities for our technology, and I do think we’ve kind of starved our go-to-market organization, and we really need to be playing to win. That means having more people out there evangelizing why you should buy our products. And let me just give an example: you’ve probably heard of this Golden Dome program that is going up. That is an absolutely massive opportunity for us. Just for one example, in order to take out a hypersonic missile coming at the US, you have no time at all. The only way you can do that is if your entire system is synchronized to the picosecond level. And really the only way to do that is with quantum clocks. So, massive opportunity. We need to make sure the buyers are aware of our products. So go-to-market, number one.
The next two are sub-bullets of research and development, but quite different. I’d mentioned we are going to take our clocks from three pizza boxes to one pizza box, to chip-scale, and do that with all of the different sensing products — accelerating that process of costing these products down and shrinking their form factor and making them more robust, because that will just open up the bigger commercial markets for us.
And then finally, it’s accelerating the R&D on the computer. I just walked through the progress that neutral atom has made. It’s really making progress very quickly, and I want to compound that advantage. What does that mean in reality? The way we do our R&D is on these things called test-bed systems. They are kind of our internal quantum computers that we are running our R&D experiments on, for lack of a better term. We’re kind of gated by how much we can do on those systems. If we build more systems, we can parallel process that R&D effort. So we’ll build a couple of those.
We’ll have more than $600 million on the balance sheet. I’d like to think of it as — and this is the fourth point — we are just now capitalized to be a player in the market. The fact is, you see what all the publicly traded companies are doing: they’re building a war chest, and their burn rates are a lot higher than ours, so they’ll burn through that more quickly.
But you don’t ever want to be the undercapitalized player in a market like this. So that’s the fourth embedded reason. Not all of it’s going to be for shopping purposes; it’ll be to make sure we have a war chest that might be put to use if needed. And then finally, I think it’s going to be really helpful to have a liquid currency and being able to use our stock to make targeted acquisitions if and when that accelerates our tech road map.
I don’t foresee us doing tangential acquisitions or things like that, but I do see that if a photonics company, or a company of that type, helps us accelerate our road map, there might be a good chance to do a tech tuck-in. So that’s how I think about it.
