Turning A Million-Qubit Quantum Computing Dream Into Reality

James Clarke believes quantum computing won’t become practical until the industry is making chips crammed with upwards of a million error-corrected quantum bits.

The goal of making a quantum system with that many qubits isn’t unique to any one company — IBM, Google and startups like PsiQuantum have all stated plans to build such grandiose machines — but Clarke, director of quantum hardware at Intel, thinks the semiconductor giant has a unique advantage in making this a reality, thanks to its manufacturing-driven development approach.

In a peer-reviewed research paper published earlier this year, Intel says it successfully fabricated more than 10,000 arrays, each with three to 55 quantum dots, on a 300-millimeter wafer with a yield higher than 95 percent. This milestone, which the chipmaker achieved in partnership with Dutch research institute QuTech, represented a significantly higher yield and a higher number of qubits than what universities and laboratories, including those used by other companies, have achieved to date.
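The scale implied by those figures is worth spelling out. A back-of-envelope calculation using only the numbers reported above (the actual distribution of array sizes across the wafer is not public, so the range below is an illustrative bound, not a measured count):

```python
# Figures reported for the Intel/QuTech wafer; everything below is
# back-of-envelope arithmetic, not data from the paper itself.
arrays_per_wafer = 10_000
yield_rate = 0.95           # "higher than 95 percent"
min_dots, max_dots = 3, 55  # quantum dots per array

working_arrays = int(arrays_per_wafer * yield_rate)
print(f"Working arrays per wafer: at least {working_arrays:,}")
print(f"Quantum dots per wafer: between {working_arrays * min_dots:,} "
      f"and {working_arrays * max_dots:,}")
```

Even at the conservative end, a single 300-millimeter wafer carries tens of thousands of quantum dots, which is what makes the wafer-scale statistics discussed later in the article possible.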

Clarke says attaining such a feat was non-trivial, made possible in large part by the fact that Intel, unlike most other companies pursuing quantum, runs its own fabs, which the company also uses to manufacture the control logic needed to support such a high density of qubits.

“What we’ve done is we’ve taken the university-like approach for fabricating qubits, and we have used the tools in our toolbox from our advanced transistor fab to make these devices with very high uniformity, very high yield and good performance,” Clarke tells The Next Platform.

When Intel started its quantum efforts in 2015 with QuTech, which is associated with the Delft University of Technology in the Netherlands, the two organizations explored multiple ways of making qubits. One promising avenue was the superconductor qubit, which allowed the company to produce a 17-qubit superconductor test chip in 2017.

But Clarke says Intel and QuTech eventually found greater capabilities with spin qubits, which involve “encoding the zero or one of the qubit into the spin of a single electron.” Each of these electrons is “essentially trapped in the channel of what looks like a transistor,” which is why the chipmaker has been able to use its transistor fabs to make these types of quantum chips.
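The encoding Clarke describes can be sketched numerically. The following is a minimal, idealized model of a single spin-1/2 qubit, with an X rotation standing in for the microwave pulse that flips the spin in real hardware; timing, decoherence and everything Intel-specific are ignored:

```python
import numpy as np

# Spin-1/2 encoding: |0> = spin up, |1> = spin down of one trapped electron.
up = np.array([1, 0], dtype=complex)    # |0>
down = np.array([0, 1], dtype=complex)  # |1>

# Pauli-X drives transitions between the two spin states (an idealized
# stand-in for a control pulse; real devices face noise and decoherence).
pauli_x = np.array([[0, 1], [1, 0]], dtype=complex)

def rotate_x(state, theta):
    """Rotate the spin about the X axis by angle theta."""
    rx = np.cos(theta / 2) * np.eye(2) - 1j * np.sin(theta / 2) * pauli_x
    return rx @ state

# A theta = pi rotation flips spin up into spin down.
flipped = rotate_x(up, np.pi)
prob_down = abs(np.vdot(down, flipped)) ** 2
print(f"P(spin down) after a pi rotation: {prob_down:.2f}")  # 1.00
```

Nothing here reflects Intel’s actual control electronics; the point is only that the qubit’s zero and one live in the two spin states of a single electron, exactly as in the quote above.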

The decision to forgo the superconductor qubit route, which other organizations are taking, has apparently paid off, according to Clarke, as Intel’s spin qubits are “roughly a million times smaller.”

“So while we’re not there today, over the future, we feel that we will be able to scale a lot faster, get to have a much higher density of qubits in our devices,” he says.

The ability to pack 10,000 arrays of spin qubits onto a single wafer comes with an exciting implication for Clarke, even though it’s currently theoretical.

“If we were to produce several of these wafers – or I should say, when we do, when we do this regularly – if we tested them all, we will have created more qubits across those wafers than any company has ever created in the lifetime of their experiments. That would be my assumption,” he says. “Universities are making these, and their research labs, they’re producing a couple at a time. Even in the superconducting space, I think the count would be a lot smaller.”

The other benefit Intel gets from manufacturing its own quantum chips is that, as with the other chips it develops, it can run statistical analyses to make further improvements.

“We can feed that information back to our fab to make better devices. We can then cherry-pick the best devices at that stage and feed it forward for further testing. So by having a wafer full of devices, we really get a massive amount of data, which actually allows us to go much faster,” Clarke says.
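That feedback loop is a familiar one from conventional semiconductor manufacturing. A hypothetical sketch of what wafer-scale screening might look like (Intel’s actual test metrics and thresholds are not public; the “charge noise” figure of merit and the cutoff below are stand-ins):

```python
import random

# Hypothetical wafer-screening sketch: the metric, its distribution and the
# selection cutoff are all assumptions, not Intel's published methodology.
random.seed(0)
num_devices = 10_000

# Simulated per-device figure of merit (lower charge noise is better).
devices = [{"id": i, "charge_noise": random.lognormvariate(0, 0.5)}
           for i in range(num_devices)]

# Statistics fed back to the fab to improve the process...
mean_noise = sum(d["charge_noise"] for d in devices) / num_devices

# ...while the best devices are cherry-picked for further testing.
best = sorted(devices, key=lambda d: d["charge_noise"])[:100]
print(f"Mean charge noise across wafer: {mean_noise:.3f}")
print(f"Best device: id {best[0]['id']}, noise {best[0]['charge_noise']:.3f}")
```

The design point Clarke is making is the data volume: with thousands of devices per wafer rather than a handful per lab run, both the process feedback and the device selection rest on real statistics instead of anecdotes.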

Even if this allows Intel to go faster, Clarke believes the industry is still roughly a decade away from having a quantum computer that can be used for practical purposes, in areas like cryptography, optimization, chemistry, materials and finance. That may seem like a long time, but when put into perspective with other technologies Intel has developed, the timeline doesn’t seem out of place.

“If you look at the timeline between the first transistor to the first integrated circuit to the first microprocessor, those timelines tend to happen on a 10 to 15 year time frame. And so in every big advancement that Intel has delivered – high-k metal gate, tri-gate – these all happened on a decadal type timeline. That’s not to say that people can’t develop faster, but these are hard things to do. Quantum is harder than making a transistor. So why would we expect that to happen quicker than a typical technology development cycle?” he says.



