Google says quantum computing applications are five years away

Google's Willow quantum chip. Image: Google

A few weeks ago at CES 2025, Nvidia CEO Jensen Huang posited that practical uses of quantum computing were about 20 years away. Today, Google’s head of quantum Hartmut Neven told Reuters that we could see real-world applications of quantum computing within five years. So, who is right?

According to Huang, current quantum systems don’t have enough “qubits.” In fact, they’re short by around five or six orders of magnitude. But why do we need so many? Well, current research suggests that more qubits result in fewer errors, creating more accurate quantum computers. Let’s talk about why that is.

A qubit is just what it sounds like: a quantum bit. It differs from a binary bit in a normal computer because it can encode more information at once. The problem with qubits is that they're quantum particles, and quantum particles don't always do what we want. When we run computations on a quantum computer, roughly one in every thousand qubits "fails" (i.e., stops doing what we want it to do) and throws off the results.
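To make the bit-versus-qubit difference concrete, here's a minimal toy sketch (not real quantum simulation — just illustrative): a classical bit holds exactly 0 or 1, while a qubit's state is a pair of amplitudes whose squares give the measurement probabilities.

```python
import random

# A classical bit holds exactly one of two values.
classical_bit = 0

# A toy qubit state is a pair of amplitudes (alpha, beta) with
# alpha^2 + beta^2 = 1. Measuring it yields 0 with probability
# alpha^2 and 1 with probability beta^2.
def measure(alpha: float, beta: float) -> int:
    """Collapse a (toy) qubit state to a classical bit."""
    assert abs(alpha**2 + beta**2 - 1.0) < 1e-9, "state must be normalized"
    return 0 if random.random() < alpha**2 else 1

# An equal superposition: measurements come out 0 or 1 with 50/50 odds.
equal = (2**-0.5, 2**-0.5)
samples = [measure(*equal) for _ in range(10_000)]
print(sum(samples) / len(samples))  # hovers near 0.5
```

This toy model ignores complex amplitudes, entanglement, and everything that makes quantum computing actually powerful, but it captures the core point: a qubit's value isn't fixed until it's measured, which is also why a qubit drifting out of its intended state silently corrupts a computation.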


Back in the day, we had a similar problem with traditional computers. The ENIAC, for example, used over 17,000 vacuum tubes to represent bits, and every couple of days a tube would fail and produce errors. But the solution there was straightforward: drop the vacuum tubes and find something that didn't fail so often. Jump forward a few decades, and we've got tiny silicon transistors with a failure rate of around one in a billion.

For quantum computing, that solution won’t work. Qubits are quantum particles, and quantum particles are what they are. We can’t build them out of something else and we can’t force them to stay in the state we want — we can only find ways to use them as they are.

This is where the "not enough qubits" part becomes relevant. Just last year, Google used its Willow quantum chip to demonstrate that more qubits can mean fewer errors. Essentially, Google built "logical" qubits out of multiple physical qubits that all encode the same data. This creates a system of failsafes: every time one physical qubit fails, others keep the computation on track. The more physical qubits you have, the more failures you can withstand, leaving you with a better chance of getting an accurate result.
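The intuition behind redundancy is easy to simulate. The sketch below uses a simple classical majority vote over redundant copies — a stand-in for real quantum error correction (Willow actually uses a surface code, which is far more involved), but it shows why piling on physical qubits drives the logical error rate down:

```python
import random

def logical_error_rate(n_physical: int, p_fail: float, trials: int = 10_000) -> float:
    """Estimate how often a majority vote over n_physical redundant copies
    reports the wrong value, given each copy independently flips with
    probability p_fail."""
    errors = 0
    for _ in range(trials):
        flips = sum(random.random() < p_fail for _ in range(n_physical))
        if flips > n_physical // 2:  # a majority of copies failed
            errors += 1
    return errors / trials

# With a 1-in-10 per-copy failure rate, redundancy sharply suppresses
# the logical error rate as the number of copies grows.
for n in (1, 3, 5, 9):
    print(n, logical_error_rate(n, p_fail=0.1))
```

The key condition is that the per-copy failure rate must already be below a threshold; otherwise adding copies makes things worse. That threshold behavior is precisely what Google's Willow result demonstrated for real qubits.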

However, since qubits fail a lot and we need to achieve a fairly high accuracy rate to start using quantum computers for real-world problems, we’re going to need a whole lot of qubits to get the job done. Huang thinks it will take as many as 20 years to get the numbers we need, while Neven is hinting that he can get there in five.

Does Google know something that Nvidia doesn't? Is it just fanning the flames of some friendly competition? Right now, we don't know the answer. Perhaps Neven just wanted to boost quantum computing stocks after Huang's comments wiped around $8 billion off their value last month.

Whenever the breakthrough does happen, Google thinks it can use quantum computing to build better batteries for electric cars, develop new drugs, and maybe even create new energy alternatives. To claim that such projects could become possible in as few as five years is pretty out there — but I suppose we won’t have to wait too long to find out how right or how wrong Neven is.

Willow Roberts

Willow Roberts has been a Computing Writer at Digital Trends for a year and has been writing for about a decade.
