Google has unveiled a new quantum computer and is once more claiming to have pulled ahead in the race to show that these exotic machines can beat even the world’s best conventional supercomputers – so does that mean useful quantum computers are finally here?
Researchers at the tech giant were the first in the world to demonstrate this feat, known as quantum supremacy, with the announcement of the Sycamore quantum computing chip in 2019. But since then, supercomputers have caught up, leaving Sycamore behind. Now, Google has produced a new quantum chip, called Willow, which Julian Kelly at Google Quantum AI says is the firm’s best yet.
“You can think of this as having all the advantages of Sycamore, but if you were to look under the hood, we changed the geometry… we reimagined the processor,” he says.
While the most advanced version of Sycamore boasted 67 qubits, or quantum bits, to process information, Willow has been upgraded to 105. Ideally, larger quantum computers should also be more powerful, but researchers have found that the qubits in larger devices struggle to remain coherent, losing their quantumness. The same challenge has confronted competitors IBM and the California-based start-up Atom Computing, both of which recently debuted quantum computers with more than 1000 qubits.
Kelly says that because of this, qubit quality has been a big focus for the team, and that Willow’s qubits can preserve their intricate quantum states – and therefore reliably encode information – more than five times longer than Sycamore’s can.
Google uses a specific benchmarking task called random circuit sampling (RCS) to assess its quantum computers’ performance, and Willow excelled at it, says Hartmut Neven, also at Google Quantum AI. The task involves running a randomly chosen quantum circuit on the chip and verifying that the bitstrings it outputs follow the distribution quantum theory predicts – one that is extraordinarily expensive to reproduce on a classical machine. For several years, Sycamore could do this faster than the world’s best supercomputers, but in 2022, and then again in 2024, new records were set by conventional computers.
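For the curious, here is a minimal sketch in Python of the idea behind RCS and the linear cross-entropy benchmark (XEB) commonly used to score it. The toy simulator, circuit shape, gate choices and sample counts below are illustrative assumptions, not Google’s actual benchmark code.

```python
# Toy random circuit sampling (RCS) scored with the linear
# cross-entropy benchmark (XEB). All parameters are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n_qubits = 4                 # real devices use dozens of qubits
dim = 2 ** n_qubits

def apply_1q(state, u, q):
    """Apply a single-qubit unitary u to qubit q of the statevector."""
    t = state.reshape([2] * n_qubits)
    t = np.moveaxis(np.tensordot(u, t, axes=([1], [q])), 0, q)
    return t.reshape(dim)

def apply_cz(state, q1, q2):
    """Controlled-Z gate: flip the sign where both qubits are 1."""
    t = state.reshape([2] * n_qubits).copy()
    idx = [slice(None)] * n_qubits
    idx[q1] = idx[q2] = 1
    t[tuple(idx)] *= -1
    return t.reshape(dim)

def random_circuit_probs(depth=8):
    """Output probabilities of a random circuit applied to |00...0>."""
    state = np.zeros(dim, dtype=complex)
    state[0] = 1.0
    for _ in range(depth):
        for q in range(n_qubits):        # random single-qubit rotations
            theta, phi = rng.uniform(0, 2 * np.pi, 2)
            u = np.array([[np.cos(theta), -np.exp(1j * phi) * np.sin(theta)],
                          [np.exp(-1j * phi) * np.sin(theta), np.cos(theta)]])
            state = apply_1q(state, u, q)
        for q in range(n_qubits - 1):    # entangling CZ gates
            state = apply_cz(state, q, q + 1)
    p = np.abs(state) ** 2
    return p / p.sum()

probs = random_circuit_probs()
ideal = rng.choice(dim, size=2000, p=probs)   # samples from a flawless chip
noise = rng.integers(dim, size=2000)          # pure random guessing

# Linear XEB: close to 1 for ideal samples, close to 0 for noise.
for label, samples in [("ideal sampler", ideal), ("random guessing", noise)]:
    print(f"{label}: XEB = {dim * probs[samples].mean() - 1:.2f}")
```

An ideal sampler scores close to 1, random guessing scores close to 0, and a real, noisy chip lands somewhere in between, so the benchmark measures how close a device’s outputs are to the distribution quantum theory predicts.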
Google says Willow has again widened the gap between quantum and traditional machines. The benchmark took 5 minutes on the chip, while the firm estimates it would take 10 septillion years on a leading supercomputer. That is 10^25 years – the universe is about 1.4 × 10^10 years old, so even squaring its age, to roughly 2 × 10^20, falls short by a factor of tens of thousands.
In this comparison, the researchers modelled a version of the Frontier supercomputer (which recently slipped to second place among the world’s most powerful machines) equipped with more memory than the real machine can use – which only underscores Willow’s computational power, says Neven. While Sycamore’s records were broken, he is confident that Willow will hold its champion status for much longer as conventional computing methods reach their limits.
What still isn’t clear is whether Willow can actually do anything useful, given that the RCS benchmark has no practical application. Kelly says succeeding at the benchmark is a “necessary but not sufficient” condition for a useful quantum computer: a chip that cannot excel at RCS stands no chance of being practical later.
But the Google team has another reason to believe in Willow’s bright future – it is very good at correcting its own errors. The propensity of quantum computers to make errors is one of the biggest issues currently preventing them from delivering on the promise of being more powerful than any other type of computer. To address this, researchers, including Google’s team, group many physical qubits together to form “logical qubits”, which are much more resilient to errors.
With Willow, the team showed that as the logical qubits were made larger, they got better at preventing errors, making around half as many errors as the physical qubits that make them up. What’s more, that error rate halved again when the logical qubits were roughly doubled in size. In this way, the Google researchers crossed a threshold beyond which they believe they can keep increasing the number of qubits – making larger and larger quantum computers – and have them get better and better at running calculations, a trend that had never been demonstrated before.
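To see why crossing this threshold matters, here is a back-of-envelope sketch, in Python, of the scaling it implies, taking the halving-per-step behaviour described above at face value. The starting error rate and the surface-code distances are hypothetical illustration values, not measured Willow figures.

```python
# Hypothetical illustration of below-threshold scaling: each step up in
# logical-qubit size (code distance) roughly halves the logical error
# rate, per the result described above. Starting figure is made up.
suppression_per_step = 2.0   # halving per code-distance step
logical_error = 3e-3         # assumed error rate for the smallest code

for distance in (3, 5, 7, 9, 11):
    print(f"code distance {distance:2d}: logical error ~ {logical_error:.1e}")
    logical_error /= suppression_per_step
```

If the trend holds, each modest increase in size buys a constant-factor drop in errors, so the logical error rate falls exponentially with code distance – which is why operating below this threshold is seen as the gateway to fault tolerance.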
“This is, in my opinion, a signature result, and while we are still a long way from demonstrating a practical quantum computer, it is an important and necessary step towards that goal,” says Andrew Cleland at the University of Chicago.
Martin Weides at the University of Glasgow, UK, says that the new work sets out a route towards building “fault-tolerant” quantum computers – those that could catch and correct all of their errors. Challenges remain, but these advancements pave the way for transformative applications in quantum chemistry, such as drug discovery and materials design, he says, as well as in cryptography and machine learning.
The focus on error correction across academic labs and the burgeoning quantum computing industry has made advances in logical qubits an important point of comparison between today’s best quantum computers. In 2023, a team of researchers at Harvard University and the start-up QuEra used qubits made from extremely cold rubidium atoms to set what was then the record for the most logical qubits ever created. Earlier this year, researchers at Microsoft and Atom Computing linked a record-breaking number of logical qubits through quantum entanglement.
Google’s approach is different because it focuses on making single logical qubits larger and larger, as well as better and better, instead of maximising their number. “We could divide our chip into smaller and smaller logical qubits and run algorithms, but we really wanted to get to this threshold. This is where all the underlying challenges with science and engineering [of quantum computing] are,” says Kelly.
Ultimately, however, the biggest test of Willow’s impact will be whether it can meet the goal that all other quantum computers are also chasing – to reliably compute something that is useful but impossible on any conventional computer. Neven says Sycamore has already been used to make scientific discoveries in quantum physics, but the team is setting its sights on more real-world applications with Willow. “We are moving towards new calculations and simulations that classical computers could not do.”