Quantum superposition has been used to compare data from two different sources more efficiently than is possible, even in principle, on a conventional computer. The scheme is called “quantum fingerprinting” and has been demonstrated by physicists in China. It could ultimately lead to better large-scale integrated circuits and more energy-efficient communication.
Quantum fingerprinting offers a way of minimizing the amount of information that is transferred between physically separated computers that are working together to solve a problem. It involves two people – Alice and Bob – each sending a file containing n bits of data to a third-party referee, whose job is to judge whether or not the two files are identical. A practical example could be a security system that compares a person’s fingerprint to a digital image.
Proposed theoretically in 2001, quantum fingerprinting can make a comparison in an exponentially more efficient way than is possible using conventional computers. While the only way to guarantee a complete comparison is to send the two files in their entirety, it turns out that a reasonably accurate comparison can be achieved by sending a number of bits proportional only to the square root of the files’ length.
Quantum mechanics allows comparisons with even less data because a quantum bit (qubit) of information can exist not just as a zero or a one but, in principle at least, also in an infinite number of intermediate states. The vast increase in the number of possible combinations of states for a given number of qubits means that the number of physical bits that need to be transmitted scales logarithmically with the number of bits in the two files. As such, quantum fingerprinting permits an exponential reduction in the amount of data transmitted compared with classical algorithms.
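The gulf between the two scalings is easy to see numerically. The short sketch below compares the square-root and logarithmic costs quoted above for a few file sizes; the constant factors are illustrative assumptions, not figures from the actual protocols.

```python
import math

# Compare the asymptotic communication costs quoted in the text:
# classical fingerprinting needs ~sqrt(n) bits, quantum ~log2(n) qubits.
# (Constant factors here are illustrative, not from the real protocols.)
for n in [10**6, 10**9, 10**12]:          # file sizes in bits
    sqrt_cost = math.isqrt(n)             # classical: ~ sqrt(n)
    log_cost = math.ceil(math.log2(n))    # quantum: ~ log2(n)
    print(f"n = {n:>13}: classical ~{sqrt_cost:,} bits, quantum ~{log_cost} qubits")
```

For a terabit-scale file, the classical cost is about a million bits while the quantum cost is only a few dozen qubits, which is the exponential gap the article refers to.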
The original proposal for quantum fingerprinting involved using log n highly entangled qubits, which Norbert Lütkenhaus of the University of Waterloo in Canada says is still many more qubits than can be implemented using today’s technology. In 2014 he and Juan Miguel Arrazola, now at the National University of Singapore, unveiled a more practical scheme. This involves Alice and Bob encoding their n bits in the optical phase of a series of laser pulses, and then sending those pulses to a beam splitter (the referee). The pairs of pulses arrive at the beam splitter one at a time – if the two pulses have the same phase they exit from one port, whereas opposite phases cause them to leave from a second port. In this way, the two files are judged to be identical if there is no signal at the second port.
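The referee’s logic in Lütkenhaus and Arrazola’s scheme can be sketched in a few lines. This toy model makes the idealized assumption that every pulse pair is bright enough to register at a detector; it only captures the interference rule described above (matching phases exit one port, opposite phases exit the other), not the real optics.

```python
# Toy model of the referee's beam-splitter test (idealized: every pulse
# pair is assumed bright enough to register at a detector).
# Each bit is encoded as an optical phase: 0 -> phase 0, 1 -> phase pi.
# Interfering two equal-amplitude pulses on a 50:50 beam splitter sends
# the light to port 1 when phases match and to port 2 when they differ.

def referee(alice_bits, bob_bits):
    port2_clicks = sum(a != b for a, b in zip(alice_bits, bob_bits))
    # Any click at port 2 means the files cannot be identical.
    return "identical" if port2_clicks == 0 else "different"

print(referee([0, 1, 1, 0], [0, 1, 1, 0]))  # prints "identical"
print(referee([0, 1, 1, 0], [0, 1, 0, 0]))  # prints "different"
```

In the real experiment the pulses are far too weak for every mismatch to produce a click, which is exactly the trade-off discussed next.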
The ramp-up in efficiency comes from the fact that each pulse is made from attenuated laser light containing, on average, only a tiny fraction of a single photon. As a result, the n pulses can be sent using a total of just log n photons. As Lütkenhaus points out, however, the number of photons cannot be made arbitrarily small: for the referee to obtain the right answer – that the files are or are not identical – there needs to be a reasonable chance that a photon is detected when the phases are different. “The scheme gives us an asymptotically accurate result,” he says. “The more photons I put in, the closer I get to the black and white probability.”
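Lütkenhaus’s trade-off can be made concrete with a back-of-envelope model. The sketch below assumes Poisson photon statistics for attenuated laser pulses and an ideal detector – both my assumptions for illustration, not figures from the experiment – and shows how the chance of catching a mismatch grows with the mean photon number per pulse.

```python
import math

# Back-of-envelope model of why the mean photon number per pulse (mu)
# cannot be made arbitrarily small. Assumes Poisson statistics for
# attenuated laser light and an ideal detector (illustrative only).
def p_detect_any(mu, mismatches):
    # Each mismatched pulse pair directs its light to the "different"
    # port; the chance of at least one click across all mismatched
    # pairs is then 1 - exp(-mu * mismatches).
    return 1 - math.exp(-mu * mismatches)

for mu in [0.0001, 0.001, 0.01]:
    p = p_detect_any(mu, 1000)  # suppose 1000 pulse pairs differ
    print(f"mu = {mu}: P(at least one click) = {p:.3f}")
```

With a thousand mismatched pulses, a mean photon number of 10⁻⁴ gives only about a 10% chance of any click, while 10⁻² makes detection near-certain – the “black and white probability” Lütkenhaus describes.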
Last year, Lütkenhaus and Arrazola, working with Hoi-Kwong Lo, Feihu Xu and other physicists at the University of Toronto, put the scheme into practice by modifying a quantum-key-distribution system sold commercially by the firm ID Quantique in Geneva. They showed that they could match files as large as 100 megabits using less information than is possible with the best-known classical protocol. They did admit, however, that their scheme, while more energy efficient, took more time to carry out.
Now, a group led by Jian-Wei Pan and Qiang Zhang of the University of Science and Technology of China in Hefei has beaten not only the best existing classical protocol but also the theoretical classical limit (which is some two orders of magnitude lower). The researchers did so by using more tailor-made equipment: they designed a novel kind of interferometer and employed superconducting photon detectors rather than standard avalanche ones, which reduced the number of false-positive signals from the beam splitter and so improved the accuracy of the yes/no outputs.
Pan and colleagues successfully compared two roughly two-gigabit video files by transmitting just 1300 photons along 20 km of spooled fibre-optic cable, which is about half of what would be needed classically. Next, they plan to test their system by placing Alice, Bob and the referee at different points in a city such as Shanghai.
Despite Pan’s demonstration, Lütkenhaus thinks that quantum fingerprinting probably won’t be commercialized because its superiority over classical systems depends on fairly artificial conditions, such as the referee being unable to talk back to Alice and Bob. However, he says that the research “opens the door” to other, potentially more useful, applications. One example is database searching when the searcher doesn’t have access to the whole database, while the owner of the database can’t see the search terms. “For this, we have made a protocol but not the technology,” he says.
The work is reported on the arXiv preprint server.