QT/ Symmetric graphene quantum dots for future qubits

May 11th 2023

Quantum news biweekly vol.51, 27th April — 11th May


TL;DR

  • Quantum dots in semiconductors such as silicon or gallium arsenide have long been considered hot candidates for hosting quantum bits in future quantum processors. Scientists have now shown that bilayer graphene has even more to offer here than other materials. The double quantum dots they have created are characterized by a nearly perfect electron-hole-symmetry that allows a robust read-out mechanism — one of the necessary criteria for quantum computing.
  • An international team of researchers has developed a comprehensive manual for engineering spin dynamics in nanomagnets — an important step toward advancing spintronic and quantum-information technologies.
  • Researchers have demonstrated a prototype lidar system that uses quantum detection technology to acquire 3D images while submerged underwater. The high sensitivity of this system could allow it to capture detailed information even in extremely low-light conditions found underwater.
  • Using a “spooky” phenomenon of quantum physics, researchers have discovered a way to double the resolution of light microscopes.
  • Large numbers can only be factorized with a great deal of computational effort. Physicists are now providing a blueprint for a new type of quantum computer to solve the factorization problem, which is a cornerstone of modern cryptography.
  • Researchers raise fundamental questions about the proposed value of topological protection against backscattering in integrated photonics.
  • In a unique analysis of experimental data, nuclear physicists have made observations of how lambda particles, so-called ‘strange matter,’ are produced by a specific process called semi-inclusive deep inelastic scattering (SIDIS). What’s more, these data hint that the building blocks of protons, quarks and gluons, are capable of marching through the atomic nucleus in pairs called diquarks, at least part of the time.
  • By superimposing two laser fields of different strengths and frequency, the electron emission of metals can be measured and controlled precisely to a few attoseconds. Physicists have shown that this is the case. The findings could lead to new quantum-mechanical insights and enable electronic circuits that are a million times faster than today.
  • A team of researchers has demonstrated the ultimate sensitivity allowed by quantum physics in measuring the time delay between two photons. This breakthrough has significant implications for a range of applications, including more feasible imaging of nanostructures, including biological samples, and nanomaterial surfaces, as well as quantum enhanced estimation based on frequency-resolved boson sampling in optical networks.
  • A team of physicists has illuminated certain properties of quantum systems by observing how their fluctuations spread over time. The research offers an intricate understanding of a complex phenomenon that is foundational to quantum computing.
  • And more!

 

Quantum Computing Market

According to the recent market research report ‘Quantum Computing Market with COVID-19 impact by Offering (Systems and Services), Deployment (On Premises and Cloud Based), Application, Technology, End-use Industry and Region — Global Forecast to 2026’, published by MarketsandMarkets, the Quantum Computing market is expected to grow from USD 472 million in 2021 to USD 1,765 million by 2026, at a CAGR of 30.2%. The early adoption of quantum computing in the banking and finance sector is expected to fuel the growth of the market globally. Other key factors contributing to the growth of the quantum computing market include rising investments by governments of different countries to carry out research and development activities related to quantum computing technology. Several companies are focusing on the adoption of QCaaS post-COVID-19. This, in turn, is expected to contribute to the growth of the quantum computing market. However, stability and error correction issues are expected to restrain the growth of the market.

According to the ‘Quantum Computing Market Research Report: By Offering, Deployment Type, Application, Technology, Industry — Industry Share, Growth, Drivers, Trends and Demand Forecast to 2030’ report, the quantum computing market is projected to reach $64,988 million by 2030. Machine learning (ML) is expected to progress at the highest CAGR among all application categories during the forecast period, because quantum computing is increasingly being integrated into ML to improve its applications.
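As a quick arithmetic check on the first forecast (an illustration, not taken from either report), compounding the 2021 figure at the stated CAGR reproduces the 2026 projection:

```python
# Sanity check: USD 472 million (2021) compounded at a 30.2% CAGR
# over the 5 years from 2021 to 2026.
start, cagr, years = 472.0, 0.302, 5
projected = start * (1 + cagr) ** years
print(round(projected))  # ~1766, in line with the reported USD 1,765 million
```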

 

Latest Research

Particle–hole symmetry protects spin-valley blockade in graphene quantum dots

by L. Banszerus, S. Möller, K. Hecker, E. Icking, K. Watanabe, T. Taniguchi, F. Hassler, C. Volk, C. Stampfer in Nature

Quantum dots in semiconductors such as silicon or gallium arsenide have long been considered hot candidates for hosting quantum bits in future quantum processors. Scientists at Forschungszentrum Jülich and RWTH Aachen University have now shown that bilayer graphene has even more to offer here than other materials. The double quantum dots they have created are characterized by a nearly perfect electron-hole-symmetry that allows a robust read-out mechanism — one of the necessary criteria for quantum computing.

The development of robust semiconductor spin qubits could help realize large-scale quantum computers in the future. However, current quantum-dot-based qubit systems are still in their infancy. In 2022, researchers at QuTech in the Netherlands created six silicon-based spin qubits for the first time. With graphene, there is still a long way to go. The material, first isolated in 2004, is highly attractive to many scientists, but the realization of its first quantum bit has yet to come.

“Bilayer graphene is a unique semiconductor,” explains Prof. Christoph Stampfer of Forschungszentrum Jülich and RWTH Aachen University. “It shares several properties with single-layer graphene and also has some other special features. This makes it very interesting for quantum technologies.”

Charge stability diagrams for opposite bias voltages in DQD #1.

One of these features is a bandgap that can be tuned by an external electric field from zero to about 120 millielectronvolts (meV). The bandgap can be used to confine charge carriers in individual areas, so-called quantum dots. Depending on the applied voltage, these can trap a single electron or its counterpart, a hole — basically a missing electron in the solid-state structure. The possibility of using the same gate structure to trap both electrons and holes is a feature that has no counterpart in conventional semiconductors.

“Bilayer graphene is still a fairly new material. So far, mainly experiments that have already been realized with other semiconductors have been carried out with it. Our current experiment now goes really beyond this for the first time,” Christoph Stampfer says. He and his colleagues have created a so-called double quantum dot: two opposing quantum dots, each housing an electron and a hole whose spin properties mirror each other almost perfectly.

Additional data set for another electron-hole double quantum dot (DQD #2) in the same device.

“This symmetry has two remarkable consequences: it is almost perfectly preserved even when electrons and holes are spatially separated in different quantum dots,” Stampfer said. This mechanism can be used to couple qubits to other qubits over a longer distance. And what’s more, “the symmetry results in a very robust blockade mechanism which could be used to read out the spin state of the dot with high fidelity.”

“This goes beyond what can be done in conventional semiconductors or any other two-dimensional electron systems,” says Prof. Fabian Hassler of the JARA Institute for Quantum Information at Forschungszentrum Jülich and RWTH Aachen University, co-author of the study. “The near-perfect symmetry and strong selection rules are very attractive not only for operating qubits, but also for realizing single-particle terahertz detectors. In addition, it lends itself to coupling quantum dots of bilayer graphene with superconductors, two systems in which electron-hole symmetry plays an important role. These hybrid systems could be used to create efficient sources of entangled particle pairs or artificial topological systems, bringing us one step closer to realizing topological quantum computers.”

 

Controlling Selection Rules for Magnon Scattering in Nanomagnets by Spatial Symmetry Breaking

by Arezoo Etesamirad, Julia Kharlan, Rodolfo Rodriguez, Igor Barsukov, Roman Verba in Physical Review Applied

An international team of researchers at the University of California, Riverside, and the Institute of Magnetism in Kyiv, Ukraine, has developed a comprehensive manual for engineering spin dynamics in nanomagnets — an important step toward advancing spintronic and quantum-information technologies.

Despite their small size, nanomagnets — found in most spintronic applications — reveal rich dynamics of spin excitations, or “magnons,” the quantum-mechanical units of spin fluctuations. Due to its nanoscale confinement, a nanomagnet can be considered to be a zero-dimensional system with a discrete magnon spectrum, similar to the spectrum of an atom.

“The magnons interact with each other, thus constituting nonlinear spin dynamics,” said Igor Barsukov, an assistant professor of physics and astronomy at UC Riverside and a corresponding author on the study. “Nonlinear spin dynamics is a major challenge and a major opportunity for improving the performance of spintronic technologies such as spin-torque memory, oscillators, and neuromorphic computing.”

(a) The sample model is a thin elliptical disk in a bias magnetic field Be. (b) Bias field dependence of the first six spin-wave modes’ eigenfrequencies for Be∥ex.

Barsukov explained that the interaction of magnons follows a set of rules — the selection rules. The researchers have now postulated these rules in terms of symmetries of magnetization configurations and magnon profiles. The new work continues the efforts to tame nanomagnets for next-generation computation technologies. In a previous publication, the team demonstrated experimentally that symmetries can be used for engineering magnon interactions.

“We recognized the opportunity, but also noticed that much work needed to be done to understand and formulate the selection rules,” Barsukov said.

According to the researchers, a comprehensive set of rules reveals the mechanisms behind the magnon interaction.

“It can be seen as a guide for spintronics labs for debugging and designing nanomagnet devices,” said Arezoo Etesamirad, the first author of the paper who worked in the Barsukov lab and recently graduated with a doctoral degree in physics. “It lays the foundation for developing an experimental toolset for tunable magnetic neurons, switchable oscillators, energy-efficient memory, and quantum-magnonic and other next-generation nanomagnetic applications.”

 

Submerged single-photon LiDAR imaging sensor used for real-time 3D scene reconstruction in scattering underwater environments

by Aurora Maccarone, Kristofer Drummond, Aongus McCarthy, Ulrich K. Steinlehner, Julian Tachella, Diego Aguirre Garcia, Agata Pawlikowska, Robert A. Lamb, Robert K. Henderson, Stephen McLaughlin, Yoann Altmann, Gerald S. Buller in Optics Express

For the first time, researchers have demonstrated a prototype lidar system that uses quantum detection technology to acquire 3D images while submerged underwater. The high sensitivity of this system could allow it to capture detailed information even in extremely low-light conditions found underwater.

“This technology could be useful for a wide range of applications,” said research team member Aurora Maccarone, a Royal Academy of Engineering research fellow from Heriot-Watt University in the United Kingdom. “For example, it could be used to inspect underwater installations, such as underwater wind farm cables and the submerged structure of the turbines. Underwater lidar can also be used for monitoring or surveying submerged archaeology sites and for security and defense applications.”

Obtaining 3D images through ocean water can be challenging because it is light-limited, and any particles in the water will scatter light and distort the image. However, single-photon detection, which is a quantum-based technique, allows very high penetration and works even in low-light conditions.

Researchers from Heriot-Watt University and the University of Edinburgh describe experiments in which an entire single-photon lidar system was submerged in a large water tank. The new demonstrations bring the technology closer to practical applications compared to the research team’s earlier experiments with underwater single-photon detection, which were performed in carefully controlled laboratory conditions with the optical setup placed outside the water tank and data analysis performed offline. They also implemented new hardware and software developments that allow the 3D images acquired by the system to be reconstructed in real time.

“This work aims to make quantum detection technologies available for underwater applications, which means that we will be able to image the scene of interest in very low light conditions,” said Maccarone. “This will impact the use of offshore cable and energy installations, which are used by everyone. This technology could also allow monitoring without the presence of humans, which would mean less pollution and a less invasive presence in the marine environment.”

(a) Schematic of the underwater transceiver. The optical setup included a fiber collimation package (FCP), an optical diffuser (D), lenses (L1 and L2), band pass filters (BP), and the SPAD detector array. The optical setup was placed in a watertight enclosure which was connected to the equipment outside the tank via an umbilical cord. (b) Photograph of the optical setup based on the SPAD detector array.

Lidar systems create images by measuring how long it takes laser light to be reflected from objects in the scene and travel back to the system’s receiver, known as the “time of flight.” In the new work, the researchers sought to develop a way to acquire 3D images of targets that are obscured by turbid water and thus not visible to conventional lidar imaging systems. They designed a lidar system that uses a green pulsed laser source to illuminate the scene of interest. The reflected pulsed illumination is detected by an array of single-photon detectors, which allows ultrafast low light detection and greatly reduces measurement time in photon-starved environments such as highly attenuating water.

“By taking time-of-flight measurements with picosecond timing resolution, we can routinely resolve millimeter details of the targets in the scene,” said Maccarone. “Our approach also allows us to distinguish the photons reflected by the target from those reflected by particles in the water, making it particularly suitable for performing 3D imaging in highly turbid waters where optical scattering can ruin image contrast and resolution.”
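The picosecond-to-millimetre connection in the quote can be worked out from the round-trip time-of-flight geometry (a back-of-the-envelope sketch, with an assumed refractive index for water):

```python
# Round-trip time of flight: depth = (speed of light in water) * time / 2.
c = 2.998e8        # speed of light in vacuum, m/s
n_water = 1.33     # refractive index of water (assumed typical value)
dt = 1e-12         # 1 picosecond timing resolution

dz = (c / n_water) * dt / 2   # divide by 2: the pulse travels out and back
print(f"{dz * 1e3:.3f} mm per picosecond")  # ~0.113 mm
```

A timing jitter of a few picoseconds therefore still resolves sub-millimetre to millimetre depth differences, consistent with the quoted resolution.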

The fact that this approach requires thousands of single-photon detectors, all producing many hundreds of events per second, makes it extremely challenging to retrieve and process the data necessary to reconstruct the 3D image in a short time, especially for real-time applications. To solve this problem, the researchers developed algorithms specifically for imaging in highly scattering conditions and applied them in conjunction with widely available graphics processing unit (GPU) hardware.

The new technique builds on some important technological advances. “Heriot-Watt University has a long track record in single-photon detection techniques and image processing of single-photon data, which allowed us to demonstrate advanced single-photon imaging in extremely challenging conditions,” said Maccarone. “The University of Edinburgh has achieved fundamental advances in the design and fabrication of single-photon avalanche diode detector arrays, which allowed us to build compact and robust imaging systems based on quantum detection technologies.”

After optimizing the optical setup on a laboratory optical bench, the researchers connected the lidar system to a GPU to achieve real-time processing of the data while also implementing a number of image processing approaches for three-dimensional imaging. Once the system was working properly, they moved it to a tank that was 4 meters long, 3 meters wide, and 2 meters deep. With the system submerged in the water, the researchers added a scattering agent in a controlled manner to make the water more turbid. Experiments at three different turbidity levels demonstrated successful imaging in controlled highly scattering scenarios at distances of 3 meters.

“Single-photon technologies are rapidly developing, and we have demonstrated very promising results in underwater environments,” said Maccarone. “The approach and image processing algorithms could also be used in a wider range of scenarios for improved vision in free space such as in fog, smoke or other obscurants.”

 

Quantum microscopy of cells at the Heisenberg limit

by Zhe He, Yide Zhang, Xin Tong, Lei Li, Lihong V. Wang in Nature Communications

Using a “spooky” phenomenon of quantum physics, Caltech researchers have discovered a way to double the resolution of light microscopes.

In a paper, a team led by Lihong Wang, Bren Professor of Medical Engineering and Electrical Engineering, shows the achievement of a leap forward in microscopy through what is known as quantum entanglement. Quantum entanglement is a phenomenon in which two particles are linked such that the state of one particle is tied to the state of the other particle regardless of whether the particles are anywhere near each other. Albert Einstein famously referred to quantum entanglement as “spooky action at a distance” because it could not be explained by his relativity theory.

According to quantum theory, any type of particle can be entangled. In the case of Wang’s new microscopy technique, dubbed quantum microscopy by coincidence (QMC), the entangled particles are photons. Collectively, two entangled photons are known as a biphoton, and, importantly for Wang’s microscopy, they behave in some ways as a single particle that has double the momentum of a single photon.

Coincidence measurement of QMC.

Since quantum mechanics says that all particles are also waves, and that the wavelength of a wave is inversely related to the momentum of the particle, particles with larger momenta have smaller wavelengths. So, because a biphoton has double the momentum of a photon, its wavelength is half that of the individual photons. This is key to how QMC works. A microscope can only image the features of an object whose minimum size is half the wavelength of light used by the microscope. Reducing the wavelength of that light means the microscope can see even smaller things, which results in increased resolution.

Quantum entanglement is not the only way to reduce the wavelength of light being used in a microscope. Green light has a shorter wavelength than red light, for example, and purple light has a shorter wavelength than green light. But due to another quirk of quantum physics, light with shorter wavelengths carries more energy. So, once you get down to light with a wavelength small enough to image tiny things, the light carries so much energy that it will damage the items being imaged, especially living things such as cells. This is why ultraviolet (UV) light, which has a very short wavelength, gives you a sunburn. QMC gets around this limit by using biphotons that carry the lower energy of longer-wavelength photons while having the shorter wavelength of higher-energy photons.

“Cells don’t like UV light,” Wang says. “But if we can use 400-nanometer light to image the cell and achieve the effect of 200-nm light, which is UV, the cells will be happy, and we’re getting the resolution of UV.”
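Wang’s 400 nm example follows directly from the de Broglie relation, wavelength = h / momentum (a simple worked illustration, not code from the study):

```python
# De Broglie relation: wavelength = h / momentum, so doubling the
# momentum halves the effective wavelength.
h = 6.626e-34              # Planck constant, J*s
lam_single = 400e-9        # single-photon wavelength from Wang's example
p_single = h / lam_single  # momentum of one 400 nm photon
p_biphoton = 2 * p_single  # the biphoton carries double the momentum
lam_biphoton = h / p_biphoton
print(lam_biphoton * 1e9)  # ~200 nm: UV-scale resolution from visible light
```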

To achieve that, Wang’s team built an optical apparatus that shines laser light into a special kind of crystal that converts some of the photons passing through it into biphotons. Even using this special crystal, the conversion is very rare and occurs in about one in a million photons. Using a series of mirrors, lenses, and prisms, each biphoton — which actually consists of two discrete photons — is split up and shuttled along two paths, so that one of the paired photons passes through the object being imaged and the other does not. The photon passing through the object is called the signal photon, and the one that does not is called the idler photon. These photons then continue along through more optics until they reach a detector connected to a computer that builds an image of the cell based on the information carried by the signal photon. Amazingly, the paired photons remain entangled as a biphoton behaving at half the wavelength despite the presence of the object and their separate pathways.

Wang’s lab was not the first to work on this kind of biphoton imaging, but it was the first to create a viable system using the concept. “We developed what we believe is a rigorous theory as well as a faster and more accurate entanglement-measurement method. We reached microscopic resolution and imaged cells.”

While there is no theoretical limit to the number of photons that can be entangled with each other, each additional photon would further increase the momentum of the resulting multiphoton while further decreasing its wavelength.

Wang says future research could enable entanglement of even more photons, although he notes that each extra photon further reduces the probability of a successful entanglement, which, as mentioned above, is already as low as a one-in-a-million chance.

 

Scalable set of reversible parity gates for integer factorization

by Martin Lanthaler, Benjamin E. Niehoff, Wolfgang Lechner in Communications Physics

Today’s computers are based on microprocessors that execute so-called gates. A gate can, for example, be an AND operation, i.e. an operation that outputs 1 only if both input bits are 1. These gates, and thus computers, are irreversible. That is, algorithms cannot simply run backwards.

“If you take the multiplication 2*2=4, you cannot simply run this operation in reverse, because 4 could be 2*2, but likewise 1*4 or 4*1,” explains Wolfgang Lechner, professor of theoretical physics at the University of Innsbruck. If this were possible, however, it would be feasible to factorize large numbers, i.e. divide them into their factors, which is an important pillar of cryptography.
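Lechner’s 2*2 example is easy to see in code: multiplication is a single forward computation, while naive factoring must search over candidates (an illustration of the asymmetry only, not the paper’s quantum-optimization encoding):

```python
def multiply(a, b):
    # Forward direction: one deterministic step.
    return a * b

def factor_pairs(n):
    # Reverse direction: brute force over every candidate factor.
    return [(a, n // a) for a in range(1, n + 1) if n % a == 0]

print(multiply(2, 2))   # 4
print(factor_pairs(4))  # [(1, 4), (2, 2), (4, 1)]: the ambiguity Lechner describes
```

For cryptographically large numbers this search grows exponentially in the bit length, which is why the authors recast factorization as a ground-state problem instead.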

Martin Lanthaler, Ben Niehoff and Wolfgang Lechner from the Department of Theoretical Physics at the University of Innsbruck and the quantum spin-off ParityQC have now developed exactly this inversion of algorithms with the help of quantum computers. The starting point is a classical logic circuit, which multiplies two numbers. If two integers are entered as the input value, the circuit returns their product. Such a circuit is built from irreversible operations.

Parity gates.

“However, the logic of the circuit can be encoded within ground states of a quantum system,” explains Martin Lanthaler from Wolfgang Lechner’s team. “Thus, both multiplication and factorization can be understood as ground-state problems and solved using quantum optimization methods.”

“The core of our work is the encoding of the basic building blocks of the multiplier circuit, specifically AND gates, half and full adders with the parity architecture as the ground state problem on an ensemble of interacting spins,” says Martin Lanthaler. The coding allows the entire circuit to be built from repeating subsystems that can be arranged on a two-dimensional grid. By stringing several of these subsystems together, larger problem instances can be realized. Instead of the classical brute force method, where all possible factors are tested, quantum methods can speed up the search process: To find the ground state, and thus solve an optimization problem, it is not necessary to search the whole energy landscape, but deeper valleys can be reached by “tunneling.”

 

Observation of strong backscattering in valley-Hall photonic topological interface modes

by Christian Anker Rosiek, Guillermo Arregui, Anastasiia Vladimirova, Marcus Albrechtsen, Babak Vosoughi Lahijani, Rasmus Ellebæk Christiansen, Søren Stobbe in Nature Photonics

The field of integrated photonics has taken off in recent years. These microchips utilise light particles (photons) in their circuitry as opposed to the electronic circuits that, in many ways, form the backbone of our modern age. Offering improved performance, reliability, energy efficiency, and novel functionalities, integrated photonics has immense potential and is fast becoming a part of the infrastructure in data centres and telecom systems, while also being a promising contender for a wide range of sensors and integrated quantum technologies.

Significant improvements in nanoscale fabrication have made it possible to build photonic circuits with minimal defects, but defects can never be entirely avoided, and losses due to disorder remain a limiting factor in today’s technology. Minimising these losses could, for example, reduce the energy consumption in communication systems and further improve the sensitivity of sensor technology. And since photonic quantum technologies rely on encoding information in fragile quantum states, minimising losses is essential to scale quantum photonics to real applications. So the search is on for new ways to reduce the backscattering, or even prevent it entirely. One suggestion for minimising the loss of photons in an integrated photonic system is to guide the light through the circuit using topological interfaces that prevent backscattering by design.

“It would be very nice if it were possible to reduce losses in these systems. But fundamentally, creating such a one-way street for photons is a tough thing to do. In fact, as of right now, it is impossible; to do this in the optical domain would require developing new materials that do not exist today,” says Associate Professor Søren Stobbe, Group Leader at DTU Electro.

VH photonic crystals and interface states.

Circuitry built from topological insulators would, in theory, force photons to keep moving forward, never backward. The backwards channel would simply not exist. While such effects are well-known in niche electronics and have been demonstrated with microwaves, they have yet to be shown in the optical domain. But full topological protection is impossible in silicon and all other low-loss photonic materials, because they are subject to time-reversal symmetry: whenever a waveguide allows light to be transmitted in one direction, the backwards path is also possible. There is thus no one-way street for photons in conventional materials, but researchers have hypothesized that a two-way street would already be good enough to prevent backscattering.

“There has been a lot of work trying to realise topological waveguides in platforms relevant for integrated photonics. One of the most interesting platforms is silicon photonics, which uses the same materials and technology that underpin today’s ubiquitous computer chips to build photonic systems, and even if disorder cannot be entirely eliminated, perhaps backscattering can,” says Søren Stobbe.

Although several previous studies have suggested, based on various indirect observations, that it may be possible to prevent backscattering, rigorous measurements of the losses and the backscattering in topological waveguides had so far been missing. The central experiments conducted at DTU were performed on well-characterised, state-of-the-art silicon waveguides and showed that even in the best waveguides available, the topological waveguides offer no protection against backscattering.

“We fabricated the best waveguide obtainable with current technology — reporting the smallest losses ever seen and reaching minute levels of structural disorder — but we never saw topological protection against backscattering. If the two-way topological insulators protect against backscattering, they would only be effective at disorder levels below what is possible today,” says PhD student Christian Anker Rosiek. He conducted most of the fabrication, experiments and data analysis along with postdoc Guillermo Arregui, both at DTU Electro.

“Measuring the losses alone is crucial, but not enough, because losses can also come from radiation out of the waveguide. We can see from our experiments that the photons get caught in little randomly located cavities in the waveguide, as if many tiny mirrors had been randomly placed in the light’s path. Here, the light is reflected back and forth, scattering very strongly on those defects. It shows that the backscattering strength is high, even in a state-of-the-art system, proving that backscattering is the limiting factor,” says Guillermo Arregui.

The study concludes that, for a waveguide to offer protection against backscattering, you would need the topological insulator to be constructed from materials that break time-reversal symmetry without absorbing light. Such materials do not exist today.

“We are not ruling out that protection from backscattering can work, and absence of evidence must not be confused with evidence of absence. There is plenty of exciting research to be explored within topological physics, but moving forward, I believe researchers should take great care in measuring losses when presenting new topological waveguides. That way, we will get a clearer picture of the true potential of these structures. Should someone indeed develop new, exotic materials that allow propagation in only one direction, our study has established the tests needed to claim real protection against backscattering,” says Christian Anker Rosiek.

 

First Measurement of Λ Electroproduction off Nuclei in the Current and Target Fragmentation Regions

by T. Chetry, L. El Fassi, W. K. Brooks, R. Dupré, et al in Physical Review Letters

In a unique analysis of experimental data, nuclear physicists have made the first-ever observations of how lambda particles, so-called “strange matter,” are produced by a specific process called semi-inclusive deep inelastic scattering (SIDIS). What’s more, these data hint that the building blocks of protons, quarks and gluons, are capable of marching through the atomic nucleus in pairs called diquarks, at least part of the time. These results come from an experiment conducted at the U.S. Department of Energy’s Thomas Jefferson National Accelerator Facility.

It’s a result that has been decades in the making. The dataset was originally collected in 2004. Lamiaa El Fassi, now an associate professor of physics at Mississippi State University and principal investigator of the work, first analyzed these data during her thesis project to earn her graduate degree on a different topic. Nearly a decade after completing her initial research with these data, El Fassi revisited the dataset and led her group through a careful analysis to yield these unprecedented measurements. The dataset comes from experiments in Jefferson Lab’s Continuous Electron Beam Accelerator Facility (CEBAF), a DOE user facility. In the experiment, nuclear physicists tracked what happened when electrons from CEBAF scatter off the target nucleus and probe the confined quarks inside protons and neutrons.

“These studies help build a story, analogous to a motion picture, of how the struck quark turns into hadrons. In a new paper, we report first-ever observations of such a study for the lambda baryon in the forward and backward fragmentation regions,” El Fassi said.

An illustration of the hadronization process and its production (τ_p) and formation (τ_f) timescales.

Like the more familiar protons and neutrons, each lambda is made up of three quarks. Unlike protons and neutrons, which only contain a mixture of up and down quarks, lambdas contain one up quark, one down quark and one strange quark. Physicists have dubbed matter that contains strange quarks “strange matter.” In this work, El Fassi and her colleagues studied how these particles of strange matter form from collisions of ordinary matter. To do so, they shot CEBAF’s electron beam at different targets, including carbon, iron, and lead. When a high-energy electron from CEBAF reaches one of these targets, it breaks apart a proton or neutron inside one of the target’s nuclei.
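The quark assignments above can be checked with a few lines of arithmetic: a baryon’s electric charge is the sum of its quarks’ fractional charges. A minimal illustration (not part of the analysis itself), using the standard charge assignments:

```python
from fractions import Fraction

# Electric charge, in units of e, of the quark flavors involved.
QUARK_CHARGE = {"u": Fraction(2, 3), "d": Fraction(-1, 3), "s": Fraction(-1, 3)}

def baryon_charge(quarks: str) -> Fraction:
    """Total electric charge of a baryon from its three-quark content."""
    return sum(QUARK_CHARGE[q] for q in quarks)

print(baryon_charge("uud"))  # proton: 1
print(baryon_charge("udd"))  # neutron: 0
print(baryon_charge("uds"))  # lambda: 0 (neutral, like the neutron)
```

Swapping a down quark for a strange quark leaves the charge unchanged, which is why the lambda, like the neutron, is electrically neutral.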

“Because the proton or neutron is totally broken apart, there is little doubt that the electron interacts with the quark inside,” El Fassi said.

After the electron interacts with a quark or quarks via an exchanged virtual photon, the “struck” quark(s) begin moving as free particles through the medium, typically joining up with other quarks they encounter to form a new composite particle as they propagate through the nucleus. Some of the time, this composite particle is a lambda. But the lambda is short-lived: after formation, it swiftly decays into two other particles, a pion and either a proton or a neutron. To measure the properties of these briefly created lambda particles, physicists must detect both daughter particles, as well as the beam electron that scattered off the target nucleus.

This work is the first to measure the lambda using this process, known as semi-inclusive deep inelastic scattering, in the forward and backward fragmentation regions. Lambda particles are harder to study with this method because they decay too quickly to be measured directly.
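Because the lambda itself is never seen, its presence is inferred from the invariant mass of its daughters: a proton and pion pair whose combined four-momentum reconstructs to the lambda mass signals a lambda decay. A rough sketch of this idea (not the collaboration’s actual RooFit-based analysis), assuming standard PDG masses:

```python
import math

M_LAMBDA = 1115.683  # MeV, PDG Lambda baryon mass
M_PROTON = 938.272   # MeV
M_PION   = 139.570   # MeV, charged pion

def invariant_mass(e1, p1, e2, p2):
    """Invariant mass of a two-particle system from energies (MeV)
    and 3-momenta (MeV/c, as (px, py, pz) tuples)."""
    e = e1 + e2
    px, py, pz = (p1[i] + p2[i] for i in range(3))
    return math.sqrt(e**2 - (px**2 + py**2 + pz**2))

def two_body_momentum(m, m1, m2):
    """Daughter momentum magnitude for a two-body decay m -> m1 + m2,
    evaluated in the parent's rest frame."""
    return math.sqrt((m**2 - (m1 + m2)**2) * (m**2 - (m1 - m2)**2)) / (2 * m)

# Simulate a Lambda at rest decaying to p + pi-: the daughters emerge
# back to back with equal and opposite momenta.
p_star = two_body_momentum(M_LAMBDA, M_PROTON, M_PION)
e_p  = math.sqrt(M_PROTON**2 + p_star**2)
e_pi = math.sqrt(M_PION**2 + p_star**2)
m_reco = invariant_mass(e_p, (0, 0, p_star), e_pi, (0, 0, -p_star))
print(round(m_reco, 3))  # prints 1115.683: the Lambda mass is recovered
```

In real data the reconstructed mass forms a peak over a combinatorial background of unrelated proton-pion pairs, which is why the analysis fits a Breit-Wigner signal on top of a background estimate.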

“This class of measurement has only been performed on protons before, and on lighter, more stable particles,” said coauthor William Brooks, professor of physics at Federico Santa María Technical University and co-spokesperson of the EG2 experiment.

The analysis was so challenging, it took several years for El Fassi and her group to re-analyze the data and extract these results. It was her thesis advisor, Kawtar Hafidi, who encouraged her to pursue the investigation of the lambda from these datasets.

“I would like to commend Lamiaa’s hard work and perseverance in dedicating years of her career working on this,” said Hafidi, associate laboratory director for physical sciences and engineering at Argonne National Lab and co-spokesperson of the EG2 experiment. “Without her, this work would not have seen fruition.”

“It hasn’t been easy,” El Fassi said. “It’s a long and time-consuming process, but it was worth the effort. When you spend so many years working on something, it feels good to see it published.”

El Fassi began this lambda analysis when she herself was a postdoc, a couple of years prior to becoming an assistant professor at Mississippi State University. Along the way, several of her own postdocs at Mississippi State have helped extract these results, including coauthor Taya Chetry.

“I’m very happy and motivated to see this work being published,” said Chetry, who is now a postdoctoral researcher at Florida International University.

Left: acceptance-weighted (p, π−) invariant mass distributions for the Fe (top) and LD2 (bottom) targets. Blue curves represent the RooFit χ² minimization using a simple Breit-Wigner (BW) function for the Λ signal and event mixing for the combinatorial background (red dotted curves). The green distributions are the fit results that are integrated to obtain the Λ yields. Right: comparison of Fe (red) and LD2 (blue) acceptance-weighted p_T (top) and p_T² (bottom) distributions, normalized to their peak height.

A notable finding from this intensive analysis changes the way physicists understand how lambdas form in the wake of particle collisions. In similar studies that used semi-inclusive deep inelastic scattering to study other particles, the particle of interest usually formed after a single quark was “struck” by the virtual photon exchanged between the electron beam and the target nucleus. But the signal the lambda left in the CLAS detector suggests a more collective process. The authors’ analysis showed that when a lambda forms, the virtual photon has been absorbed, part of the time, by a pair of quarks known as a diquark, rather than by just one. After being “struck,” this diquark went on to pick up a strange quark and form a lambda.

“This quark pairing suggests a different mechanism of production and interaction than the case of the single quark interaction,” Hafidi said.

A better understanding of how different particles form helps physicists in their effort to decipher the strong interaction, the fundamental force that holds these quark-containing particles together. The dynamics of this interaction are very complicated, and so is the theory used to describe it: quantum chromodynamics (QCD). Comparing measurements to models of QCD’s predictions allows physicists to test this theory. Because the diquark finding differs from the model’s current predictions, it suggests something about the model is off.

“There is an unknown ingredient that we don’t understand. This is extremely surprising, since the existing theory can describe essentially all other observations, but not this one,” Brooks said. “That means there is something new to learn, and at the moment, we have no clue what it could be.”

To find out, they’ll need even more measurements. Data for EG2 were collected with 5.014 GeV (billion electron-volt) electron beams during CEBAF’s 6 GeV era. Future experiments will use electron beams from the upgraded CEBAF, which now extend up to 11 GeV for Experimental Hall B, as well as an updated CLAS detector known as CLAS12, to continue studying the formation of a variety of particles, including lambdas, with higher-energy electrons. The upcoming Electron-Ion Collider (EIC) at DOE’s Brookhaven National Laboratory will also provide a new opportunity to continue studying this strange matter and the quark pairing structure of the nucleon with greater precision.

“These results lay the groundwork for upcoming studies with CLAS12 and at the planned EIC experiments, where one can investigate the diquark scattering in greater detail,” Chetry said.

El Fassi is also a co-spokesperson for CLAS12 measurements of quark propagation and hadron formation. When data from the new experiments are finally ready, physicists will compare them to QCD predictions to further refine the theory.

“Any new measurement that will give novel information toward understanding the dynamics of strong interactions is very important,” she said.

 

Tracing attosecond electron emission from a nanometric metal tip

by Philip Dienstbier, Lennart Seiffert, Timo Paschen, Andreas Liehl, Alfred Leitenstorfer, Thomas Fennel, Peter Hommelhoff in Nature

By superimposing two laser fields of different strengths and frequencies, the electron emission from metals can be measured and controlled with a precision of a few attoseconds. Physicists from Friedrich-Alexander-Universität Erlangen-Nürnberg (FAU), the University of Rostock and the University of Konstanz have now demonstrated this. The findings could lead to new quantum-mechanical insights and enable electronic circuits that are a million times faster than today’s.

Light is capable of releasing electrons from metal surfaces. This observation was already made in the first half of the 19th century by Alexandre Edmond Becquerel and later confirmed in various experiments, among others by Heinrich Hertz and Wilhelm Hallwachs. Since the photoelectric effect could not be reconciled with the light wave theory, Albert Einstein came to the conclusion that light must consist not only of waves, but also of particles. He laid the foundation for quantum mechanics.

With the development of laser technology, research into the photoelectric effect has gained new impetus. “Today, we can produce extremely strong and ultrashort laser pulses in a wide variety of spectral colors,” explains Prof. Dr. Peter Hommelhoff, Chair for Laser Physics at the Department of Physics at FAU. “This inspired us to capture and control the duration and intensity of electron emission from metals with greater accuracy.” So far, scientists have only been able to determine laser-induced electron dynamics precisely in gases — with an accuracy of a few attoseconds. Quantum dynamics and emission time windows had not yet been measured on solids.

This is exactly what the researchers at FAU, the University of Rostock and the University of Konstanz have now achieved for the first time. They used a special strategy: instead of just a strong laser pulse, which releases the electrons from a sharp tungsten tip, they also applied a second, weaker laser with twice the frequency.

“In principle, you have to know that with very strong laser light, the individual photons are no longer responsible for the release of the electrons, but rather the electric field of the laser,” explains Dr. Philip Dienstbier, a research associate at Peter Hommelhoff’s chair and leading author of the study. “The electrons then tunnel through the metal interface into the vacuum.”

By deliberately superimposing the two light waves, physicists can control the shape and strength of the laser field — and thus also the emission of the electrons.
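The control knob here is the relative phase between the fundamental and its second harmonic. A toy Python sketch (arbitrary units; the field strengths and the 10:1 amplitude ratio are illustrative assumptions, not values from the study) shows how that phase changes the peak field in the emission direction — and since the tunneling rate depends extremely nonlinearly on the field, even a weak second color strongly modulates the emission:

```python
import math

def two_color_field(t, e1=1.0, e2=0.1, omega=1.0, phi=0.0):
    """Combined electric field of a strong pulse at omega and a weaker
    pulse at 2*omega with relative phase phi (arbitrary units)."""
    return e1 * math.cos(omega * t) + e2 * math.cos(2 * omega * t + phi)

def peak_field(phi, n=2000):
    """Peak field in the emission direction (positive sign) over one
    fundamental period, for a given two-color relative phase."""
    period = 2 * math.pi
    return max(two_color_field(i * period / n, phi=phi) for i in range(n))

# The relative phase tunes the field crest that drives tunneling emission:
print(round(peak_field(0.0), 3))       # crests aligned: 1.0 + 0.1 = 1.1
print(round(peak_field(math.pi), 3))   # crests opposed: only 0.9
```

Sweeping `phi` and recording the (exponentially field-dependent) emission yield is, in spirit, how the two-color phase scan maps the attosecond emission window.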

Relation between optimal phase and work function.

In the experiment, the researchers were able to pin down the duration of the electron emission window to 30 attoseconds — thirty billionths of a billionth of a second. This ultra-precise determination of the emission time window could advance basic and application-oriented research in equal measure.

“The phase shift of the two laser pulses allows us to gain deeper insights into the tunneling process and the subsequent movement of the electron in the laser field,” says Philip Dienstbier. “This enables new quantum mechanical insights into both the emission from the solid and the light fields used.”

The most important field of application is light-field-driven electronics: With the proposed two-color method, the laser light can be modulated in such a way that an exactly defined sequence of electron pulses and thus of electrical signals could be generated. Dienstbier: “In the foreseeable future, it will be possible to integrate the components of our test setup — light sources, metal tip, electron detector — into a microchip.” Complex circuits with bandwidths up to the petahertz range are then conceivable — that would be almost a million times faster than current electronics.

 

Ultimate Quantum Sensitivity in the Estimation of the Delay between two Interfering Photons through Frequency-Resolving Sampling

by Danilo Triggiani, Giorgos Psaroudis, Vincenzo Tamma in Physical Review Applied

A team of researchers has demonstrated the ultimate sensitivity allowed by quantum physics in measuring the time delay between two photons.

By measuring their interference at a beam-splitter through frequency-resolving sampling measurements, the team has shown that unprecedented precision can be reached with current technology, with an estimation error that decreases further as the photonic temporal bandwidth is reduced. This breakthrough has significant implications for a range of applications, including more practical imaging of nanostructures — among them biological samples and nanomaterial surfaces — as well as quantum-enhanced estimation based on frequency-resolved boson sampling in optical networks. The research was conducted by a team of scientists at the University of Portsmouth, led by Dr Vincenzo Tamma, Director of the University’s Quantum Science and Technology Hub.

Dr Tamma said: “Our technique exploits the quantum interference occurring when two single photons impinging on the two faces of a beam-splitter are indistinguishable when measured at the beam splitter output channels. If, before impinging on the beam splitter, one photon is delayed in time with respect to the other by going through or being reflected by the sample, one can retrieve in real time the value of such a delay and therefore the structure of the sample by probing the quantum interference of the photons at the output of the beam splitter.

Scheme of the interferometric setup.

“We showed that the best precision in the measurement of the time delay is achieved when resolving such two-photon interference with sampling measurements of the two photons in their frequencies. Indeed, this ensures that the two photons remain completely indistinguishable at detectors, irrespective of their delay at any value of their sampled frequencies detected at the output.”

The team proposed the use of a two-photon interferometer to measure the interference of two photons at a beam-splitter. They then introduced a technique based on frequency-resolving sampling measurements to estimate the time delay between the two photons with the best possible precision allowed by nature, and with a sensitivity that increases as the photonic temporal bandwidth decreases.
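For reference, in a conventional (non-frequency-resolved) Hong-Ou-Mandel measurement, the coincidence rate at the beam splitter outputs dips toward zero as the delay vanishes and the photons become indistinguishable; the frequency-resolving sampling proposed here goes beyond this baseline. A small Python sketch, assuming the textbook form for photons with Gaussian spectra (an illustrative model, not the paper’s estimator):

```python
import math

def hom_coincidence(delay, sigma):
    """Coincidence probability at a 50:50 beam splitter for two Gaussian
    single photons of spectral bandwidth sigma (rad/s) arriving with a
    relative time delay (s): the Hong-Ou-Mandel dip, P = (1 - e^{-(sigma*tau)^2})/2."""
    return 0.5 * (1.0 - math.exp(-(sigma * delay) ** 2))

sigma = 1e12  # illustrative spectral bandwidth, ~1/ps
for tau in (0.0, 0.5e-12, 2e-12):
    print(f"tau = {tau:.1e} s -> P_coinc = {hom_coincidence(tau, sigma):.3f}")
# tau = 0.0e+00 s -> P_coinc = 0.000  (perfect indistinguishability)
# tau = 5.0e-13 s -> P_coinc = 0.111  (partial overlap)
# tau = 2.0e-12 s -> P_coinc = 0.491  (nearly distinguishable)
```

Estimating the delay from this dip alone degrades when the photons are nearly distinguishable; resolving the photon frequencies at the detectors, as the paper proposes, keeps the photons effectively indistinguishable for every sampled frequency pair, which is what enables the ultimate sensitivity claimed.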

Dr Tamma added: “Our technique overcomes the limitations of previous two-photon interference techniques, which do not retrieve information about the photonic frequencies in the measurement process.

“It allows us to employ photons of the shortest duration experimentally possible without affecting the distinguishability of the time-delayed photons at the detectors, and therefore maximising the precision of the delay estimation with a remarkable reduction in the number of required pairs of photons. This allows a relatively fast and efficient characterisation of the given sample paving the way to applications in biology and nanoengineering.”

The applications of this research are significant. It has the potential to markedly improve the imaging of nanostructures, including biological samples and nanomaterial surfaces. Additionally, it could enable quantum-enhanced estimation based on frequency-resolved boson sampling in optical networks.

 

Verification of the area law of mutual information in a quantum field simulator

by Mohammadamin Tajik, Ivan Kukuljan, Spyros Sotiriadis, Bernhard Rauer, Thomas Schweigler, Federica Cataldini, João Sabino, Frederik Møller, Philipp Schüttelkopf, Si-Cong Ji, Dries Sels, Eugene Demler, Jörg Schmiedmayer in Nature Physics

A team of physicists has illuminated certain properties of quantum systems by observing how their fluctuations spread over time. The research offers an intricate understanding of a complex phenomenon that is foundational to quantum computing — a method that can perform certain calculations significantly more efficiently than conventional computing.

“In an era of quantum computing it’s vital to generate a precise characterization of the systems we are building,” explains Dries Sels, an assistant professor in New York University’s Department of Physics and an author of the paper. “This work reconstructs the full state of a quantum liquid, consistent with the predictions of a quantum field theory — similar to those that describe the fundamental particles in our universe.”

Sels adds that the breakthrough offers promise for technological advancement. “Quantum computing relies on the ability to generate entanglement between different subsystems, and that’s exactly what we can probe with our method,” he notes. “The ability to do such precise characterization could also lead to better quantum sensors — another application area of quantum technologies.”

Additional results for the area law of mutual information (MI) and the volume law of von Neumann (vN) entropy.

The research team, which included scientists from Vienna University of Technology, ETH Zurich, Free University of Berlin, and the Max-Planck Institute of Quantum Optics, performed a tomography of a quantum system — the reconstruction of a specific quantum state with the aim of seeking experimental evidence of a theory.

The studied quantum system consisted of ultracold atoms — atoms whose near-zero temperature slows their motion, making it easier to analyze — trapped on an atom chip. In their work, the scientists created two “copies” of this quantum system — cigar-shaped clouds of atoms that evolve over time without influencing each other. At different stages of this process, the team performed a series of experiments that revealed the two copies’ correlations.

“By constructing an entire history of these correlations, we can infer what is the initial quantum state of the system and extract its properties,” explains Sels. “Initially, we have a very strongly coupled quantum liquid, which we split into two so that it evolves as two independent liquids, and then we recombine it to reveal the ripples that are in the liquid.

“It’s like watching the ripples in a pond after throwing a rock in it and inferring the properties of the rock, such as its size, shape, and weight.”

 

Subscribe to Paradigm!

Medium. Twitter. Telegram. Telegram Chat. Reddit. LinkedIn.

 

Main sources

Research articles

Advanced Quantum Technologies

PRX Quantum

Science Daily

SciTechDaily

Quantum News

Nature

Tags

Quantum
quantum dots
nanomagnet
quantum detection technology
quantum systems