ABSTRACT
There is great interest in using near-term quantum computers to simulate and study foundational problems in quantum mechanics and quantum information science, such as the scrambling measured by an out-of-time-ordered correlator (OTOC). Here we use an IBM Q processor, quantum error mitigation, and weaved Trotter simulation to study high-resolution operator spreading in a four-spin Ising model as a function of space, time, and integrability. Reaching four spins while retaining high circuit fidelity is made possible by the use of a physically motivated fixed-node variant of the OTOC, which allows scrambling to be estimated without additional overhead. We find clear signatures of ballistic operator spreading in a chaotic regime, as well as operator localization in an integrable regime. The techniques developed and demonstrated here open up the possibility of using cloud-based quantum computers to study and visualize scrambling phenomena, as well as quantum information dynamics more generally.
ABSTRACT
Scrambling processes, which rapidly spread entanglement through many-body quantum systems, are difficult to investigate using standard techniques, but are relevant to quantum chaos and thermalization. In this Letter, we ask if quantum machine learning (QML) could be used to investigate such processes. We prove a no-go theorem for learning an unknown scrambling process with QML, showing that it is highly probable for any variational Ansatz to have a barren plateau landscape, i.e., cost gradients that vanish exponentially in the system size. This implies that the required resources scale exponentially even when strategies to avoid such scaling (e.g., from Ansatz-based barren plateaus or no-free-lunch theorems) are employed. Furthermore, we numerically and analytically extend our results to approximate scramblers. Hence, our work places generic limits on the learnability of unitaries when lacking prior information.
ABSTRACT
Although quantum computers are predicted to have many commercial applications, less attention has been given to their potential for resolving foundational issues in quantum mechanics. Here we focus on quantum computers' utility for the Consistent Histories formalism, which has previously been employed to study quantum cosmology, quantum paradoxes, and the quantum-to-classical transition. We present a variational hybrid quantum-classical algorithm for finding consistent histories, which should revitalize interest in this formalism by allowing classically impossible calculations to be performed. In our algorithm, the quantum computer evaluates the decoherence functional (with exponential speedup in both the number of qubits and the number of times in the history) and a classical optimizer adjusts the history parameters to improve consistency. We implement our algorithm on a cloud quantum computer to find consistent histories for a spin in a magnetic field and on a simulator to observe the emergence of classicality for a chiral molecule.
ABSTRACT
An environment interacting with a system acquires information about it, e.g., about its location. The resulting decoherence is thought to be responsible for the emergence of the classical realm of our Universe out of the quantum substrate. However, this view of the emergence of the classical is sometimes dismissed as a consequence of insufficient isolation and, hence, as non-fundamental. In contrast to many other systems, a black hole can never be isolated from its Hawking radiation, which carries information about its location, making this lack of isolation fundamental. Here we consider the decoherence of a "black hole Schrödinger cat," a non-local superposition of a Schwarzschild black hole in two distinct locations, due to its Hawking radiation. The resulting decoherence rate turns out to be given by a surprisingly simple equation. Moreover, and in contrast to known cases of decoherence, this rate does not involve Planck's constant h.