Quantum random circuits

Quantum random circuits (QRCs) incorporate an element of randomness into the local unitary operations and measurements of a quantum circuit. The idea is similar to that of random matrix theory: by averaging over an ensemble of outcomes, QRCs yield almost exact results for non-integrable, hard-to-solve problems. Incorporating randomness into circuits has several possible advantages, among them (i) the validation of quantum computers, which is the method Google used when it claimed quantum supremacy in 2019, and (ii) understanding the universal structure of non-equilibrium and thermalization processes in quantum many-body dynamics.

Quantum Random Circuits
The constituents of a general quantum circuit are qubits, unitary gates, and measurements. The time evolution of the circuit is discrete in time, $$t \in \mathbb{Z}$$, and states are evolved step by step by the application of unitary operators $$U_t \equiv U(t; t-1)$$, under which a pure state evolves according to$$|\psi(t)\rangle=U_t |\psi(t-1)\rangle$$(note that unitary operators can entangle states). Thus, the time evolution from a starting time, say $$t=0$$, to some time $$t$$ is given by$$U(t; 0) = U_t U_{t-1} \cdots U_3 U_2 U_1,$$where at each step the unitary operator is a tensor product of local unitary gates $$u_{\tau, x}$$, with the index $$x$$ specifying the lattice bond connecting a pair of qubits and $$\tau$$ the time step.
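The step-by-step evolution above can be sketched numerically. The following is a minimal NumPy sketch; the choice of Haar-random two-qubit gates and the alternating brick-wall pairing of qubits are illustrative assumptions, not the only possible circuit structure:

```python
import numpy as np

rng = np.random.default_rng(0)

def haar_unitary(dim, rng):
    # Haar-random unitary from the QR decomposition of a complex Gaussian matrix
    z = (rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))) / np.sqrt(2)
    q, r = np.linalg.qr(z)
    d = np.diag(r)
    return q * (d / np.abs(d))

def brickwall_layer(n, offset, rng):
    # U_t as a tensor product of local two-qubit gates u_{tau,x},
    # acting on qubit pairs (offset, offset+1), (offset+2, offset+3), ...
    op = np.eye(1)
    if offset == 1:
        op = np.kron(op, np.eye(2))        # first qubit idles this step
    x = offset
    while x + 1 < n:
        op = np.kron(op, haar_unitary(4, rng))
        x += 2
    while op.shape[0] < 2**n:
        op = np.kron(op, np.eye(2))        # trailing idle qubits
    return op

n, d = 4, 4                                # width n and depth d (kept small)
psi = np.zeros(2**n, dtype=complex)
psi[0] = 1.0                               # initial state |00...0>
for t in range(1, d + 1):
    U_t = brickwall_layer(n, t % 2, rng)   # alternate even/odd bonds
    psi = U_t @ psi                        # |psi(t)> = U_t |psi(t-1)>
```

Since each layer is unitary, the norm of the state is preserved throughout the evolution.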

Figure 1 shows a space-time diagram of a quantum circuit, displaying the local interactions at each time step. In the language of quantum information theory, the number of qubits $$n$$ is the circuit's width, and its depth $$d$$ is the number of layers of unitary gates. Hence, for the configuration in Figure 1, $$n = 8$$ and $$d = 4$$. Another way to interpret the circuit is as a tensor network in which each purple box is a local gate $$u_{\tau,x}$$ acting on two qubits; contracting all qubit indices between the start $$t=0$$ and the final time $$t$$ yields the full unitary time evolution $$U(t; 0)$$. Thus, the propagation amplitude from an initial state with indices $$\left\{ a_1 a_2 \cdots a_L \right\}$$ to a final state with indices $$\left\{ b_1 b_2 \cdots b_L \right\}$$ is$$\langle b_1 b_2 \cdots b_L | U(t; 0) | a_1 a_2 \cdots a_L \rangle.$$Measurements, on the other hand, disentangle the qubits. The measurements used are projective measurements, which leave the measured degrees of freedom in an eigenstate of the measured operator. Measurements in quantum mechanics are stochastic by nature, so circuits with exactly the same structure (qubits and gates) give different outcomes on different runs; see Figure 2. This stochastic nature should, however, be distinguished from the randomness of the gates. Let $$\textbf{m} = \left\{ m_1, m_2, \cdots, m_M \right\}$$ be the set of measurement outcomes; then repeated runs on a fixed set of unitary gates yield distinct records $$\textbf{m}$$. The schematic in Figure 2 sketches a tree diagram in which each branch represents a possible outcome of the measurements shown on the circuit. Each run produces a different $$\textbf{m}$$, resembling a random walk.
If our system is just a single qubit, then each measurement causes a jump on the Bloch sphere. However, in the many-body case, the situation is complicated due to correlations between different qubits.
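The stochasticity of projective measurements can be illustrated with a small simulation. Here `measure_qubit` is a hypothetical helper (not from any particular library) that applies the Born rule and collapses the state, assuming states stored as NumPy vectors:

```python
import numpy as np

rng = np.random.default_rng()

def measure_qubit(psi, k, n, rng):
    # Projective Z measurement of qubit k of an n-qubit state:
    # sample an outcome m via the Born rule, project, and renormalize.
    psi = psi.reshape([2] * n)
    p0 = np.sum(np.abs(np.take(psi, 0, axis=k))**2)   # probability of outcome 0
    m = 0 if rng.random() < p0 else 1
    mask = np.zeros(2)
    mask[m] = 1.0
    psi = psi * mask.reshape([1] * k + [2] + [1] * (n - k - 1))
    psi = psi / np.linalg.norm(psi)                   # collapsed eigenstate
    return m, psi.reshape(-1)

# The same fixed preparation (two qubits in |+>|+>) measured on repeated
# runs yields different records m, even though nothing else changes.
plus = np.array([1.0, 1.0]) / np.sqrt(2)
psi0 = np.kron(plus, plus)
records = set()
for _ in range(50):
    psi, m = psi0.copy(), []
    for k in range(2):
        outcome, psi = measure_qubit(psi, k, 2, rng)
        m.append(outcome)
    records.add(tuple(m))
```

After 50 runs, `records` almost certainly contains several distinct outcome records, mirroring the branching tree of Figure 2.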

Near-term quantum computers validation
As we are currently in the Noisy Intermediate-Scale Quantum (NISQ) era, meaning that current quantum computers are neither fault tolerant nor large enough to reach supremacy, we are looking for tasks with two features:


 * Classically hard
 * Experimentally feasible on near-term devices

The desired task must be feasible on a quantum computer but classically resource-consuming in terms of, for example, time. A classical computer may solve small instances of such a task quickly, but as the system's complexity increases (larger size or dimension), the computation time grows far faster than linearly, so a state-of-the-art classical computer would take an unreasonable amount of time (years); meanwhile, a quantum computer is believed to give an exponential reduction in the required computation time. Research aimed at finding such a task has focused on sampling problems. One theoretically compelling candidate is Boson Sampling, which is backed by strong complexity-theoretic evidence; however, researchers have faced experimental difficulties in achieving the desired results with this method. Another approach is random circuit sampling, in which the main task is to sample the output of a random quantum circuit. Results have shown that this approach is more experimentally feasible given recent developments in superconducting qubits, and it also has strong complexity-theoretic support. In its claim of quantum supremacy, Google used its Sycamore processor, which took about 200 seconds to sample one instance of a quantum circuit a million times, whereas a state-of-the-art classical supercomputer was estimated to take 10,000 years.

Non-equilibrium and thermalization of quantum many-body dynamics
One of the pressing questions in many-body dynamics is how entanglement spreads in time, for example after a quantum quench, in which an initially prepared system evolves unitarily following a sudden change in the parameters of the Hamiltonian. The answer to this question forms a fundamental part of thermalization and would provide a numerical tool to simulate quantum dynamics. Quantum random circuits serve as a playground for experimenting with and understanding such processes. Results using QRC methods have shown that there is a universal structure behind noisy entanglement growth.