Neural network quantum states

Neural Network Quantum States (NQS or NNQS) are a general class of variational quantum states parameterized by an artificial neural network. They were first introduced in 2017 by the physicists Giuseppe Carleo and Matthias Troyer to approximate the wave functions of many-body quantum systems.

Given a many-body quantum state $$ |\Psi\rangle $$ comprising $$ N $$ degrees of freedom and a choice of associated quantum numbers $$ s_1 \ldots s_N $$, an NQS parameterizes the wave-function amplitudes

$$ \langle s_1 \ldots s_N |\Psi; W \rangle = F(s_1 \ldots s_N; W), $$

where $$ F(s_1 \ldots s_N; W) $$ is an artificial neural network with parameters (weights) $$ W $$, $$ N $$ input variables ($$ s_1 \ldots s_N $$), and one complex-valued output corresponding to the wave-function amplitude.
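A minimal sketch of such a network, assuming a restricted-Boltzmann-machine-style architecture (the ansatz used in the original 2017 work); the names, shapes, and initialization below are illustrative, not prescribed by the text. Complex parameters give the complex-valued amplitude:

```python
import numpy as np

def log_psi(s, a, b, w):
    """Log wave-function amplitude log F(s; W) for spins s in {-1, +1}^N.

    a : (N,) complex visible biases
    b : (M,) complex hidden biases
    w : (N, M) complex weights
    """
    theta = b + s @ w                        # hidden-unit activations
    return s @ a + np.sum(np.log(np.cosh(theta)))

rng = np.random.default_rng(0)
N, M = 4, 8
a = 0.01 * (rng.standard_normal(N) + 1j * rng.standard_normal(N))
b = 0.01 * (rng.standard_normal(M) + 1j * rng.standard_normal(M))
w = 0.01 * (rng.standard_normal((N, M)) + 1j * rng.standard_normal((N, M)))

s = np.array([1, -1, 1, 1])                  # one spin configuration
amplitude = np.exp(log_psi(s, a, b, w))      # complex amplitude F(s; W)
```

Working with the logarithm of the amplitude, as here, avoids numerical under- and overflow and is convenient for the ratios and log-derivatives used below.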

This variational form is used in conjunction with specific stochastic learning approaches to approximate quantum states of interest.

Learning the Ground-State Wave Function
One common application of NQS is finding an approximate representation of the ground-state wave function of a given Hamiltonian $$ \hat{H} $$. The learning procedure in this case consists of finding the best neural-network weights that minimize the variational energy

$$ E(W) = \frac{\langle \Psi; W | \hat{H}|\Psi; W \rangle}{\langle \Psi; W |\Psi; W \rangle}. $$ Since, for a general artificial neural network, computing this expectation value exactly has a cost that grows exponentially with $$ N $$, stochastic techniques, based for example on the Monte Carlo method, are used to estimate $$ E(W) $$, analogously to what is done in Variational Monte Carlo. More specifically, a set of $$ M $$ samples $$ S^{(1)}, S^{(2)}, \ldots, S^{(M)} $$, with $$ S^{(i)}=s^{(i)}_1\ldots s^{(i)}_N $$, is generated such that they are distributed according to the Born probability density $$ P(S) \propto |F(s_1 \ldots s_N; W)|^2 $$. It can then be shown that the sample mean of the so-called "local energy" $$ E_{\mathrm{loc}}(S) = \langle S|\hat{H}|\Psi\rangle/ \langle S|\Psi\rangle $$ is a statistical estimator of the quantum expectation value $$ E(W) $$, i.e.

$$ E(W) \simeq \frac{1}{M} \sum_{i=1}^M E_{\mathrm{loc}}(S^{(i)}). $$
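A hedged sketch of this estimator: samples are drawn from $$ P(S) $$ with a Metropolis random walk (single spin flips, accepted with probability $$ \min(1, |F(S')/F(S)|^2) $$), and the local energy is averaged. The toy log-amplitude and the transverse-field Ising Hamiltonian $$ \hat{H} = -J\sum_i \hat{\sigma}^z_i\hat{\sigma}^z_{i+1} - h\sum_i \hat{\sigma}^x_i $$ are assumptions made for the example, not part of the text:

```python
import numpy as np

def log_psi(s, W):
    return float(s @ W)  # toy real log-amplitude standing in for a full NQS

def local_energy(s, W, J=1.0, h=0.5):
    e = -J * np.sum(s[:-1] * s[1:])            # diagonal Ising coupling
    for i in range(len(s)):                    # -h sigma^x_i flips spin i
        sp = s.copy(); sp[i] *= -1
        e += -h * np.exp(log_psi(sp, W) - log_psi(s, W))
    return e

def metropolis_samples(W, n_samples, n_sites, rng, burn=200):
    s = rng.choice([-1, 1], size=n_sites)
    out = []
    for step in range(burn + n_samples):
        i = rng.integers(n_sites)
        sp = s.copy(); sp[i] *= -1
        # accept with probability min(1, |F(s')/F(s)|^2)
        if rng.random() < np.exp(2.0 * (log_psi(sp, W) - log_psi(s, W))):
            s = sp
        if step >= burn:
            out.append(s.copy())
    return np.array(out)

rng = np.random.default_rng(1)
W = 0.1 * rng.standard_normal(6)
S = metropolis_samples(W, 500, 6, rng)
E = np.mean([local_energy(s, W) for s in S])   # stochastic estimate of E(W)
```

Note that only amplitude ratios $$ F(S')/F(S) $$ enter both the acceptance rule and the local energy, so the unknown normalization of $$ |\Psi; W\rangle $$ never needs to be computed.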

Similarly, it can be shown that the gradient of the energy with respect to the network weights $$ W $$ is also approximated by a sample mean

$$ \frac{\partial E(W)}{\partial W_k} \simeq \frac{1}{M} \sum_{i=1}^M \left(E_{\mathrm{loc}}(S^{(i)}) - E(W)\right) O^\star_k(S^{(i)}), $$

where $$ O_k(S^{(i)})= \frac{\partial \log F(S^{(i)};W)}{\partial W_k}$$ is the log-derivative of the wave-function amplitude, which can be computed efficiently, in deep networks, through backpropagation.
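The estimator above can be sketched directly. For illustration we assume the toy log-amplitude $$ \log F(S;W) = \sum_k W_k s_k $$, for which the log-derivative is simply $$ O_k(S) = s_k $$; the placeholder local energies only demonstrate the shapes involved:

```python
import numpy as np

def energy_gradient(e_loc, O):
    """Sample-mean gradient estimate of dE/dW_k.

    e_loc : (M,) local energies E_loc(S^(i))
    O     : (M, K) log-derivatives O_k(S^(i))
    Returns the mean over samples of (E_loc - E) * conj(O_k).
    """
    E = np.mean(e_loc)
    return np.mean((e_loc - E)[:, None] * np.conj(O), axis=0)

rng = np.random.default_rng(2)
samples = rng.choice([-1, 1], size=(100, 4)).astype(float)
e_loc = rng.standard_normal(100)        # placeholder local energies
grad = energy_gradient(e_loc, samples)  # O_k(S) = s_k for this ansatz
```

Subtracting the mean energy $$ E(W) $$ inside the sum acts as a variance-reduction baseline: it leaves the expectation of the estimator unchanged but reduces statistical noise.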

The stochastic approximation of the gradients is then used to minimize the energy $$ E(W) $$, typically through stochastic gradient descent. Each time the neural-network parameters are updated, a new set of samples $$ S^{(i)} $$ is generated from the updated distribution, in an iterative procedure similar to what is done in unsupervised learning.
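An end-to-end sketch of this loop, under stated assumptions: a toy mean-field ansatz $$ \log F(S;W)=\sum_k W_k s_k $$ and a 4-site transverse-field Ising chain, neither of which comes from the text. The system is small enough that $$ P(S) $$ can be evaluated by exact enumeration of all $$ 2^N $$ configurations, which replaces Monte Carlo sampling here and isolates the optimization step from sampling noise:

```python
import numpy as np
from itertools import product

J, h, N = 1.0, 0.5, 4
configs = np.array(list(product([-1, 1], repeat=N)))  # all 2^N spin strings

def log_psi(s, W):
    return s @ W                                       # toy mean-field ansatz

def local_energy(s, W):
    e = -J * np.sum(s[:-1] * s[1:])                    # diagonal Ising term
    for i in range(N):                                 # transverse-field flips
        sp = s.copy(); sp[i] *= -1
        e += -h * np.exp(log_psi(sp, W) - log_psi(s, W))
    return e

rng = np.random.default_rng(3)
W = 0.1 * rng.standard_normal(N)
lr, E_hist = 0.05, []
for step in range(300):
    lw = 2.0 * (configs @ W)                           # log |F(S)|^2
    p = np.exp(lw - lw.max()); p /= p.sum()            # Born distribution P(S)
    e_loc = np.array([local_energy(s, W) for s in configs])
    E = np.sum(p * e_loc)                              # energy E(W)
    grad = p @ ((e_loc - E)[:, None] * configs)        # gradient estimator
    W -= lr * grad                                     # gradient-descent update
    E_hist.append(E)
```

Because the distribution $$ P(S) $$ depends on $$ W $$, both the weights on the configurations and the local energies are recomputed at every step, mirroring the resampling described above.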

Connection with Tensor Networks
Neural-network representations of quantum wave functions share some similarities with variational quantum states based on tensor networks. For example, connections with matrix product states have been established. These studies have shown that NQS can support volume-law scaling of the entanglement entropy. In general, an NQS with fully connected weights corresponds, in the worst case, to a matrix product state whose bond dimension is exponentially large in $$ N $$.