Entropy of network ensembles

A set of networks that satisfies given structural characteristics can be treated as a network ensemble. Introduced by Ginestra Bianconi in 2007, the entropy of a network ensemble measures the level of order or uncertainty of the ensemble.

The entropy of a network ensemble is the logarithm of the number of graphs in the ensemble. Entropy can also be defined for a single network; for example, the basin entropy of a Boolean network is the logarithm of its number of attractors.
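
As a minimal numerical sketch (with illustrative parameter values, not taken from the source), the entropy of the ensemble of simple labelled graphs with a fixed number of nodes and links is just the logarithm of the number of ways to place the links among the node pairs:

```python
from math import comb, log

# Illustrative parameters: simple labelled graphs with N nodes and exactly L links.
N, L = 4, 3
num_pairs = N * (N - 1) // 2      # possible undirected links between N nodes
num_graphs = comb(num_pairs, L)   # cardinality of the ensemble
entropy = log(num_graphs)         # entropy = logarithm of the number of graphs

print(num_graphs, entropy)        # 20 graphs, entropy = log(20) ≈ 3.0
```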

By employing approaches from statistical mechanics, the complexity, uncertainty, and randomness of networks can be described with network ensembles subject to different types of constraints.

Gibbs and Shannon entropy
By analogy with statistical mechanics, microcanonical ensembles and canonical ensembles of networks are introduced. A partition function Z of an ensemble can be defined as: $$Z = \sum_{\mathbf{a}} \delta \left[\vec{F}(\mathbf{a})-\vec{C}\right] \exp\left(\sum_{ij}\left[h_{ij}\Theta(a_{ij}) + r_{ij}a_{ij}\right]\right)$$

where $$\vec{F}(\mathbf{a})=\vec{C}$$ is the constraint, and $$a_{ij}$$ ($$a_{ij} \geq 0$$) are the elements of the adjacency matrix, with $$a_{ij} > 0$$ if and only if there is a link between nodes i and j. $$\Theta(x)$$ is the step function, with $$\Theta(x) = 1$$ if $$x > 0$$ and $$\Theta(x) = 0$$ if $$x = 0$$. The auxiliary fields $$h_{ij}$$ and $$r_{ij}$$ play a role analogous to that of the bath in statistical mechanics.

For simple undirected networks, the partition function can be simplified as

$$Z = \sum_{\{a_{ij}\}} \prod_{k}\delta(\textrm{constraint}_{k}(\{a_{ij}\})) \exp\left(\sum_{i<j}\sum_{\alpha}h_{ij}(\alpha)\delta_{a_{ij},\alpha}\right)$$

where $$\alpha$$ indexes the possible values of the weight $$a_{ij}$$, and for a simple network $$\alpha\in\{0,1\}$$.
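
A brute-force sketch of this simplified partition function for a toy case (the node count, constraint, and field values below are illustrative assumptions): three nodes, link variables $$a_{ij}\in\{0,1\}$$, a single constraint fixing the total number of links, and small arbitrary auxiliary fields $$h_{ij}(\alpha)$$.

```python
from itertools import product
from math import exp

# Toy example (illustrative assumptions): 3 nodes, a_ij in {0, 1},
# single constraint: total number of links equals L = 2.
pairs = [(0, 1), (0, 2), (1, 2)]
alphas = (0, 1)                       # possible link weights; {0, 1} for a simple network
L = 2
h = {(i, j, alpha): 0.1 * alpha for (i, j) in pairs for alpha in alphas}  # arbitrary auxiliary fields

Z = 0.0
for config in product(alphas, repeat=len(pairs)):               # sum over all configurations {a_ij}
    a = dict(zip(pairs, config))
    if sum(a.values()) != L:                                     # delta-function constraint
        continue
    field_term = sum(h[(i, j, a[(i, j)])] for (i, j) in pairs)   # sum_{i<j} sum_alpha h_ij(alpha) delta_{a_ij,alpha}
    Z += exp(field_term)

print(Z)   # with all fields set to zero, Z reduces to the number of allowed graphs (here 3)
```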

Microcanonical ensembles and canonical ensembles are illustrated below with simple undirected networks.

For a microcanonical ensemble, the Gibbs entropy $$\Sigma$$ is defined by:

$$\begin{align} \Sigma &= \frac{1}{N} \log\mathcal{N} \\ &= \frac{1}{N} \log Z|_{h_{ij}(\alpha)=0\forall(i,j,\alpha)} \end{align}$$

where $$\mathcal{N}$$ indicates the cardinality of the ensemble, i.e., the total number of networks in the ensemble.
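
As an illustrative sketch (toy parameters, not from the source), for the microcanonical ensemble of simple graphs with $$N=4$$ nodes and exactly $$L=3$$ links, setting all auxiliary fields to zero turns the partition function into a count of the admissible graphs, from which the Gibbs entropy follows directly:

```python
from itertools import combinations, product
from math import log

# Illustrative microcanonical ensemble: simple undirected graphs with N = 4 nodes
# and exactly L = 3 links (assumed toy parameters).
N, L = 4, 3
pairs = list(combinations(range(N), 2))

# Z with all auxiliary fields set to zero simply counts the graphs satisfying the constraint.
cardinality = sum(1 for config in product((0, 1), repeat=len(pairs)) if sum(config) == L)

gibbs_entropy = log(cardinality) / N   # Sigma = (1/N) log(cardinality of the ensemble)
print(cardinality, gibbs_entropy)      # 20 graphs, Sigma = log(20)/4 ≈ 0.75
```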

The probability of having a link of weight $$\alpha$$ between nodes i and j is given by:

$$\pi_{ij}(\alpha) = \frac{\partial \log Z}{\partial h_{ij}(\alpha)}$$
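
A quick consistency check (illustrative, using the same toy fixed-link ensemble as above): at zero fields this derivative reduces to the fraction of networks in the ensemble whose pair (i, j) carries weight $$\alpha$$, which can be computed by enumeration.

```python
from itertools import combinations, product

# Toy ensemble (assumed parameters): simple graphs with N = 4 nodes and L = 3 links.
N, L = 4, 3
pairs = list(combinations(range(N), 2))
graphs = [dict(zip(pairs, cfg))
          for cfg in product((0, 1), repeat=len(pairs)) if sum(cfg) == L]

# pi_ij(alpha): fraction of ensemble members with a_ij = alpha,
# i.e. the derivative of log Z with respect to h_ij(alpha) evaluated at zero fields.
i, j = 0, 1
pi_1 = sum(g[(i, j)] == 1 for g in graphs) / len(graphs)
pi_0 = 1.0 - pi_1
print(pi_1, pi_0)   # pi_ij(1) = L / (N(N-1)/2) = 3/6 = 0.5 by symmetry
```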

For a canonical ensemble, the entropy takes the form of a Shannon entropy:

$${S}=-\sum_{i<j}\sum_{\alpha} \pi_{ij}(\alpha) \log \pi_{ij}(\alpha)$$
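
For instance (assumed parameter values), in a canonical ensemble of simple networks in which every pair of nodes carries a link with probability p, so that $$\pi_{ij}(1)=p$$ and $$\pi_{ij}(0)=1-p$$, the Shannon entropy reduces to $$N(N-1)/2$$ times the binary entropy of p:

```python
from math import log

# Illustrative canonical ensemble: N = 100 nodes, link probability p = 0.1 (assumed values).
N, p = 100, 0.1
num_pairs = N * (N - 1) // 2

# S = -sum_{i<j} sum_alpha pi_ij(alpha) log pi_ij(alpha), with pi_ij(1) = p and pi_ij(0) = 1 - p
S = -num_pairs * (p * log(p) + (1 - p) * log(1 - p))
print(S)   # Shannon entropy of this canonical ensemble
```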

Relation between Gibbs and Shannon entropy
The network ensemble $$G(N,L)$$ with a given number of nodes $$N$$ and links $$L$$, and its conjugate canonical ensemble $$G(N,p)$$, are characterized as microcanonical and canonical ensembles, with Gibbs entropy $$\Sigma$$ and Shannon entropy S, respectively. The Gibbs entropy of the $$G(N,L)$$ ensemble is given by:

$$N\Sigma = \log\binom{N(N-1)/2}{L}$$

For the $$G(N,p)$$ ensemble,

$${p}_{ij} = p = \cfrac{2L}{N(N-1)}$$

Inserting $$p_{ij}$$ into the Shannon entropy and using Stirling's approximation for the binomial coefficient gives:

$$\Sigma = S/N+\cfrac{1}{2N}\left[\log\left( \cfrac{N(N-1)}{2L} \right) - \log\left(\cfrac{N(N-1)}{2}-L\right)\right]$$

The relation indicates that the Gibbs entropy $$\Sigma$$ and the Shannon entropy per node S/N of random graphs are equal in the thermodynamic limit $$N\to\infty$$.
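
A numerical sketch of this limit (with an arbitrarily chosen fixed average degree) compares the Gibbs entropy computed from the binomial coefficient with the Shannon entropy per node of the conjugate ensemble as N grows:

```python
from math import comb, log

def gibbs_entropy(N, L):
    # Sigma = (1/N) log binom(N(N-1)/2, L)
    return log(comb(N * (N - 1) // 2, L)) / N

def shannon_entropy_per_node(N, L):
    # S/N with p = 2L / (N(N-1)) inserted into the Shannon entropy
    p = 2 * L / (N * (N - 1))
    M = N * (N - 1) // 2
    return -M * (p * log(p) + (1 - p) * log(1 - p)) / N

# Keep the average degree fixed (here 4, an arbitrary choice) and let N grow.
for N in (10, 100, 1000):
    L = 2 * N                          # average degree 2L/N = 4
    print(N, gibbs_entropy(N, L), shannon_entropy_per_node(N, L))
# The difference between the two entropies shrinks as N increases.
```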