### Entropy

The entropy of a dynamical system measures its complexity: the uncertainty about, and the information needed to characterize, its evolution through time. For a Markov chain (see "The model") with probability vector \(x=\left(x_{1},\cdots,x_{n}\right)\) and transition matrix \(P=\left(p_{ij}\right)_{1\leq i,j\leq n}\), where \(p_{ij}\) is the probability of moving from state \(i\) to state \(j\), the entropy is given by \[E=-\sum_{i,j=1}^{n}x_{i}\,p_{ij}\log_{2}\left(p_{ij}\right).\]
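As a sketch, the formula can be evaluated directly. Here is a minimal Python version (the function name and the two-state example are illustrative, not from the applet; terms with \(p_{ij}=0\) are taken as 0, the usual convention for \(0\log 0\)):

```python
import math

def entropy(x, P):
    """Entropy (in bits) of a Markov chain with probability vector x
    and transition matrix P, where P[i][j] is the probability of
    moving from state i to state j."""
    E = 0.0
    for i, xi in enumerate(x):
        for pij in P[i]:
            if pij > 0:  # 0 * log(0) is taken as 0 by convention
                E -= xi * pij * math.log2(pij)
    return E

# Illustrative two-state chain: uniform vector, uniform transitions
x = [0.5, 0.5]
P = [[0.5, 0.5],
     [0.5, 0.5]]
print(entropy(x, P))  # 1.0 bit, the maximum for two states
```

For a uniform transition matrix on \(n\) states the entropy works out to \(\log_{2}n\), which is why the two-state example above gives exactly 1 bit.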

To get an idea of how the entropy changes, you may use the applet below, where you can select the number of pages and the links between them. How does the entropy change with the number of pages and with the parameter \(p\)?

**Specific instructions:** For each configuration, you will find its entropy on the right. For each choice of the number of pages and of the parameter \(p\), the entropy of the system lies between given upper and lower bounds. Each time one of these bounds is reached, the applet signals it. For instance, if we choose a configuration with no *links* at all, then the entropy reaches its highest possible value. Is this the only configuration in which this upper bound is reached? And what should the configuration be in order to reach the least possible value of the entropy?
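The no-links claim above can be checked numerically. The sketch below assumes a PageRank-style model with damping parameter \(p\): with probability \(p\) the surfer follows a link (a page with no out-links jumps uniformly), and with probability \(1-p\) jumps to a uniformly random page. The function `google_matrix` and the link-dictionary format are illustrative assumptions, not the applet's internals:

```python
import math

def google_matrix(links, n, p):
    """Transition matrix of a PageRank-style model (assumed here):
    follow a link with probability p (uniformly among out-links;
    a page with no out-links jumps uniformly), otherwise jump to
    a uniform random page. links[i] lists the pages page i links to."""
    G = []
    for i in range(n):
        out = links.get(i, [])
        row = []
        for j in range(n):
            if out:
                follow = (1.0 / len(out)) if j in out else 0.0
            else:
                follow = 1.0 / n  # dangling page: uniform jump
            row.append(p * follow + (1 - p) / n)
        G.append(row)
    return G

n, p = 4, 0.85
G = google_matrix({}, n, p)   # configuration with no links at all
x = [1.0 / n] * n             # the uniform vector is stationary here
E = -sum(x[i] * G[i][j] * math.log2(G[i][j])
         for i in range(n) for j in range(n))
print(E, math.log2(n))  # entropy equals log2(n) = 2.0, its maximum
```

With no links every row of the matrix collapses to the uniform distribution \(1/n\), so the entropy attains its upper bound \(\log_{2}n\) regardless of \(p\), which matches what the applet signals for this configuration.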

To learn more about this, please go to this page.