This post is on the heels of a paper [1] we recently published (with Éva Czabarka) on the degeneracy problem of the Maximum Entropy Method. The post is fairly long, as it also provides a tutorial on the topic.
When we are trying to understand a system or predict its behavior, we usually do so based on limited information. In everyday situations we make our decisions and choices using some sort of “inner” probability distribution shaped by past experience. However, human decision-making tends to be biased; this wiki page alone lists over 150 such biases. Naturally, this raises the question of whether there is a way to make unbiased, quantitative inferences in silico from limited information.
The father of a principled and quantitative approach to unbiased inference is the physicist Edwin T. Jaynes. Starting his career under the tutelage of none other than Eugene (Jenő) Wigner, he made significant contributions to several areas of physics, most notably statistical mechanics and its applications. Standing on the shoulders of Laplace, Bayes, and Shannon, Jaynes started a revolution in statistical inference with his two seminal papers [2,3] from 1957. He based his inference method on the notion of statistical ensembles, much as statistical mechanics uses ensembles of microstates to describe the macroscopic properties of matter.
In these two celebrated papers Jaynes re-derived many results from both equilibrium statistical mechanics and the time-dependent density matrix formalism of quantum physics using only the Maximum Entropy Principle. The plot below shows what it means to write papers for the ages: his 1957 papers have been accumulating citations exponentially, doubling roughly every ten years.
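Since the rest of this post builds on it, here is the principle in its standard discrete form (a textbook statement in modern notation, not the original notation of the papers): among all probability distributions $p = (p_1, \dots, p_n)$ consistent with the available information, given as expectation-value constraints, pick the one that maximizes the Shannon entropy,

$$
\max_{p}\ H[p] = -\sum_{i=1}^{n} p_i \ln p_i
\quad \text{subject to} \quad
\sum_{i=1}^{n} p_i = 1, \qquad \sum_{i=1}^{n} p_i\, f_k(x_i) = F_k, \quad k = 1,\dots,m.
$$

Solving with Lagrange multipliers yields the exponential (Gibbs) form

$$
p_i = \frac{1}{Z(\lambda_1,\dots,\lambda_m)} \exp\!\left( -\sum_{k=1}^{m} \lambda_k f_k(x_i) \right),
\qquad
Z = \sum_{i=1}^{n} \exp\!\left( -\sum_{k=1}^{m} \lambda_k f_k(x_i) \right),
$$

where the multipliers $\lambda_k$ are fixed by the constraint values $F_k$. With a single constraint on the average energy this is the Boltzmann distribution, which is how the Maximum Entropy Principle reconnects with equilibrium statistical mechanics.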