Essential Reading
GOALS:
1. Understand the relations between entropy production, free energy changes and work done in the non-equilibrium regime of Markov processes and deterministic Hamiltonian dynamics, i.e. understand all the different fluctuation theorems (the central relations are summarised briefly below).
2. Connect these to machine learning of time series via objectives like Minimum Conditional Entropy Production, and understand how the new work-based objectives relate to classical unsupervised learning objectives such as maximum likelihood.
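As a quick orientation for goal 1, the central relations established and built on in the papers below are, in standard notation ($\beta = 1/k_B T$, $W$ the work done on the system, $\Delta F$ the free energy difference, subscripts $F$/$R$ for the forward and time-reversed protocols):
\[
\left\langle e^{-\beta W} \right\rangle = e^{-\beta \Delta F}
\qquad \text{(Jarzynski equality)},
\]
\[
\frac{P_F(+W)}{P_R(-W)} = e^{\beta (W - \Delta F)}
\qquad \text{(Crooks fluctuation theorem)},
\]
\[
\left\langle W_{\mathrm{diss}} \right\rangle = \langle W \rangle - \Delta F = k_B T \, D\!\left(\rho \,\Vert\, \tilde{\rho}\right) \geq 0
\qquad \text{(dissipation as relative entropy)},
\]
where $\rho$ and $\tilde{\rho}$ are the phase-space densities of the forward and time-reversed processes at corresponding times (Kawai, Parrondo \& Van den Broeck). The last relation is the most direct bridge to goal 2, since relative entropy (KL divergence) is also the quantity minimised by maximum likelihood.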
PAPERS:
These are the most important papers by Crooks. His thesis works through the theorems for Markov chains thoroughly; you will find it on his webpage, www.threeplusone.com.
Crooks G.E. 1999. Entropy production fluctuation theorem and the nonequilibrium work relation for free energy differences.
Crooks G.E. 1999. Path ensemble averages in systems driven far from equilibrium.
Crooks G.E. 2007. Beyond Boltzmann-Gibbs statistics: Maximum entropy hyper-ensembles out of equilibrium.
The following three papers are very insightful and contain some of the most advanced ideas.
Gomez-Marin A., Parrondo J.M.R. \& Van den Broeck C. 2008. The "footprints" of irreversibility.
Kawai R., Parrondo J.M.R. \& Van den Broeck C. 2007. Dissipation: the phase-space perspective.
Parrondo J.M.R., Van den Broeck C. \& Kawai R. 2009. Entropy production and the arrow of time.
Maes's papers are really difficult and confusing, but they seem to be the most advanced work.
This one contains the path-ensemble maths for Markov chains; a sketch of the key object follows the reference.
Maes C. \& Netocny K. 2002. Time-reversal and entropy.
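Concretely, the key object is the path-space log-likelihood ratio between a trajectory and its time reversal. For a Markov chain with transition probabilities $p(x \to y)$, a trajectory $\omega = (x_0, x_1, \dots, x_N)$ started from a distribution $\mu$ has entropy production
\[
\sigma(\omega) = \ln \frac{\mu(x_0)\, p(x_0 \to x_1) \cdots p(x_{N-1} \to x_N)}{\tilde{\mu}(x_N)\, p(x_N \to x_{N-1}) \cdots p(x_1 \to x_0)},
\]
with $\tilde{\mu}$ the distribution at the end of the forward process; the precise choice of reference measures is exactly what Maes is careful about. Its average is non-negative because it is a relative entropy between the forward and time-reversed path measures.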
For this one, read Section 4 on entropy production, and perhaps Sections 1 and 2 for background.
Maes C. 2001. Statistical mechanics of entropy production: Gibbsian hypothesis and local fluctuations.
Kurchan J. 2005. Non-equilibrium work relations.
This is a useful short review of foundations. Not deep, but a good starting point if you're confused.
Karevski D. 2007. Foundations of statistical mechanics: in and out of equilibrium.