January 8, 2025

Foundations of Algorithmic Thermodynamics

Abstract

Gács' coarse-grained algorithmic entropy leverages universal computation to quantify the information content of any given physical state. Unlike the Boltzmann and Gibbs-Shannon entropies, it requires no prior commitment to macrovariables or probabilistic ensembles, rendering it applicable to settings arbitrarily far from equilibrium. For measure-preserving dynamical systems equipped with a Markovian coarse-graining, we prove a number of fluctuation inequalities. These include algorithmic versions of Jarzynski's equality, Landauer's principle, and the second law of thermodynamics. In general, the algorithmic entropy determines a system's actual capacity to do work from an individual state, whereas the Gibbs-Shannon entropy only gives the mean capacity to do work from a state ensemble that is known a priori.
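For orientation, the classical (non-algorithmic) statements of the results named above, in standard notation rather than the paper's own, are: the second law $\Delta S \ge 0$ for an isolated system; Landauer's principle, which lower-bounds the heat dissipated in erasing one bit of information by $Q \ge k_B T \ln 2$; and Jarzynski's equality $\langle e^{-\beta W} \rangle = e^{-\beta \Delta F}$, relating the work $W$ done along nonequilibrium trajectories to the equilibrium free-energy difference $\Delta F$ at inverse temperature $\beta = 1/(k_B T)$. The algorithmic versions proved in the paper replace the ensemble entropy underlying these statements with Gács' coarse-grained algorithmic entropy of an individual state.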

Authors

Marcus Hutter, Aram Ebtekar

Venue

Physical Review E