This paper presents fundamental thermodynamic constraints on information-processing systems that coordinate multiple agents and goals. Maximum-utility coordination foci are under stronger selection pressure for their discoverability by agents than for their accuracy. The information-theoretic minimum description length of a coordination protocol at precision $\varepsilon$ is bounded below by $L(P)\geq NK\log_2 K+N^2d^2\log(1/\varepsilon)$, which drives progressive simplification. Coordination dynamics alter the environment itself and shift optimization across hierarchical levels. Deviation from the original foci necessitates recalibration, generating persistent metastable states and hysteresis until a sufficiently large environmental change triggers a phase transition via spontaneous symmetry breaking. We define a coordination temperature that predicts critical phenomena, estimate the cost of coordination operations, and identify measurable signatures across systems ranging from neural networks to restaurant bill-splitting to bureaucracies. We also extend Arrow's impossibility theorem for consistent preference aggregation, finding that preferences become recursively constrained whenever they are combined; this explains alignment violations in large language models trained with iterated multi-objective gradient descent and reinforcement learning from human feedback. We call this framework thermodynamic coordination theory (TCT), and it shows that coordination requires rapid information loss.
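As an illustration of how the description-length bound scales, the sketch below evaluates the right-hand side of $L(P)\geq NK\log_2 K+N^2d^2\log(1/\varepsilon)$ for a few parameter settings. The reading of $N$ as the number of agents, $K$ as the number of goals, and $d$ as the state dimension is an assumption made for this example, as is the use of the natural logarithm in the precision term; it is a minimal numerical sketch, not the paper's derivation.

```python
import math

def coordination_mdl_lower_bound(n_agents: int, n_goals: int, dim: int, eps: float) -> float:
    """Evaluate the lower bound L(P) >= N*K*log2(K) + N^2 * d^2 * log(1/eps).

    Assumptions for this sketch: N = number of agents, K = number of goals,
    d = state dimension, and the second term uses the natural logarithm.
    """
    if not (0.0 < eps < 1.0):
        raise ValueError("precision eps must lie in (0, 1)")
    term_goal_assignment = n_agents * n_goals * math.log2(n_goals)
    term_precision = (n_agents ** 2) * (dim ** 2) * math.log(1.0 / eps)
    return term_goal_assignment + term_precision

# Example: the precision term grows quadratically in the number of agents,
# so doubling N roughly quadruples that part of the bound.
for n in (2, 4, 8, 16):
    print(n, round(coordination_mdl_lower_bound(n, n_goals=3, dim=5, eps=1e-3), 1))
```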