Equilibrium Matching (EqM) is a generative modeling framework built from the perspective of equilibrium dynamics. It abandons the non-equilibrium, time-conditional dynamics of traditional diffusion- and flow-based generative models and instead learns the equilibrium gradient of an implicit energy landscape. At inference time, this enables an optimization-based sampling process: gradient descent on the learned landscape with a tunable step size, an adaptive optimizer, and adaptive computation. EqM empirically outperforms diffusion- and flow-based models in generative performance, achieving an FID of 1.90 on ImageNet 256$\times$256. Theoretically, EqM is justified as learning and sampling from the data manifold. Beyond generation, EqM is a flexible framework that naturally addresses tasks such as denoising partially noised images, OOD detection, and image composition. By replacing time-conditional velocities with a unified equilibrium landscape, EqM provides a stronger link between flow- and energy-based models and offers a simple path to optimization-based inference.
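The optimization-based sampling described above can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the learned equilibrium gradient is stood in for by the analytic gradient of a toy quadratic energy $E(x) = \tfrac{1}{2}\|x - \mu\|^2$ (in EqM this gradient would come from a trained network), and the function names `learned_gradient` and `sample` are hypothetical. The early-stopping check on the gradient norm mimics the adaptive-computation aspect.

```python
import numpy as np

# Stand-in for the learned equilibrium gradient. Here it is the analytic
# gradient of a toy quadratic energy E(x) = 0.5 * ||x - mu||^2, whose
# equilibrium (energy minimum) is x = mu. In EqM this would be a network.
MU = np.array([1.0, -2.0])

def learned_gradient(x):
    # grad of 0.5 * ||x - MU||^2
    return x - MU

def sample(x0, step_size=0.1, max_steps=500, tol=1e-6):
    """Optimization-based sampling: plain gradient descent on the implicit
    energy landscape with a tunable step size. Stops early once the
    gradient norm is small, so easy inputs use fewer steps."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_steps):
        g = learned_gradient(x)
        if np.linalg.norm(g) < tol:
            break  # reached (approximate) equilibrium
        x = x - step_size * g
    return x

print(sample(np.array([5.0, 5.0])))
```

In practice the plain descent step could be swapped for an adaptive optimizer (e.g. one with momentum), matching the abstract's note that the sampler admits a tunable step size and adaptive computation.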