The General Physics Transformer (GPhyT) demonstrates the potential of foundation models in physics: trained on 1.8 TB of diverse simulation data, a single model can simulate fluid-solid interactions, shock waves, thermal convection, multiphase dynamics, and more. Rather than being given the governing equations, it infers the underlying dynamics from the context it is shown. GPhyT performs strongly across multiple physics domains, outperforming specialized architectures by up to 29x; it achieves zero-shot generalization to entirely new physical systems through in-context learning, and it remains stable over 50-step autoregressive rollouts for reliable long-term prediction.
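The long-horizon prediction described above follows the standard autoregressive rollout pattern: the model repeatedly predicts the next state from its most recent context window and feeds that prediction back in. A minimal sketch of this loop is below; `dummy_step` is a hypothetical stand-in for the trained transformer (here just a context average), not GPhyT's actual interface.

```python
import numpy as np

def rollout(model_step, context, n_steps=50):
    """Autoregressive rollout: predict the next frame from the most
    recent context window, append it, and repeat n_steps times."""
    frames = list(context)
    window = len(context)
    for _ in range(n_steps):
        next_frame = model_step(np.stack(frames[-window:]))
        frames.append(next_frame)
    # Return only the newly predicted frames, not the initial context.
    return np.stack(frames[window:])

# Hypothetical stand-in for the trained model: averages the context
# window. A real model would be a learned neural network.
def dummy_step(ctx):
    return ctx.mean(axis=0)

# Two initial 8x8 field snapshots serve as the in-context prompt.
context = [np.zeros((8, 8)), np.ones((8, 8))]
pred = rollout(dummy_step, context, n_steps=50)
print(pred.shape)  # (50, 8, 8)
```

Because each step consumes the model's own previous outputs, small per-step errors compound, which is why stability over a 50-step rollout is a meaningful benchmark.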