In this paper, we propose Coordination Transformers (CooT), a novel framework for effective cooperation among multiple artificial agents in dynamic and uncertain environments. Unlike existing self-play or population-based methods, CooT is a context-based cooperation framework that adapts to new partners by conditioning on recent interaction history. Trained on interaction trajectories of diverse agent pairs, it rapidly learns effective cooperation strategies without explicit supervision or fine-tuning. Evaluation results on the Overcooked benchmark show that CooT significantly outperforms existing methods in cooperation tasks with previously unseen partners. Human evaluations further confirm CooT as the most effective cooperation partner, and extensive ablation studies highlight its robustness, flexibility, and context-sensitivity in multi-agent scenarios.