This paper proposes Sliding Window Adversarial Training (SWAT), a novel method for Gradual Domain Adaptation (GDA) that addresses domain shift, a major cause of performance degradation in machine learning. SWAT forms an adversarial stream connecting the feature spaces of the source and target domains, and uses a sliding window paradigm to gradually close the small gaps between adjacent intermediate domains. Domain shift is thus explicitly reduced by the time the window reaches the end of the stream, i.e., the target domain. Extensive experiments on six GDA benchmarks demonstrate the effectiveness of SWAT, including a 6.1% improvement on Rotated MNIST and a 4.1% improvement on CIFAR-100C.
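To make the sliding-window idea concrete, the sketch below shows one plausible reading under stated assumptions: a window slides across a sequence of intermediate domains, and an adversarial critic aligns the feature distributions of the domains at the window's edges. This is a minimal illustration in PyTorch with toy synthetic domains; the names (`encoder`, `critic`, `window`) and the training details are assumptions for exposition, not the authors' implementation.

```python
# Minimal sketch of sliding-window adversarial alignment (illustrative only).
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy intermediate domains: source -> ... -> target. Each "domain" is
# 2-D Gaussian data whose mean drifts gradually, mimicking a small
# shift between adjacent intermediate domains.
num_domains, n, dim = 6, 256, 2
domains = [torch.randn(n, dim) + t * torch.tensor([3.0, 0.0])
           for t in torch.linspace(0, 1, num_domains)]

encoder = nn.Sequential(nn.Linear(dim, 16), nn.ReLU(), nn.Linear(16, 16))
critic = nn.Sequential(nn.Linear(16, 16), nn.ReLU(), nn.Linear(16, 1))
opt_enc = torch.optim.Adam(encoder.parameters(), lr=1e-3)
opt_crit = torch.optim.Adam(critic.parameters(), lr=1e-3)
bce = nn.BCEWithLogitsLoss()

window = 2  # align adjacent domain pairs; the window slides toward the target
for start in range(num_domains - window + 1):
    left, right = domains[start], domains[start + window - 1]
    for _ in range(100):
        # 1) Train the critic to distinguish the two domains in the window.
        with torch.no_grad():
            f_l, f_r = encoder(left), encoder(right)
        loss_c = (bce(critic(f_l), torch.zeros(n, 1))
                  + bce(critic(f_r), torch.ones(n, 1)))
        opt_crit.zero_grad()
        loss_c.backward()
        opt_crit.step()

        # 2) Train the encoder adversarially so the critic can no longer
        #    tell the later domain apart from the earlier one.
        loss_e = bce(critic(encoder(right)), torch.zeros(n, 1))
        opt_enc.zero_grad()
        loss_e.backward()
        opt_enc.step()
    print(f"aligned window [{start}, {start + window - 1}]")
```

Because each window only bridges a small shift between neighboring domains, the adversarial alignment problem stays easy at every step; by the final window, the target domain's features have been pulled onto the source feature distribution.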