This paper proposes using a Gaussian Mixture Model (GMM) as the reverse transition kernel within the widely used Denoising Diffusion Implicit Models (DDIM) framework for accelerated sampling from Denoising Diffusion Probabilistic Models (DDPMs). Specifically, we constrain the parameters of the GMM to match the first and second central moments of the DDPM forward marginal distributions. We demonstrate that moment matching is sufficient to obtain samples of equal or better quality than conventional DDIM sampling with a Gaussian kernel. We present experimental results for text-to-image generation with Stable Diffusion v2.1 on the COYO-700M dataset, for unconditional models trained on CelebA-HQ and FFHQ, and for class-conditional models trained on ImageNet; across these settings, the results suggest that the GMM kernel with a small number of sampling steps significantly improves the quality of generated samples, as measured by the FID and IS metrics. For example, with 10 sampling steps on ImageNet 256x256, the GMM kernel achieves an FID of 6.94 and an IS of 207.85, whereas the Gaussian kernel achieves an FID of 10.15 and an IS of 196.73. Furthermore, we derive a novel SDE sampler for rectified flow models and conduct experiments with the proposed approach, observing improvements for both 1-rectified flow and 2-rectified flow models.
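To make the moment-matching constraint concrete, the following 1-D sketch (our own illustration, not the paper's parameterization) builds a symmetric two-component GMM whose first two moments match a target Gaussian marginal N(mu, sigma^2); the component offset is a free parameter, and the residual variance is absorbed into each component:

```python
import numpy as np

def gmm_moments(weights, means, variances):
    """First moment and second central moment (variance) of a 1-D GMM."""
    w, mu, var = map(np.asarray, (weights, means, variances))
    mean = np.sum(w * mu)
    # E[x^2] = sum_k w_k * (var_k + mu_k^2);  Var[x] = E[x^2] - mean^2
    variance = np.sum(w * (var + mu ** 2)) - mean ** 2
    return mean, variance

def matched_two_component_gmm(target_mean, target_var, offset):
    """Equal-weight 2-component GMM matching N(target_mean, target_var)
    in its first two moments; requires offset**2 < target_var."""
    assert offset ** 2 < target_var
    weights = np.array([0.5, 0.5])
    means = np.array([target_mean - offset, target_mean + offset])
    comp_var = target_var - offset ** 2  # leftover variance per component
    variances = np.array([comp_var, comp_var])
    return weights, means, variances
```

With equal weights and symmetric offsets, the mixture mean stays at `target_mean` and the variance decomposes as the within-component variance plus the spread of the component means, so the match is exact for any admissible offset.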