In this paper, we propose using a Gaussian Mixture Model (GMM) as the reverse-transition operator (kernel) within the framework of the Denoising Diffusion Implicit Model (DDIM), one of the most widely used methods for accelerated sampling from pre-trained Denoising Diffusion Probabilistic Models (DDPMs). Specifically, we constrain the GMM parameters to match the first and second central moments of the DDPM forward marginal distributions. Our experiments show that moment matching is sufficient to obtain samples of equal or better quality than conventional DDIM with a Gaussian kernel. We report results for unconditional models trained on CelebAHQ and FFHQ and for class-conditional models trained on ImageNet. The results suggest that the GMM kernel significantly improves the quality of generated samples when the number of sampling steps is small, as measured by FID and IS. For example, with 10 sampling steps on ImageNet 256x256, the GMM kernel achieves FID 6.94 and IS 207.85, versus 10.15 and 196.73, respectively, for the Gaussian kernel.
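The moment-matching constraint described above can be illustrated with a minimal sketch. The construction below is an assumption for illustration, not the paper's actual kernel: it builds an equal-weight, two-component one-dimensional mixture with means mu ± delta and a shared component variance chosen so that the mixture's overall mean and variance equal given targets (here standing in for the forward-marginal moments).

```python
import numpy as np

def symmetric_gmm_params(mu, sigma2, delta):
    """Hypothetical two-component mixture matching target mean mu and variance sigma2.

    Equal weights, component means mu - delta and mu + delta, and shared
    component variance sigma2 - delta**2 (valid only when delta**2 < sigma2).
    Mixture mean:     0.5*(mu-delta) + 0.5*(mu+delta) = mu
    Mixture variance: (sigma2 - delta**2) + delta**2  = sigma2
    """
    assert delta ** 2 < sigma2, "delta too large for the target variance"
    weights = np.array([0.5, 0.5])
    means = np.array([mu - delta, mu + delta])
    comp_var = sigma2 - delta ** 2
    return weights, means, comp_var

def sample_gmm(weights, means, comp_var, n, rng):
    # Draw component indices, then sample from the chosen Gaussians.
    comps = rng.choice(len(weights), size=n, p=weights)
    return rng.normal(means[comps], np.sqrt(comp_var))

rng = np.random.default_rng(0)
w, m, v = symmetric_gmm_params(mu=1.0, sigma2=4.0, delta=1.0)
x = sample_gmm(w, m, v, 1_000_000, rng)
print(x.mean(), x.var())  # both close to the targets 1.0 and 4.0
```

The free parameter delta shows why moment matching leaves extra degrees of freedom: many distinct mixtures share the same first two moments, which is what allows the GMM kernel to differ from the single Gaussian while still matching the forward marginals' moments.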