In this study, we develop a dimensionality reduction technique to accelerate diffusion model inference for synthetic data generation, by integrating compressed sensing into the diffusion model (CSDM). First, we compress the data into a latent space and train a diffusion model there. Next, we decode samples generated in the latent space back into the original space using a compressed sensing algorithm, improving the efficiency of both training and inference. Under suitable sparsity assumptions on the data, the proposed approach provably converges faster, since diffusion model inference is combined with sparse recovery, and the analysis also yields guidance on the optimal choice of latent space dimension. To demonstrate the effectiveness of the approach, we conduct numerical experiments on various datasets, including handwritten digits, medical and climate images, and financial time series for stress testing.
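To make the three-step pipeline concrete, the following is a minimal sketch under simplifying assumptions: the encoder is a fixed random Gaussian sensing matrix, the decoder is ISTA (iterative soft-thresholding) for l1-regularized sparse recovery, and the latent diffusion model itself is stubbed out. All names and parameters here (`compress`, `ista_recover`, the latent dimension `m`) are illustrative choices, not the paper's actual implementation.

```python
# Sketch of the CSDM pipeline: compress -> (diffuse in latent space) -> recover.
import numpy as np

rng = np.random.default_rng(0)

def compress(x, A):
    """Step 1: project data x (n-dim) into the latent space via y = A @ x, m < n."""
    return A @ x

def ista_recover(y, A, lam=0.01, n_iter=500):
    """Step 3: recover a sparse x with y ~= A @ x via iterative soft-thresholding,
    i.e. minimize ||A x - y||^2 / 2 + lam * ||x||_1."""
    m, n = A.shape
    step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1 / Lipschitz constant of the gradient
    x = np.zeros(n)
    for _ in range(n_iter):
        z = x - step * (A.T @ (A @ x - y))   # gradient step on the quadratic term
        # soft-thresholding: proximal operator of the l1 penalty
        x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)
    return x

# --- toy demonstration on a synthetic sparse signal ---
n, m, k = 256, 64, 8                               # ambient dim, latent dim, sparsity
A = rng.standard_normal((m, n)) / np.sqrt(m)       # random Gaussian sensing matrix

x_true = np.zeros(n)                               # a k-sparse "data" vector
support = rng.choice(n, size=k, replace=False)
x_true[support] = rng.standard_normal(k)

y = compress(x_true, A)                            # step 1: compress into latent space
# step 2 (omitted): train a diffusion model on latents like y, then sample new latents
x_hat = ista_recover(y, A)                         # step 3: decode back to original space

print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```

In the full method, step 2 would replace `y` with a latent sample drawn from the trained diffusion model; the sketch only exercises the compression and sparse-recovery steps that bracket it.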