The ensemble average of a molecule's physical properties is determined by its molecular structure distribution, and sampling this distribution is a fundamental challenge in physics and chemistry. Conventional methods, such as molecular dynamics (MD) simulations and Markov chain Monte Carlo (MCMC) sampling, can be time-consuming and expensive. Diffusion models have emerged as efficient alternatives that learn the distribution of their training data; however, the samples they generate inherit any bias present in that data. To overcome this limitation, we propose a potential score matching (PSM) method that uses potential energy gradients to guide generative models. PSM does not require an exact energy function and can eliminate bias in sample distributions even when trained on limited and biased data. We demonstrate that PSM outperforms existing state-of-the-art (SOTA) models on a commonly used toy model, the Lennard-Jones (LJ) potential, as well as on the high-dimensional MD17 and MD22 datasets. We also show that the molecular distribution generated by PSM approximates the Boltzmann distribution more closely than those of conventional diffusion models.