This paper develops an objective for controlled generation within the Variational Flow Matching (VFM) framework. We cast flow matching as a variational inference problem and show that controlled generation can be realized in two ways: (1) through end-to-end training of conditional generative models, or (2) as a Bayesian inference problem, enabling post hoc control of unconditional models without retraining. Furthermore, we establish conditions for equivariant generation and provide an equivariant formulation of VFM suited to molecule generation, ensuring invariance to rotations, translations, and permutations. Our results demonstrate strong performance on both uncontrolled and controlled molecule generation, outperforming state-of-the-art models in both the end-to-end training and Bayesian inference settings. By strengthening the connection between flow-based generative modeling and Bayesian inference, this work provides a scalable and principled framework for constraint-based and symmetry-aware generation.