This paper proposes a generative, end-to-end solver for black-box combinatorial optimization of NP-hard problems. Inspired by annealing-based algorithms, we treat the black-box objective as an energy function and train a neural network that models the associated Boltzmann distribution. By conditioning on temperature, the network captures a continuum of distributions, ranging from nearly uniform at high temperatures to sharply peaked around the global optimum at low temperatures. This allows the network to learn the structure of the energy landscape and facilitates global optimization. When queries are expensive, the temperature-dependent distribution naturally enables data augmentation and improves sample efficiency. When queries are cheap but the problem is intrinsically hard, the model effectively "opens" the black box by learning implicit variable interactions. We validate our approach on challenging combinatorial tasks under both limited and unbounded query budgets, demonstrating competitive performance against state-of-the-art black-box optimizers.
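To make the core idea concrete, a minimal sketch follows. It is not the paper's implementation: it assumes PyTorch, substitutes a fully factorized (mean-field) model for whatever architecture the paper actually uses, takes a toy random QUBO as the black-box energy, and trains by minimizing the variational free energy with a REINFORCE estimator. All names and hyperparameters are illustrative.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
n = 16                                        # number of binary variables
Q = torch.randn(n, n); Q = (Q + Q.T) / 2      # random symmetric QUBO matrix

def energy(x):
    # Stand-in black-box objective: queried only through its values.
    return torch.einsum('bi,ij,bj->b', x, Q, x)

# Temperature-conditioned model q(x | beta): maps inverse temperature beta
# to n independent Bernoulli logits (a mean-field simplification).
net = nn.Sequential(nn.Linear(1, 64), nn.ReLU(), nn.Linear(64, n))
opt = torch.optim.Adam(net.parameters(), lr=1e-2)

for step in range(2000):
    beta = torch.rand(1, 1) * 5.0 + 0.1          # sample a training temperature
    probs = torch.sigmoid(net(beta)).squeeze(0)  # Bernoulli parameters, shape (n,)
    x = torch.bernoulli(probs.expand(256, n))    # 256 samples from q(x | beta)
    logq = (x * torch.log(probs + 1e-9)
            + (1 - x) * torch.log(1 - probs + 1e-9)).sum(-1)
    with torch.no_grad():
        # Per-sample variational free energy: E(x) + (1/beta) * log q(x).
        f = energy(x) + logq / beta.squeeze()
        f = f - f.mean()                         # baseline for variance reduction
    loss = (f * logq).mean()                     # REINFORCE (score-function) estimator
    opt.zero_grad(); loss.backward(); opt.step()

# At low temperature (large beta) the learned distribution should concentrate
# near low-energy configurations, so sampling acts as an optimizer.
with torch.no_grad():
    probs = torch.sigmoid(net(torch.tensor([[5.0]])))
    x = torch.bernoulli(probs.expand(512, n))
    print("best energy found:", energy(x).min().item())
```

Because the model is trained across a range of temperatures rather than at a single one, every evaluated sample informs the whole family of distributions, which is the mechanism behind the data-augmentation and sample-efficiency claims above.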