In this paper, we propose a novel method that enables GNNs to solve SAT problems by applying a technique originally developed for mixed-integer linear programming (MILP). After mapping a k-CNF formula to a MILP instance, we encode the instance as a weighted bipartite graph, which serves as the input on which the GNN is trained and tested. Theoretically, we (i) establish permutation and equivalence invariance results, showing that the GNN's output is stable under reorderings of clauses and variables, (ii) prove an impossibility result: standard GNNs cannot always distinguish satisfiable from unsatisfiable instances within a class of formulas we call foldable formulas, and (iii) prove a universal approximation theorem showing that GNNs with random node initialization (RNI) can approximate SAT solutions to arbitrary precision on a finite dataset (i.e., such GNNs are approximately sound and complete on that dataset). We further show that the same approximation guarantees can be achieved without RNI once foldable formulas are excluded. Finally, we present experimental results that are promising despite the simplicity of the neural network architecture.
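To make the encoding pipeline concrete: under the standard SAT-to-MILP translation, a clause such as x1 ∨ ¬x2 ∨ x3 becomes the 0-1 constraint x1 + (1 − x2) + x3 ≥ 1, so each positive literal contributes coefficient +1, each negated literal contributes −1, and the constants from negated literals shift the right-hand side. These coefficients are exactly the edge weights of the bipartite graph between clause (constraint) nodes and variable nodes. The sketch below illustrates one way such an encoding could look; the helper name cnf_to_bipartite, the dense-matrix representation, and the choice of clause-node feature are illustrative assumptions, not the paper's exact construction.

```python
# Minimal sketch of the CNF -> MILP -> weighted bipartite graph encoding.
# Assumption: clauses are given DIMACS-style as lists of signed integers,
# e.g. [1, -2, 3] denotes the clause x1 OR (NOT x2) OR x3.
import numpy as np

def cnf_to_bipartite(clauses, num_vars):
    """Encode a k-CNF formula as a weighted bipartite graph.

    Each clause j yields one MILP row: sum of +x_i over positive literals
    plus (1 - x_i) over negated literals must be >= 1. Returns:
      W -- biadjacency matrix; W[j, i] is the coefficient (+1/-1/0) of
           variable i in clause j, i.e. the weight of edge (clause j, var i)
      b -- right-hand side of each row, used here as the clause-node feature
    """
    W = np.zeros((len(clauses), num_vars))
    b = np.ones(len(clauses))            # each row starts as "... >= 1"
    for j, clause in enumerate(clauses):
        for lit in clause:
            i = abs(lit) - 1             # DIMACS variables are 1-indexed
            W[j, i] = 1.0 if lit > 0 else -1.0
            if lit < 0:
                b[j] -= 1.0              # (1 - x_i) moves a constant 1 to the RHS
    return W, b

# (x1 OR NOT x2 OR x3) AND (NOT x1 OR x2): 2 clause nodes, 3 variable nodes.
W, b = cnf_to_bipartite([[1, -2, 3], [-1, 2]], num_vars=3)
print(W)  # [[ 1. -1.  1.]
          #  [-1.  1.  0.]]   edge weights of the bipartite graph
print(b)  # [0. 0.]           per-clause RHS (clause-node features)
```

A GNN then performs message passing over this bipartite graph, alternating between variable and clause nodes, with W supplying the edge weights; the satisfiability prediction is read off a pooled graph-level representation.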