This paper presents a novel quadratic binary optimization (QBO) model for quantized neural network training using quantum computing. Spline interpolation allows the use of arbitrary activation and loss functions. To address the challenges posed by the nonlinearity and the multilayered structure of neural networks, we introduce the forward interval propagation (FIP) technique, which discretizes the activation function into linear subintervals. This method preserves the universal approximation property of neural networks while enabling the optimization of complex nonlinear functions on quantum computers, broadening the applicability of quantum computing in artificial intelligence. From an optimization perspective, we derive the sample complexity of the empirical risk minimization problem, providing theoretical upper bounds on the approximation error and the required number of Ising spins. A key challenge in solving the resulting large-scale quadratic constrained binary optimization (QCBO) model is the presence of numerous constraints. To address this, we solve the QCBO problem directly with the quantum conditional gradient descent (QCGD) algorithm. We prove that QCGD converges under a quantum oracle that returns random objective values with bounded variance, and under limited precision of the coefficient matrix, and we provide an upper bound on the time-to-solution of the QCBO solving process. We also propose a training algorithm that incorporates single-sample, bit-scale optimization.
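
The abstract only names the FIP idea; as a rough, hedged illustration of the underlying mechanism (discretizing each activation into linear subintervals and propagating per-neuron intervals layer by layer), the following Python sketch may help. The function names (`linear_pieces`, `forward_intervals`), the interval-arithmetic bounds, and the choice of `k` subintervals are our assumptions for illustration, not the paper's formulation; in the actual QBO model the subinterval selection would be encoded with binary (Ising) variables rather than computed in floating point.

```python
import numpy as np

def linear_pieces(f, lo, hi, k):
    """Discretize activation f on [lo, hi] into k linear subintervals.

    Returns the breakpoints x[0..k] and the slope/intercept of the chord
    on each subinterval, i.e. a piecewise-linear surrogate of f.
    """
    x = np.linspace(lo, hi, k + 1)
    y = f(x)
    slopes = (y[1:] - y[:-1]) / (x[1:] - x[:-1])
    intercepts = y[:-1] - slopes * x[:-1]
    return x, slopes, intercepts

def forward_intervals(weights, biases, f, x0_lo, x0_hi, k=8):
    """Propagate an input box through the network layer by layer.

    Each layer's pre-activation interval follows from standard interval
    arithmetic on the affine map; activation bounds are then read off the
    piecewise-linear surrogate evaluated at its breakpoints.
    """
    lo, hi = np.asarray(x0_lo, float), np.asarray(x0_hi, float)
    for W, b in zip(weights, biases):
        Wp, Wn = np.maximum(W, 0.0), np.minimum(W, 0.0)
        z_lo = Wp @ lo + Wn @ hi + b      # smallest pre-activation
        z_hi = Wp @ hi + Wn @ lo + b      # largest pre-activation
        lo, hi = np.empty_like(z_lo), np.empty_like(z_hi)
        for i, (a, c) in enumerate(zip(z_lo, z_hi)):
            x, s, t = linear_pieces(f, a, c, k)
            vals = np.append(s * x[:-1] + t, s[-1] * x[-1] + t[-1])
            lo[i], hi[i] = vals.min(), vals.max()
    return lo, hi

# Hypothetical usage: a two-layer tanh network on the input box [-1, 1]^2.
rng = np.random.default_rng(0)
Ws = [rng.standard_normal((3, 2)), rng.standard_normal((1, 3))]
bs = [np.zeros(3), np.zeros(1)]
print(forward_intervals(Ws, bs, np.tanh, [-1, -1], [1, 1]))
```

The sketch makes the trade-off visible: a larger `k` tightens the piecewise-linear approximation of the activation but, in the binary formulation, increases the number of spins and constraints the quantum solver must handle.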