This paper presents a GPU-based exhaustive search method for finding the global minima of nonlinear functions under simple constraints on the variables. By combining interval analysis with the performance and architecture of GPUs, the method iteratively deletes regions that cannot contain a global minimum, leaving a finite set of regions that must contain all global minima. Owing to the rigor of interval analysis, the enclosure of the global minima is guaranteed even in the presence of round-off errors. For efficiency, a novel GPU-based single-program, single-data parallel programming style is adopted to avoid GPU performance bottlenecks, and variable-rotation techniques are incorporated to reduce the computational cost of minimizing large-scale nonlinear functions. The effectiveness of the method is verified by minimizing ten multimodal benchmark functions of up to 10,000 dimensions, including the Ackley, Griewank, Levy, and Rastrigin functions. We successfully enclose the global minima of benchmark functions with 80 or more dimensions, which has not been reported in previous studies.
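To make the deletion step concrete, the following is a minimal CUDA sketch of an interval exclusion test of the kind the method relies on: each thread computes a rigorous interval lower bound of the objective over one box and discards the box if that bound exceeds a known upper bound on the global minimum. This is not the paper's implementation; the 1-D test function f(x) = (x - 1)^2, the box partition, and the incumbent bound fBest are illustrative assumptions. The directed-rounding intrinsics (__dadd_rd, __dmul_ru, etc.) are standard CUDA and supply the outward rounding that keeps the enclosure valid despite round-off.

```cuda
// Sketch of a GPU interval exclusion test (illustrative, not the authors' code).
#include <cstdio>
#include <cuda_runtime.h>

struct Interval { double lo, hi; };

// x - c with directed rounding, so the result still encloses the true range.
__device__ Interval subI(Interval x, double c) {
    Interval r;
    r.lo = __dadd_rd(x.lo, -c);   // round toward -inf
    r.hi = __dadd_ru(x.hi, -c);   // round toward +inf
    return r;
}

// Rigorous enclosure of x^2 over the interval x.
__device__ Interval sqrI(Interval x) {
    double a = fabs(x.lo), b = fabs(x.hi);
    Interval r;
    if (x.lo <= 0.0 && 0.0 <= x.hi) {
        r.lo = 0.0;                          // box straddles zero
    } else {
        double mn = fmin(a, b);
        r.lo = __dmul_rd(mn, mn);            // rounded-down lower bound
    }
    double mx = fmax(a, b);
    r.hi = __dmul_ru(mx, mx);                // rounded-up upper bound
    return r;
}

// One thread per box: keep the box only if a global minimum could lie inside.
__global__ void exclusionTest(const Interval* boxes, int n,
                              double fBest, int* keep) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;
    Interval fx = sqrI(subI(boxes[i], 1.0)); // enclosure of f(x) = (x-1)^2
    // If even the lower bound of f over this box exceeds the incumbent
    // upper bound, no global minimum can exist inside: delete the box.
    keep[i] = (fx.lo <= fBest) ? 1 : 0;
}

int main() {
    const int N = 16;
    Interval h_boxes[N];
    double lo = -4.0, w = 8.0 / N;           // partition [-4, 4] into N boxes
    for (int i = 0; i < N; ++i)
        h_boxes[i] = { lo + i * w, lo + (i + 1) * w };

    Interval* d_boxes; int* d_keep;
    cudaMalloc(&d_boxes, N * sizeof(Interval));
    cudaMalloc(&d_keep, N * sizeof(int));
    cudaMemcpy(d_boxes, h_boxes, N * sizeof(Interval), cudaMemcpyHostToDevice);

    double fBest = 0.25;                     // e.g. f sampled at x = 1.5
    exclusionTest<<<1, N>>>(d_boxes, N, fBest, d_keep);

    int h_keep[N];
    cudaMemcpy(h_keep, d_keep, N * sizeof(int), cudaMemcpyDeviceToHost);
    for (int i = 0; i < N; ++i)
        if (h_keep[i])
            printf("kept box [%g, %g]\n", h_boxes[i].lo, h_boxes[i].hi);

    cudaFree(d_boxes); cudaFree(d_keep);
    return 0;
}
```

In the full method this test would apply to n-dimensional boxes inside an iterative bisect-and-discard loop, with all boxes processed in lockstep so that every thread executes the same program, in keeping with the single-program, single-data style described above.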