This paper demonstrates that feedforward neural networks with the ReLU activation function can generalize on well-defined, low-complexity data. Given iid data generated by a simple programming language, a minimum description length (MDL) feedforward neural network that interpolates the data generalizes with high probability. The paper defines this simple programming language and a notion of description length for such networks. It provides several examples on basic computational tasks, such as primality detection. For primality detection, the theorem states the following: consider an iid sample of n numbers drawn uniformly at random from 1 to N. For each number xi, set yi = 1 if xi is prime and yi = 0 otherwise. Then an interpolating MDL network correctly answers whether a newly drawn number from 1 to N is prime with probability 1 - O((ln N)/n). Note that the network is not designed to detect primes; minimum description length learning discovers networks that do. Extensions to noisy data are also discussed, suggesting that MDL neural network interpolators may exhibit mild overfitting.
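
To make the sampling setup in the primality example concrete, here is a minimal sketch (not from the paper; the function names, the choice of N and n, and the trial-division primality test are illustrative) that generates a labeled sample as described, n numbers drawn uniformly from 1 to N with labels indicating primality, and prints the (ln N)/n scale that appears in the stated error probability.

```python
import math
import random


def is_prime(x: int) -> bool:
    """Trial-division primality check; adequate for modestly sized N."""
    if x < 2:
        return False
    if x % 2 == 0:
        return x == 2
    d = 3
    while d * d <= x:
        if x % d == 0:
            return False
        d += 2
    return True


def sample_primality_data(n: int, N: int, seed: int = 0):
    """Draw n numbers uniformly at random from 1..N; label 1 if prime, else 0."""
    rng = random.Random(seed)
    xs = [rng.randint(1, N) for _ in range(n)]
    ys = [int(is_prime(x)) for x in xs]
    return xs, ys


if __name__ == "__main__":
    N, n = 10**6, 10**4  # illustrative values, not taken from the paper
    xs, ys = sample_primality_data(n, N)
    # Scale of the error probability in the stated theorem: O((ln N) / n).
    print(f"sampled {n} labels, {sum(ys)} primes; (ln N)/n = {math.log(N) / n:.2e}")
```

The sketch covers only the data-generating process; fitting the interpolating MDL network itself is the subject of the paper and is not reproduced here.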