This paper proposes generalized orders of magnitude (GOOMs) to address the numerical underflow and overflow that arise in domains such as deep learning and finance, where long chains of compounded real-number computations are common. GOOMs extend the traditional notion of order of magnitude and subsume floating-point numbers as a special case, enabling stable computation over real numbers with a far greater dynamic range than standard floating-point formats allow. We implement GOOMs with an efficient, custom-built parallel prefix scan to support native execution on parallel hardware such as GPUs. The proposed GOOMs outperform existing methods in three representative experiments: extending real matrix multiplication far beyond the floating-point range, estimating the spectrum of Lyapunov exponents, and capturing long-term dependencies in deep recurrent neural networks with non-diagonal recurrent states, a combination that was previously infeasible. Together, GOOMs and efficient parallel scans provide a scalable and numerically robust alternative to traditional floating-point numbers for high-dynamic-range applications.
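To make the core idea concrete, the following is a minimal NumPy sketch of one way a GOOM-style representation can work, assuming the complex-logarithm encoding of signed reals suggested by the abstract; the names `to_goom` and `from_goom` are illustrative, not the paper's API. The key property is that long chains of products become sums in GOOM space, which is why a parallel prefix scan is the natural primitive for compounding them on GPUs.

```python
import numpy as np

def to_goom(x):
    """Map real numbers to generalized orders of magnitude (illustrative).

    The principal complex logarithm encodes the magnitude in the real
    part (log|x|) and the sign in the imaginary part (0 for x > 0,
    pi for x < 0).
    """
    return np.log(np.asarray(x, dtype=np.complex128))

def from_goom(g):
    """Map a GOOM back to a real number. This can under/overflow only
    if the *final* value itself lies outside the float range."""
    return np.exp(g).real

# A chained product of 10,000 small factors underflows in float64 ...
xs = np.full(10_000, 0.01)
print(np.prod(xs))        # 0.0

# ... but in GOOM space the product becomes a sum (in practice,
# a parallel prefix scan), and the result stays finite.
g = np.sum(to_goom(xs))
print(g.real)             # about -46051.7 == 10_000 * log(0.01)

# Signs survive the round trip via the imaginary part:
print(from_goom(to_goom(-2.0) + to_goom(-3.0)))   # about 6.0
```

In a real implementation the summation would be replaced by a prefix scan over log-space quantities (and, for matrix products, a scan over log-matrix compositions), so that all intermediate results are available in parallel rather than only the final one.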