LrcSSM is a nonlinear recurrent model that processes long sequences as fast as conventional linear state-space layers. By constraining the state-transition Jacobian to be diagonal, the entire sequence can be computed in parallel, giving $\mathcal{O}(TD)$ time and memory complexity and only $\mathcal{O}(\log T)$ sequential depth for input sequence length $T$ and state dimension $D$. Furthermore, unlike other input-varying systems such as Liquid-S4 or Mamba, it provides a formal guarantee of gradient stability. Importantly, the diagonal Jacobian constraint does not degrade performance relative to models with the original dense Jacobian, and the method generalizes to other nonlinear recurrent models, demonstrating its broad applicability. On long-range forecasting tasks, LrcSSM outperforms Transformer, LRU, S5, and Mamba.
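
To make the parallelization argument concrete, the sketch below (not the authors' code; function names and shapes are illustrative) shows why a diagonal Jacobian matters: the per-step recurrence then reduces to an element-wise affine update $x_t = a_t \odot x_{t-1} + b_t$, which is associative and can be solved for all $T$ steps at once with a parallel prefix scan, giving the stated $\mathcal{O}(TD)$ work and $\mathcal{O}(\log T)$ sequential depth.

```python
# Minimal sketch, assuming the recurrence has been linearized to a
# per-step diagonal (element-wise) update x_t = a_t * x_{t-1} + b_t,
# where a_t and b_t are produced from the input at step t.
import jax
import jax.numpy as jnp

def diag_recurrence_scan(a, b):
    """Solve x_t = a_t * x_{t-1} + b_t for all t via a parallel scan.

    a, b: arrays of shape (T, D) -- diagonal transition and input terms.
    Returns x of shape (T, D), assuming x_0 = 0.
    """
    def combine(left, right):
        a_l, b_l = left
        a_r, b_r = right
        # Composition of two element-wise affine maps x -> a * x + b.
        return a_r * a_l, a_r * b_l + b_r

    _, x = jax.lax.associative_scan(combine, (a, b), axis=0)
    return x

# Usage check: the parallel scan matches the sequential recurrence.
T, D = 8, 4
a = jax.random.uniform(jax.random.PRNGKey(0), (T, D), minval=0.1, maxval=0.9)
b = jax.random.normal(jax.random.PRNGKey(1), (T, D))

x_parallel = diag_recurrence_scan(a, b)

x, xs = jnp.zeros(D), []
for t in range(T):
    x = a[t] * x + b[t]
    xs.append(x)
assert jnp.allclose(x_parallel, jnp.stack(xs), atol=1e-5)
```

With a dense Jacobian the combine step would require a $D \times D$ matrix product per element, raising the cost to $\mathcal{O}(TD^2)$ or worse; the diagonal restriction is what keeps the scan at $\mathcal{O}(TD)$.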