In this paper, we propose Analytic Subspace Routing (ASR), a technique for continual learning (CL) in large language models (LLMs). Existing continual learning methods either replay previous data, incur substantial additional computational cost, or rely on a single parameter-efficient module, which limits the absorption of new knowledge. ASR isolates the learning of each task within a subspace of the deep layers' features, thereby eliminating knowledge interference between tasks, and exploits the knowledge learned in the different subspaces through an analytic routing mechanism. A multi-task router model is trained with recursive least squares, allowing the router to adapt dynamically to incoming data without access to past data, to assign the current task to an appropriate subspace, and to guarantee non-forgetting of previously learned tasks. Experimental results show that ASR retains previous knowledge almost perfectly while seamlessly integrating new information, effectively overcoming the limitations of existing methods.
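To make the analytic routing idea concrete, the following is a minimal sketch (not the authors' implementation) of a recursive-least-squares router update: a linear router over frozen deep features is refreshed task by task in closed form, without revisiting earlier data. The class name `AnalyticRouter`, the feature dimension, and the one-hot task targets are illustrative assumptions.

```python
import numpy as np


class AnalyticRouter:
    """Illustrative RLS router sketch (hypothetical, not the paper's code).

    Maintains a linear router W mapping frozen deep features to task
    (subspace) scores, updated one task at a time without storing past data.
    """

    def __init__(self, feat_dim, num_tasks, gamma=1.0):
        # R approximates the regularized inverse feature autocorrelation matrix.
        self.R = np.eye(feat_dim) / gamma
        # W holds the router weights (feat_dim x num_tasks).
        self.W = np.zeros((feat_dim, num_tasks))

    def update(self, X, task_id):
        """Absorb features X (n x feat_dim) from one task via an RLS update."""
        n = X.shape[0]
        Y = np.zeros((n, self.W.shape[1]))
        Y[:, task_id] = 1.0  # one-hot targets: route these samples to task_id

        # Woodbury-style update of the inverse autocorrelation matrix.
        K = np.linalg.inv(np.eye(n) + X @ self.R @ X.T)
        self.R = self.R - self.R @ X.T @ K @ X @ self.R
        # Closed-form weight correction using the residual on the new data only.
        self.W = self.W + self.R @ X.T @ (Y - X @ self.W)

    def route(self, x):
        """Return the index of the subspace with the highest router score."""
        return int(np.argmax(x @ self.W))
```

Because each update depends only on the current task's features and the recursively maintained matrix `R`, the router adapts to new tasks while its fit to earlier tasks is preserved in closed form, which is the non-forgetting property the abstract refers to.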