This paper presents a novel approach to the problem of clustering time series. Specifically, we propose a method for partitioning a given set of time series into a given number of subsets and learning a linear dynamical system (LDS) model for each subset, with the goal of minimizing the maximum error across all models. We present a globally convergent method and an EM heuristic, and report promising computational results. Key features include eliminating the need for a predefined hidden-state dimension and providing guidelines for choosing the regularization applied during system identification.
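To make the alternation behind the EM heuristic concrete, here is a minimal sketch of one plausible realization: each series is assigned to the model with the smallest prediction error, and one model is then refit per cluster. Everything in it is an illustrative assumption, not the paper's method: the function names (`fit_lds`, `model_error`, `cluster_lds`), the use of a simple first-order least-squares autoregressive fit as an LDS surrogate, and the empty-cluster reseeding rule. The paper's globally convergent algorithm and its hidden-state-free identification are not reproduced here.

```python
import numpy as np


def fit_lds(members, rng, all_series):
    """Fit x_{t+1} ~ A x_t by least squares over all assigned series
    (a simple LDS surrogate; assumed, not the paper's identification)."""
    if not members:  # empty cluster: reseed with a random series (assumption)
        members = [all_series[rng.integers(len(all_series))]]
    X = np.concatenate([s[:-1] for s in members], axis=0)
    Y = np.concatenate([s[1:] for s in members], axis=0)
    # lstsq solves X @ W ~ Y, so W = A^T
    W, *_ = np.linalg.lstsq(X, Y, rcond=None)
    return W.T


def model_error(series, A):
    """Maximum one-step prediction error of model A on a single series,
    echoing the min-max flavor of the objective."""
    pred = series[:-1] @ A.T
    return np.max(np.linalg.norm(series[1:] - pred, axis=1))


def cluster_lds(series, k, iters=20, seed=0):
    """EM-style alternation: reassign series to best-fitting models,
    then refit one model per cluster, until assignments stabilize."""
    rng = np.random.default_rng(seed)
    labels = rng.integers(k, size=len(series))
    models = []
    for _ in range(iters):
        models = [fit_lds([s for s, l in zip(series, labels) if l == c],
                          rng, series)
                  for c in range(k)]
        new_labels = np.array([np.argmin([model_error(s, A) for A in models])
                               for s in series])
        if np.array_equal(new_labels, labels):
            break
        labels = new_labels
    return labels, models
```

Each series here is assumed to be a NumPy array of shape (T, d) of observed states; like k-means, this alternation can only stall at a local optimum, which is why a globally convergent alternative matters.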