Model merging is a technique that combines multiple expert models into a single model capable of performing multiple tasks. However, as the number of expert models increases, the gains from merging diminish and overall performance improvements shrink. This study explains and analyzes this phenomenon from a task arithmetic perspective, showing that across existing merging methods, the rank of the associated task vector space collapses as more and more expert models are merged. To mitigate this problem, we propose Subspace Boosting, which operates on the singular value decomposition of the task vector space and preserves task vector ranks. Subspace Boosting improves merging efficacy by more than 10% for up to 20 expert models when evaluated on vision and language benchmarks. Furthermore, we present a new, interpretable perspective on model merging by using Higher-Order Generalized Singular Value Decomposition (HO-GSVD) to quantify task similarity.
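To make the high-level description concrete, the following is a minimal NumPy sketch of the idea: take the task vectors (fine-tuned weights minus pretrained weights), decompose each with an SVD, lift the small singular values so the merged task vector space does not lose rank, and merge with standard task arithmetic. The boosting rule (adding a fraction of the largest singular value), the coefficient beta, and the merge scale alpha are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

def subspace_boost(task_vector, beta=0.1):
    """Illustrative sketch of Subspace Boosting for one weight matrix.

    Lifts small singular values toward a flatter spectrum so that
    low-energy directions are not crushed when many experts are merged.
    `beta` is an assumed boosting coefficient, not from the paper.
    """
    U, S, Vt = np.linalg.svd(task_vector, full_matrices=False)
    S_boosted = S + beta * S.max()  # assumed boosting rule
    return U @ np.diag(S_boosted) @ Vt

# Task vectors: difference between each fine-tuned expert's weights and
# the shared pretrained weights, one matrix per expert for a given layer.
rng = np.random.default_rng(0)
pretrained = rng.standard_normal((64, 64))
experts = [pretrained + 0.01 * rng.standard_normal((64, 64)) for _ in range(3)]
task_vectors = [w - pretrained for w in experts]

# Task-arithmetic merge: pretrained weights plus the scaled sum of
# boosted task vectors (alpha is the usual task-arithmetic coefficient).
alpha = 0.3
merged = pretrained + alpha * sum(subspace_boost(tv) for tv in task_vectors)
```

Without the boosting step, summing many near-parallel task vectors concentrates energy in a few shared directions, which is the rank collapse the analysis describes; flattening each spectrum before summation keeps per-task directions represented in the merged model.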