To overcome the limitations of existing model merging techniques, this paper proposes M2N2, a novel evolutionary algorithm inspired by the concept of natural niches. M2N2 explores diverse model combinations and improves performance through three key mechanisms: dynamically adjusting the boundaries along which model parameters are merged, maintaining diversity in the population, and selecting promising pairs of models to merge. In experiments, when evolving an MNIST classifier from scratch, M2N2 matches the performance of CMA-ES while being more computationally efficient; it also achieves state-of-the-art results when merging language models and image generation models. Notably, the merged models are robust and preserve important capabilities even when those capabilities are not explicitly optimized for by the fitness function.
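The three mechanisms can be illustrated with a toy sketch. This is not M2N2's actual implementation, merely a minimal evolutionary loop under invented assumptions: parameters are flat lists of floats, `fitness` is a stand-in closeness score to a hypothetical `TARGET` vector, merging splices two parameter vectors at a randomly sampled boundary, and pair selection favors one high-fitness parent plus a second parent that is both fit and maximally different from it.

```python
import random

random.seed(0)
DIM, POP, GENS = 8, 12, 200
TARGET = [1.0] * DIM  # hypothetical optimum; stand-in for a real task objective

def fitness(theta):
    # Toy task score: negative squared distance to TARGET
    # (a stand-in for, e.g., validation accuracy).
    return -sum((t - x) ** 2 for t, x in zip(TARGET, theta))

def distance(a, b):
    # L1 distance between two parameter vectors, used as a diversity signal.
    return sum(abs(x - y) for x, y in zip(a, b))

# Random initial population of "models" (flat parameter vectors).
pop = [[random.uniform(-1, 1) for _ in range(DIM)] for _ in range(POP)]

for _ in range(GENS):
    fits = [fitness(p) for p in pop]

    # Pair selection: restrict to the fitter half, pick the best individual,
    # then pick the partner most different from it (complementary niches).
    top = sorted(range(POP), key=lambda k: fits[k], reverse=True)[: POP // 2]
    i = max(top, key=lambda k: fits[k])
    j = max((k for k in top if k != i), key=lambda k: distance(pop[i], pop[k]))

    # Merge at a dynamically sampled parameter boundary, plus a small mutation.
    split = random.randrange(1, DIM)
    child = pop[i][:split] + pop[j][split:]
    child = [x + random.gauss(0, 0.05) for x in child]

    # Diversity maintenance (simplified): the child only replaces the weakest
    # individual, so the rest of the population keeps occupying its niches.
    worst = min(range(POP), key=lambda k: fits[k])
    if fitness(child) > fits[worst]:
        pop[worst] = child

best = max(pop, key=fitness)
```

In this sketch, the merge boundary is resampled every generation rather than truly evolved, and diversity is maintained only implicitly through worst-replacement; the paper's algorithm handles both more carefully.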