Existing NeRF-based segmentation methods focus on object semantics and rely solely on RGB data, limiting their ability to account for intrinsic material properties. In this paper, we present UnMix-NeRF, a framework that integrates spectral unmixing into NeRF to jointly perform hyperspectral novel view synthesis and unsupervised material segmentation. Spectral reflectance is modeled via diffuse and specular components, while a learned global end-member dictionary represents pure material signatures and point-wise abundances capture their per-point distribution. Unsupervised material clustering is performed using spectral signature predictions alongside the learned end-members. Furthermore, by modifying the learned end-member dictionary, we enable flexible material-based appearance manipulation for scene editing. Extensive experiments demonstrate superior spectral reconstruction and material segmentation performance compared to existing methods.
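To make the unmixing formulation concrete, the following is a minimal NumPy sketch of linear spectral mixing as described above: a point's diffuse reflectance is a convex combination of global end-member signatures weighted by its abundance vector, and the abundances themselves can drive unsupervised material assignment. All names (`K`, `B`, `endmembers`, `abundances`) and the softmax parameterization are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

# Illustrative sketch (not the paper's implementation): each 3D point's
# diffuse reflectance over B hyperspectral bands is a convex combination
# of K global end-member signatures (pure material spectra).

rng = np.random.default_rng(0)

K, B = 4, 32                              # assumed: K end-members, B bands
endmembers = rng.random((K, B))           # learned global dictionary (K x B)

logits = rng.standard_normal(K)           # per-point abundance logits
abundances = np.exp(logits) / np.exp(logits).sum()  # softmax -> simplex

# Linear mixing: mixed diffuse reflectance is abundances @ endmembers.
diffuse = abundances @ endmembers         # shape (B,)

# Unsupervised segmentation can cluster points by abundance vectors;
# the simplest assignment is the dominant end-member per point.
material_id = int(abundances.argmax())
```

A design note: constraining abundances to the simplex (non-negative, summing to one) is what makes each point interpretable as a mixture of pure materials, so clustering in abundance space yields material labels without supervision.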