Daily Arxiv

This is a page that curates AI-related papers published worldwide.
All content here is summarized using Google Gemini, and the site is operated on a non-profit basis.
Copyright for each paper belongs to the authors and their institutions; please make sure to credit the source when sharing.

Sparsity Outperforms Low-Rank Projections in Few-Shot Adaptation

Created by
  • Haebom

Authors

Nairouz Mrabah, Nicolas Richet, Ismail Ben Ayed, Eric Granger

Outline

This paper proposes a Sparse Optimization (SO) framework to address the overfitting and computational constraints that arise when adapting Vision-Language Models (VLMs) to new domains. Unlike existing low-rank reparameterization methods, SO exploits sparsity in the full, high-dimensional parameter space, dynamically updating only a minimal subset of parameters at each step. Specifically, it introduces two paradigms, "local sparsity and global density" and "local randomness and global importance", to mitigate overfitting and ensure stable adaptation in low-data regimes. Experiments on 11 diverse datasets show that SO achieves state-of-the-art few-shot adaptation performance while reducing memory overhead.
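The core mechanism described above, updating only a tiny, changing subset of an otherwise dense weight tensor at each iteration, can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the function name sparse_update_step, the random-mask selection rule, the sparsity ratio, and the plain SGD step are all assumptions made for the example.

```python
# Minimal sketch (assumed PyTorch; not the paper's code): update only a small,
# randomly chosen subset of a dense weight tensor per step, illustrating the
# general idea of sparse updates in the full parameter space. The sparsity
# ratio, random selection rule, and plain SGD step are illustrative assumptions.
import torch

def sparse_update_step(weight: torch.Tensor,
                       grad: torch.Tensor,
                       lr: float = 1e-3,
                       sparsity: float = 0.999) -> None:
    """In-place SGD step on roughly a (1 - sparsity) fraction of `weight`."""
    # Pick a small random subset of coordinates to touch this iteration.
    mask = torch.rand_like(weight) > sparsity
    # The dense weight stays intact ("global density"); only the masked
    # entries receive this step's gradient ("local sparsity").
    weight -= lr * grad * mask

# Toy usage on a single weight matrix.
w = torch.randn(512, 512, requires_grad=True)
loss = (w @ torch.randn(512)).pow(2).mean()
loss.backward()
with torch.no_grad():
    sparse_update_step(w, w.grad)
```

Because the mask is redrawn on every call, different coordinates are updated on different iterations, which is one way such extremely sparse per-step updates can still adapt the model broadly over the course of training.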

Takeaways, Limitations

Takeaways:
  • An effective SO framework for improving the domain adaptation performance of VLMs in low-data settings.
  • Lower memory usage and computational cost than existing low-rank methods.
  • Introduces the new paradigms of "local sparsity and global density" and "local randomness and global importance".
  • Achieves state-of-the-art performance across 11 diverse datasets.
Limitations:
  • Limited detail on hyperparameter tuning for the proposed SO framework.
  • Generalization to other VLM architectures remains to be verified.
  • Applicability and performance on large-scale datasets are not evaluated.