
Daily Arxiv

This is a page that curates AI-related papers published worldwide.
All content here is summarized using Google Gemini and operated on a non-profit basis.
Copyright for each paper belongs to the authors and their institutions; please make sure to credit the source when sharing.

eMargin: Revisiting Contrastive Learning with Margin-Based Separation

Created by
  • Haebom

Author

Abdul-Kazeem Shamba, Kerstin Bach, Gavin Taylor

Outline

This paper investigates the effectiveness of introducing an adaptive margin (eMargin) into a contrastive learning framework for time-series representation learning. It asks whether adding an adaptive margin, adjusted according to a predefined similarity threshold, to the standard InfoNCE loss can improve the separation between similar but distinct time steps and thereby enhance downstream performance. Evaluating clustering and classification on three benchmark datasets, the authors find that high scores on unsupervised clustering metrics do not necessarily mean the learned embeddings are meaningful or effective for downstream tasks. Specifically, InfoNCE with eMargin outperforms state-of-the-art baselines on unsupervised clustering metrics but fails to achieve competitive results on downstream classification via linear probing. The source code is publicly available.
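To make the idea concrete, the following is a minimal NumPy sketch of an InfoNCE loss with a threshold-triggered margin. It assumes the margin is applied to negatives whose cosine similarity to the anchor exceeds the threshold, which is only an illustration; the paper's exact eMargin formulation may differ, and the function and parameter names here are invented for the example.

```python
import numpy as np

def info_nce_with_margin(z_anchor, z_pos, z_neg, temperature=0.1,
                         sim_threshold=0.8, margin=0.2):
    """Illustrative InfoNCE loss with a similarity-threshold margin.

    z_anchor: (B, D) anchor embeddings
    z_pos:    (B, D) positive embeddings (e.g., adjacent time steps)
    z_neg:    (B, K, D) negative embeddings

    Negatives whose cosine similarity to the anchor exceeds
    `sim_threshold` get `margin` added to their logit, pushing the
    encoder to separate similar-but-distinct time steps. This is a
    sketch of the general idea, not the paper's exact eMargin rule.
    """
    def cos(a, b):
        a = a / np.linalg.norm(a, axis=-1, keepdims=True)
        b = b / np.linalg.norm(b, axis=-1, keepdims=True)
        return np.sum(a * b, axis=-1)

    pos_logit = cos(z_anchor, z_pos) / temperature           # (B,)
    neg_sims = cos(z_anchor[:, None, :], z_neg)              # (B, K)
    # Adaptive margin: penalize only "hard" negatives above the threshold.
    neg_logits = (neg_sims + margin * (neg_sims > sim_threshold)) / temperature
    logits = np.concatenate([pos_logit[:, None], neg_logits], axis=1)
    # Cross-entropy with the positive at index 0 (log-softmax of column 0).
    log_prob = logits[:, 0] - np.log(np.sum(np.exp(logits), axis=1))
    return -np.mean(log_prob)
```

With margin = 0 and sim_threshold ≥ 1 this reduces to plain InfoNCE, so the margin term can be ablated directly.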

Takeaways, Limitations

Takeaways: Contrastive learning with an adaptive margin (eMargin) performs well on unsupervised clustering metrics, outperforming state-of-the-art baselines.
Limitations: High scores on unsupervised clustering metrics do not translate into downstream performance; models with eMargin were not competitive on downstream classification. This exposes a gap between unsupervised clustering scores and supervised downstream performance.