This is a page that curates AI-related papers published worldwide. All content here is summarized using Google Gemini and operated on a non-profit basis. Copyright for each paper belongs to the authors and their institutions; please make sure to credit the source when sharing.
Cross-Domain Few-Shot Learning with Coalescent Projections and Latent Space Reservation
Created by
Haebom
Author
Naeem Paeedeh, Mahardhika Pratama, Wolfgang Mayer, Jimmy Cao, Ryszard Kowalczyk
Outline
This paper studies cross-domain few-shot learning (CD-FSL) and outperforms state-of-the-art (SOTA) methods by combining a DINO pre-trained model with a prototype classifier. To address the overfitting caused by updating the transformer's many parameters, a limitation of existing methods, the authors propose Coalescent Projection (CP), an effective successor to soft prompts. They also present a novel pseudo-class generation method combined with self-supervised transformations (SSTs), which prepares the network to handle unseen samples from other domains using only the base domain. Comprehensive experiments on the extreme domain-shift scenarios of the BSCD-FSL benchmark demonstrate the effectiveness of the proposed method. The source code is available at https://github.com/Naeem-Paeedeh/CPLSR .
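The two building blocks named above can be illustrated with a minimal sketch. The following code is a hypothetical simplification, not the paper's implementation: it assumes the SSTs are the four 90-degree rotations (a common choice in self-supervised few-shot work), with each rotation of a base class treated as a new pseudo-class, and it implements the prototype classifier as nearest-class-mean over cosine similarity of (e.g., DINO) feature vectors.

```python
import numpy as np

def rotations(x):
    # Self-supervised transformations (SSTs), assumed here to be the
    # four 90-degree rotations of an image.
    return [np.rot90(x, k) for k in range(4)]

def make_pseudo_classes(images, labels):
    # Pseudo-class generation from the base domain alone: base class c
    # under rotation k becomes pseudo-class 4*c + k, quadrupling the
    # label space the network must discriminate.
    xs, ys = [], []
    for img, c in zip(images, labels):
        for k, rotated in enumerate(rotations(img)):
            xs.append(rotated)
            ys.append(4 * c + k)
    return xs, ys

def prototype_classify(support_feats, support_labels, query_feats):
    # Prototype classifier: average the L2-normalized support features
    # per class, then assign each query to the prototype with the
    # highest cosine similarity.
    def norm(v):
        return v / (np.linalg.norm(v, axis=-1, keepdims=True) + 1e-12)
    support_feats = norm(np.asarray(support_feats, dtype=float))
    query_feats = norm(np.asarray(query_feats, dtype=float))
    classes = sorted(set(support_labels))
    protos = np.stack([
        support_feats[[i for i, y in enumerate(support_labels) if y == c]].mean(0)
        for c in classes
    ])
    sims = query_feats @ norm(protos).T          # cosine similarities
    return [classes[i] for i in sims.argmax(1)]  # nearest prototype
```

In a few-shot episode, the support features would come from the frozen (CP-adapted) backbone, so only prototype averaging, and no gradient update, happens at test time.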
Takeaways: Demonstrates the superiority of a CD-FSL approach that combines DINO with a prototype classifier. Presents the novel Coalescent Projection (CP) technique to overcome the limitations of soft prompts. Presents an effective domain adaptation strategy through pseudo-class generation using only the base domain together with self-supervised transformations (SSTs). Verifies strong performance on the BSCD-FSL benchmark.
•
Limitations: Further research is needed on the generalization performance of the proposed method. Robustness should be evaluated across a wider range of domain-shift scenarios. A deeper comparative analysis against other CD-FSL methods is needed.