(1) In the encoding step, CLCDR aims to model the user and item representations of the source and target domains respectively with a newly proposed contrastive loss. In this way, the interactions between users and items can be represented by distances in the latent space.

The within- and cross-domain graph contrastive learning is carried out by optimizing an objective function that combines the source-classifier and target-classifier losses, a domain-specific contrastive loss, and a cross-domain contrastive loss. As a result, feature learning from graphs is facilitated by knowledge transferred between graphs.
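The idea that user-item interactions become distances in a shared latent space can be illustrated with a generic InfoNCE-style contrastive loss. This is a minimal sketch, not the exact CLCDR objective; the function and variable names are illustrative assumptions.

```python
import numpy as np

def info_nce(user_emb, item_emb, temperature=0.1):
    """Generic InfoNCE-style contrastive loss (sketch, not the CLCDR loss).

    Row i of user_emb is assumed to have interacted with row i of
    item_emb (the positive pair); every other item in the batch serves
    as a negative. Minimizing this pulls interacting pairs together
    and pushes non-interacting pairs apart in the latent space.
    """
    # Normalize so that dot products equal cosine similarities.
    u = user_emb / np.linalg.norm(user_emb, axis=1, keepdims=True)
    v = item_emb / np.linalg.norm(item_emb, axis=1, keepdims=True)
    logits = u @ v.T / temperature            # (batch, batch) similarity matrix
    # Numerically stable log-softmax over items; diagonal = positive pairs.
    logits -= logits.max(axis=1, keepdims=True)
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))

rng = np.random.default_rng(0)
users = rng.normal(size=(8, 16))
# Items close to "their" user should yield a lower loss than random items.
aligned_loss = info_nce(users, users + 0.05 * rng.normal(size=(8, 16)))
random_loss = info_nce(users, rng.normal(size=(8, 16)))
print(aligned_loss < random_loss)
```

In a cross-domain setting, one such loss per domain (plus cross-domain terms, as the second snippet above describes) would be summed into a single training objective.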
Disentangled Contrastive Learning for Cross-Domain …
In this paper, we propose a Contrastive Zero-Shot Learning with Adversarial Attack (CZSL-Adv) method for cross-domain slot filling. The contrastive loss aims to map slot value contextual …

Recently, cross-domain named entity recognition (cross-domain NER), which can reduce the high data-annotation costs faced by fully supervised methods, has drawn attention. Most competitive approaches rely mainly on pre-trained language models such as BERT to represent...
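A common pattern behind contrastive zero-shot slot filling is to score each token representation against slot-description embeddings by similarity, so that unseen slot types can still be scored at test time. The sketch below illustrates only that general matching step; the names, shapes, and scoring rule are assumptions, not the CZSL-Adv method itself.

```python
import numpy as np

def zero_shot_slot_scores(token_reprs, slot_desc_reprs):
    """Sketch of similarity-based zero-shot slot scoring (illustrative only).

    token_reprs:     (n_tokens, d) contextual token embeddings.
    slot_desc_reprs: (n_slots, d) embeddings of slot-type descriptions.
    Returns a (n_tokens, n_slots) cosine-similarity matrix; a contrastive
    loss would push each token toward its gold slot description.
    """
    t = token_reprs / np.linalg.norm(token_reprs, axis=1, keepdims=True)
    s = slot_desc_reprs / np.linalg.norm(slot_desc_reprs, axis=1, keepdims=True)
    return t @ s.T

# Toy demo: each token should match its nearest slot description.
toks = np.array([[1.0, 0.0], [0.0, 1.0]])
slots = np.array([[0.9, 0.1], [0.1, 0.9]])
scores = zero_shot_slot_scores(toks, slots)
print(scores.argmax(axis=1))
```

Because the slot descriptions are just embedded text, new slot types from an unseen domain can be added at inference time without retraining the token encoder.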
We address both challenges by introducing: 1) a new cluster-wise contrastive learning mechanism to help extract class-semantic-aware features, and 2) a novel distance-of-distance loss to effectively measure and minimize the domain discrepancy without any external supervision.

"Cross-Domain Graph Anomaly Detection via Anomaly-aware Contrastive Alignment." arXiv preprint arXiv:2212.01096 (2022). To appear in Proceedings of AAAI 2023.

In this work, we build upon contrastive self-supervised learning to align features so as to reduce the domain discrepancy between training and testing sets. Exploring the same set of...
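One plausible reading of a "distance-of-distance" objective is to compare the pairwise-distance structure of a source batch with that of a target batch, penalizing how much the two distance matrices differ. The sketch below is an assumption-laden illustration of that reading, not the published loss.

```python
import numpy as np

def pairwise_dists(x):
    """Euclidean distances between all rows of x: (n, d) -> (n, n)."""
    diff = x[:, None, :] - x[None, :, :]
    return np.sqrt((diff ** 2).sum(-1))

def distance_of_distance(src, tgt):
    """Sketch of a distance-of-distance style discrepancy (illustrative).

    Compares the pairwise-distance matrices of equally sized source and
    target batches; a small value means the two domains share similar
    internal geometry, with no labels or external supervision required.
    """
    return np.abs(pairwise_dists(src) - pairwise_dists(tgt)).mean()

rng = np.random.default_rng(0)
src = rng.normal(size=(6, 4))
same = distance_of_distance(src, src)        # identical geometry -> 0
scaled = distance_of_distance(src, 3.0 * src)  # stretched geometry -> > 0
print(same, scaled)
```

Minimizing such a term during training would align the domains' feature geometry directly, complementing the cluster-wise contrastive loss that shapes class-aware structure within each domain.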