Universal Semi-supervised Model Adaptation via Collaborative Consistency Training

Type: Article

Publication Date: 2024-01-03

Citations: 0

DOI: https://doi.org/10.1109/wacv57701.2024.00092

Abstract

In this paper, we introduce a realistic and challenging domain adaptation problem called Universal Semi-supervised Model Adaptation (USMA), which i) requires only a pre-trained source model, ii) allows the source and target domains to have different label sets, i.e., they share a common label set while each holds its own private label set, and iii) requires only a few labeled samples per class in the target domain. To address USMA, we propose a collaborative consistency training framework that regularizes the prediction consistency between two models, i.e., a pre-trained source model and its variant pre-trained on target data only, and combines their complementary strengths to learn a more powerful model. The rationale of our framework stems from the observation that the source model performs better than the target-only model on common categories, while the target-only model performs better on target-private categories. We also propose a two-perspective consistency regularization, i.e., sample-wise and class-wise, to improve training. Experimental results demonstrate the effectiveness of our method on several benchmark datasets.
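The abstract describes the method only at a high level. As a rough illustration of the core idea, the sketch below (PyTorch; all function and variable names are hypothetical, not taken from the paper) shows how a prediction-consistency loss between a source-pretrained model and a target-only model might be computed on an unlabeled target batch, with one sample-wise term and one class-wise term. The specific losses, weighting, and training schedule are simplified assumptions, not the authors' exact formulation.

```python
# Minimal sketch of collaborative consistency training, assuming PyTorch.
# All names are hypothetical; the two losses below are plausible
# instantiations of "sample-wise" and "class-wise" consistency,
# not the paper's exact definitions.
import torch
import torch.nn.functional as F


def consistency_losses(logits_src: torch.Tensor, logits_tgt: torch.Tensor):
    """Consistency between the source model's and the target-only model's
    predictions on the same batch of unlabeled target samples."""
    log_p_src = F.log_softmax(logits_src, dim=1)
    log_p_tgt = F.log_softmax(logits_tgt, dim=1)
    p_src, p_tgt = log_p_src.exp(), log_p_tgt.exp()

    # Sample-wise: a symmetric KL term pushes each model to match the
    # other's per-sample predictive distribution.
    sample_wise = 0.5 * (
        F.kl_div(log_p_tgt, p_src, reduction="batchmean")
        + F.kl_div(log_p_src, p_tgt, reduction="batchmean")
    )

    # Class-wise: align the models' average class-probability vectors
    # over the batch, so agreement also holds at the category level.
    class_wise = F.mse_loss(p_src.mean(dim=0), p_tgt.mean(dim=0))
    return sample_wise, class_wise


def training_step(source_model, target_model, x_unlabeled, optimizer, lam=1.0):
    """One adaptation step on an unlabeled target batch. In practice a
    supervised loss on the few labeled target samples would be added."""
    sample_wise, class_wise = consistency_losses(
        source_model(x_unlabeled), target_model(x_unlabeled)
    )
    loss = sample_wise + lam * class_wise
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

In this reading, the symmetric sample-wise term lets the two models exchange their complementary strengths (the source model on common categories, the target-only model on target-private categories), while the class-wise term guards against one model's biases dominating any single category.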

Locations

  • arXiv (Cornell University)
  • ORCA Online Research @Cardiff (Cardiff University)
  • 2024 IEEE/CVF Winter Conference on Applications of Computer Vision (WACV)

Similar Works

  • Universal Semi-supervised Model Adaptation via Collaborative Consistency Training (2023). Zizheng Yan, Yushuang Wu, Yipeng Qin, Xiaoguang Han, Shuguang Cui, Guanbin Li
  • Multi-level Consistency Learning for Semi-supervised Domain Adaptation (2022). Zizheng Yan, Yushuang Wu, Guanbin Li, Yipeng Qin, Xiaoguang Han, Shuguang Cui
  • Towards Realizing the Value of Labeled Target Samples: A Two-Stage Approach for Semi-Supervised Domain Adaptation (2023). Mengqun Jin, Kai Li, Shuyan Li, Chunming He, Xiu Li
  • Pred&Guide: Labeled Target Class Prediction for Guiding Semi-Supervised Domain Adaptation (2022). Megh Manoj Bhalerao, Anurag Singh, Soma Biswas
  • Universal Semi-Supervised Domain Adaptation by Mitigating Common-Class Bias (2024). Wenyu Zhang, Qingmu Liu, Felix Ong Wei Cong, Mohamed Ragab, Chuan-Sheng Foo
  • Semi-supervised Domain Adaptation via Prototype-based Multi-level Learning (2023). X. T. Huang, Chuang Zhu, Wenkai Chen
  • Learning from Different Samples: A Source-free Framework for Semi-supervised Domain Adaptation (2024). X. T. Huang, Chuang Zhu, Bowen Zhang, Shanghang Zhang
  • Source-free Semantic Regularization Learning for Semi-supervised Domain Adaptation (2025). X. T. Huang, Chuang Zhu, Ruiying Ren, Shengjie Liu, Tiejun Huang
  • Confidence Score for Source-Free Unsupervised Domain Adaptation (2022). Jonghyun Lee, Dahuin Jung, Junho Yim, Sungroh Yoon
  • Strong-Weak Integrated Semi-supervision for Unsupervised Single and Multi Target Domain Adaptation (2023). Xiaohu Lu, Hayder Radha
  • Unveiling Class-Labeling Structure for Universal Domain Adaptation (2020). Yueming Yin, Zhen Yang, Xiaofu Wu, Haifeng Hu
  • Incremental Pseudo-Labeling for Black-Box Unsupervised Domain Adaptation (2024). Yawen Zou, Chunzhi Gu, Jun Yu, Shangce Gao, Chao Zhang
  • Gradual Domain Adaptation via Self-Training of Auxiliary Models (2021). Yabin Zhang, Bin Deng, Kui Jia, Lei Zhang
  • Semi-Supervised Domain Adaptation with Source Label Adaptation (2023). Yu-Chu Yu, Hsuan-Tien Lin
  • UMAD: Universal Model Adaptation under Domain and Category Shift (2021). Jian Liang, Dapeng Hu, Jiashi Feng, Ran He
  • Learning Invariant Representation with Consistency and Diversity for Semi-supervised Source Hypothesis Transfer (2021). Xiaodong Wang, Junbao Zhuo, Shuhao Cui, Shuhui Wang

Works That Cite This (0)
