Bridging knowledge distillation gap for few-sample unsupervised semantic segmentation
Information

Publisher: Elsevier Science Inc., United States
Qualifier: Research-article