From 4173bfa167cd31199a1b19565a61513227023945 Mon Sep 17 00:00:00 2001
From: Yue Duan <102155262+NJUyued@users.noreply.github.com>
Date: Wed, 21 Dec 2022 19:41:30 +0800
Subject: [PATCH] Update README.md

---
 README.md | 6 +++++-
 1 file changed, 5 insertions(+), 1 deletion(-)

diff --git a/README.md b/README.md
index a16483f..2061618 100644
--- a/README.md
+++ b/README.md
@@ -4,13 +4,17 @@ This repo is the official Pytorch implementation of our paper:
 > ***MutexMatch: Semi-supervised Learning with Mutex-based Consistency Regularization***
 Authors: Yue Duan, Lei Qi, Lei Wang, Luping Zhou and Yinghuan Shi.
 
-[[arXiv](https://arxiv.org/abs/2203.14316) | [Paper](https://ieeexplore.ieee.org/document/9992211) | [code](https://github.com/NJUyued/MutexMatch4SSL/archive/refs/heads/master.zip)]
+[[arXiv](https://arxiv.org/abs/2203.14316) | [Published paper](https://ieeexplore.ieee.org/document/9992211) | [Code download](https://github.com/NJUyued/MutexMatch4SSL/archive/refs/heads/master.zip)]
 
 - Latest news:
   - Our paper is accepted by IEEE Transactions on Neural Networks and Learning Systems (TNNLS) 📕📕.
 - Related works:
   - 🆕 Interested in robust SSL with mismatched distributions or more applications of complementary label in SSL? Check out our ECCV'22 paper **RDA**. [[arXiv](https://arxiv.org/abs/2208.04619) | [Repo](https://github.com/NJUyued/RDA4RobustSSL)]
 
+## Introduction
+
+The core issue in semi-supervised learning (SSL) is how to effectively leverage unlabeled data, yet most existing methods place great emphasis on high-confidence samples and seldom fully explore *low-confidence samples*. In this article, we utilize low-confidence samples in a novel way with our proposed mutex-based consistency regularization, namely **MutexMatch**. Specifically, high-confidence samples are required to predict exactly *"what it is"* by the conventional true-positive classifier (TPC), while low-confidence samples only need to achieve a simpler goal: predicting *"what it is not"* by the true-negative classifier (TNC). In this way, we not only mitigate pseudo-labeling errors but also make full use of the low-confidence unlabeled data through the consistency of dissimilarity degree.
+
 ## Requirements
 - matplotlib==3.3.2
 - numpy==1.19.2
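
The new Introduction describes the two-classifier idea only in prose. The PyTorch sketch below illustrates one way a mutex-style unlabeled loss could look; it is a simplified illustration and not the repository's actual implementation. The function name, the confidence threshold, the top-k negative-target selection, and the exact loss forms are assumptions made for clarity.

```python
# Illustrative sketch only (hypothetical names, not the repo's API): split
# unlabeled samples by the confidence of the weakly augmented view, train the
# true-positive classifier (TPC) on confident pseudo-labels ("what it is") and
# the true-negative classifier (TNC) on unconfident samples ("what it is not").
import torch
import torch.nn.functional as F


def mutex_consistency_loss(tpc_logits_w, tpc_logits_s, tnc_logits_s,
                           threshold=0.95, k=5):
    probs_w = torch.softmax(tpc_logits_w.detach(), dim=-1)   # weak-view TPC probs
    conf, pseudo_label = probs_w.max(dim=-1)
    high_mask = (conf >= threshold).float()   # confident -> positive learning
    low_mask = 1.0 - high_mask                # unconfident -> negative learning

    # TPC on high-confidence samples: standard pseudo-label cross-entropy
    # between the weak-view pseudo-label and the strong-view prediction.
    loss_tpc = (F.cross_entropy(tpc_logits_s, pseudo_label, reduction="none")
                * high_mask).mean()

    # TNC on low-confidence samples: raise the probability that the sample does
    # NOT belong to the k classes the weak view ranks least likely (an assumed
    # choice of negative targets for this sketch).
    neg_targets = probs_w.topk(k, dim=-1, largest=False).indices     # [B, k]
    not_probs = 1.0 - torch.softmax(tnc_logits_s, dim=-1)            # [B, C]
    loss_tnc = (-torch.log(not_probs + 1e-7).gather(1, neg_targets).mean(dim=-1)
                * low_mask).mean()

    return loss_tpc + loss_tnc


if __name__ == "__main__":
    # Toy check with random logits: 8 unlabeled samples, 10 classes.
    B, C = 8, 10
    loss = mutex_consistency_loss(torch.randn(B, C), torch.randn(B, C),
                                  torch.randn(B, C))
    print(loss.item())
```

The sketch only conveys the division of labor between the two classifiers; how the dissimilarity targets are chosen and how consistency is enforced between the weak and strong views follow the paper and the training code in this repo.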