MUSCLE: Strengthening Semi-Supervised Learning Via Concurrent Unsupervised Learning Using Mutual Information Maximization. Xie, H., Hussein, M. E., Galstyan, A., & AbdAlmageed, W. IEEE Winter Conference on Applications of Computer Vision (WACV), January 2021.
Deep neural networks are powerful, massively parameterized machine learning models that have been shown to perform well in supervised learning tasks. However, very large amounts of labeled data are usually needed to train deep neural networks. Several semi-supervised learning approaches have been proposed to train neural networks using smaller amounts of labeled data with a large amount of unlabeled data. The performance of these semi-supervised methods significantly degrades as the size of labeled data decreases. We introduce Mutual-information-based Unsupervised & Semi-supervised Concurrent LEarning (MUSCLE), a hybrid learning approach that uses mutual information to combine both unsupervised and semi-supervised learning. MUSCLE can be used as a stand-alone training scheme for neural networks, and can also be incorporated into other learning approaches. We show that the proposed hybrid model outperforms the state of the art on several standard benchmarks, including CIFAR-10, CIFAR-100, and Mini-Imagenet. Furthermore, the performance gain consistently increases with the reduction in the amount of labeled data, as well as in the presence of bias. We also show that MUSCLE has the potential to boost the classification performance when used in the fine-tuning phase for a model pre-trained only on unlabeled data.
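As a rough illustration of the kind of hybrid objective the abstract describes, the sketch below combines a supervised cross-entropy term on a labeled batch with a mutual-information-maximization term on two augmented views of an unlabeled batch, optimized concurrently. This is a minimal sketch under assumed details, not the authors' implementation: the IIC-style mutual-information estimator, the weight lam, and the function names are illustrative assumptions.

import torch.nn.functional as F

def mutual_info_loss(p_a, p_b, eps=1e-8):
    # Negative mutual information between the soft class assignments of two views.
    # p_a, p_b: (N, C) softmax outputs for two augmentations of the same unlabeled batch.
    joint = (p_a.unsqueeze(2) * p_b.unsqueeze(1)).mean(dim=0)  # (C, C) empirical joint
    joint = ((joint + joint.t()) / 2).clamp(min=eps)           # symmetrize, avoid log(0)
    marg_a = joint.sum(dim=1, keepdim=True)                    # (C, 1) marginal of view A
    marg_b = joint.sum(dim=0, keepdim=True)                    # (1, C) marginal of view B
    mi = (joint * (joint.log() - marg_a.log() - marg_b.log())).sum()
    return -mi                                                 # minimizing this maximizes MI

def hybrid_loss(model, x_lab, y_lab, x_unlab_a, x_unlab_b, lam=1.0):
    # Supervised cross-entropy on labeled data plus the unsupervised MI term on
    # unlabeled data, trained concurrently in a single objective (lam is assumed).
    sup = F.cross_entropy(model(x_lab), y_lab)
    p_a = F.softmax(model(x_unlab_a), dim=1)
    p_b = F.softmax(model(x_unlab_b), dim=1)
    return sup + lam * mutual_info_loss(p_a, p_b)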
@inproceedings{xie_muscle_2021,
	title = {{MUSCLE}: {Strengthening} {Semi}-{Supervised} {Learning} {Via} {Concurrent} {Unsupervised} {Learning} {Using} {Mutual} {Information} {Maximization}},
	shorttitle = {{MUSCLE}},
	url = {http://arxiv.org/abs/2012.00150},
	abstract = {Deep neural networks are powerful, massively parameterized machine learning models that have been shown to perform well in supervised learning tasks. However, very large amounts of labeled data are usually needed to train deep neural networks. Several semi-supervised learning approaches have been proposed to train neural networks using smaller amounts of labeled data with a large amount of unlabeled data. The performance of these semi-supervised methods significantly degrades as the size of labeled data decreases. We introduce Mutual-information-based Unsupervised \& Semi-supervised Concurrent LEarning (MUSCLE), a hybrid learning approach that uses mutual information to combine both unsupervised and semi-supervised learning. MUSCLE can be used as a stand-alone training scheme for neural networks, and can also be incorporated into other learning approaches. We show that the proposed hybrid model outperforms the state of the art on several standard benchmarks, including CIFAR-10, CIFAR-100, and Mini-Imagenet. Furthermore, the performance gain consistently increases with the reduction in the amount of labeled data, as well as in the presence of bias. We also show that MUSCLE has the potential to boost the classification performance when used in the fine-tuning phase for a model pre-trained only on unlabeled data.},
	booktitle = {IEEE Winter Conference on Applications of Computer Vision ({WACV})},
	author = {Xie, Hanchen and Hussein, Mohamed E. and Galstyan, Aram and AbdAlmageed, Wael},
	month = "Jan.",
	year = {2021},
	keywords="conference"
}
