DOI: 10.1007/978-3-030-67832-6_28
Article

Few-Shot Learning with Unlabeled Outlier Exposure

Published: 22 June 2021

Abstract

Few-shot learning aims to train a classifier that can recognize a new class from only a few examples, as humans do. Recently, some works have leveraged auxiliary information in few-shot learning, such as textual data or unlabeled visual data. But these are positive data, closely related to the training data. In contrast to such experimental settings, people can also draw on negative data to better recognize a new class. Inspired by this, we expose a few unlabeled outliers in each few-shot learning task to assist the learning of the classifier. To the best of our knowledge, we are the first to propose utilizing outliers to improve few-shot learning. We propose a novel method based on meta-learning paradigms to exploit unlabeled outliers: we use them not only to optimize the meta-embedding network but also to adaptively enhance the class prototypes. Experiments show that our outlier exposure network improves few-shot learning performance with only a few exposed unlabeled outliers.
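The prototype-enhancement idea described above can be illustrated with a small sketch in the style of prototypical networks: class prototypes are the mean support embeddings, and each prototype is then nudged away from the unlabeled outlier embeddings. The push-away rule and the `alpha` weight below are illustrative assumptions, not the paper's exact adaptive mechanism, which the abstract does not specify.

```python
import numpy as np

def prototypes_with_outlier_exposure(support, labels, outliers, alpha=0.1):
    """Compute class prototypes (mean support embeddings, as in
    prototypical networks), then shift each prototype away from the
    mean of the unlabeled outlier embeddings.

    support:  (n, d) array of support-set embeddings
    labels:   length-n list of class labels
    outliers: (m, d) array of unlabeled outlier embeddings
    alpha:    illustrative push-away strength (hypothetical parameter)
    """
    labels = np.asarray(labels)
    out_mean = outliers.mean(axis=0)                 # summary of the outlier set
    protos = []
    for c in sorted(set(labels.tolist())):
        p = support[labels == c].mean(axis=0)        # vanilla class prototype
        p = p + alpha * (p - out_mean)               # push away from outliers
        protos.append(p)
    return np.stack(protos)

def classify(query, protos):
    # nearest-prototype rule with squared Euclidean distance
    d = ((protos - query) ** 2).sum(axis=1)
    return int(d.argmin())
```

In a 5-way 1-shot episode this would run once per task, so the per-episode overhead over plain prototypical networks is just one mean over the exposed outliers and one vector update per class.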



Information

Published In

MultiMedia Modeling: 27th International Conference, MMM 2021, Prague, Czech Republic, June 22–24, 2021, Proceedings, Part I
Jun 2021, 756 pages
ISBN: 978-3-030-67831-9
DOI: 10.1007/978-3-030-67832-6

Publisher

Springer-Verlag, Berlin, Heidelberg

      Author Tags

      1. Few-shot learning
      2. Outlier exposure
      3. Meta-learning
