Abstract
Oracle character recognition has recently made significant progress with the success of deep neural networks (DNNs), but it is far from solved. Most existing works ignore the long-tailed distribution of oracle character data, which biases the trained DNN toward head classes. To overcome this issue, we propose a two-stage decoupled learning method that trains an unbiased DNN model for long-tailed oracle character recognition. In the first stage, we optimize the DNN under instance-balanced sampling, obtaining a robust backbone but a biased classifier. In the second stage, we refine the classifier under class-balanced sampling with two strategies. Specifically, we add a learnable weight-scaling module that adjusts the classifier in favor of tail classes; meanwhile, we integrate a KL-divergence loss that preserves attention to head classes by distilling knowledge from the first-stage model. Coupling these two designs enables us to train an unbiased DNN model for oracle character recognition. Our method achieves new state-of-the-art performance on three benchmark datasets: OBC306, Oracle-AYNU, and Oracle-20K.
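To make the two-stage recipe concrete, the following PyTorch sketch illustrates the stage-2 ingredients described above: a learnable weight-scaling module applied to the classifier logits, and a KL-divergence distillation term against the frozen stage-1 model. All names and hyper-parameters here (`WeightScaling`, `alpha`, `tau`) are illustrative assumptions, not the authors' released code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class WeightScaling(nn.Module):
    """Learnable per-class scaling of classifier logits (stage 2)."""

    def __init__(self, num_classes: int):
        super().__init__()
        self.scale = nn.Parameter(torch.ones(num_classes))

    def forward(self, logits: torch.Tensor) -> torch.Tensor:
        # Rescale each class's logit; learned under class-balanced
        # sampling, so tail classes can be up-weighted.
        return logits * self.scale


def stage2_loss(student_logits, teacher_logits, targets,
                alpha: float = 0.5, tau: float = 2.0):
    """Cross-entropy on class-balanced batches plus KL distillation
    from the frozen stage-1 model (preserves head-class knowledge).
    `alpha` and `tau` are placeholder hyper-parameters."""
    ce = F.cross_entropy(student_logits, targets)
    kl = F.kl_div(
        F.log_softmax(student_logits / tau, dim=1),
        F.softmax(teacher_logits / tau, dim=1),
        reduction="batchmean",
    ) * tau ** 2
    return (1.0 - alpha) * ce + alpha * kl
```

In this reading of the method, the stage-1 backbone is reused in stage 2 (frozen or lightly tuned, as is common in decoupled schemes), and only the classifier and the scaling parameters are updated on class-balanced batches.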
Notes
- 1.
We divide the oracle data into three categories: classes with many samples are head classes, classes with few samples are tail classes, and the remaining classes are medium classes, as described in Sect. 4.3. An illustrative split by sample count is sketched below.
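The split can be computed directly from per-class sample counts. The thresholds below (100 and 10) are placeholders; the paper's actual cut-offs are those given in Sect. 4.3.

```python
from collections import Counter


def split_classes(labels, head_min=100, tail_max=10):
    """Partition class IDs into head/medium/tail sets by sample count."""
    counts = Counter(labels)
    head = {c for c, n in counts.items() if n >= head_min}
    tail = {c for c, n in counts.items() if n <= tail_max}
    medium = set(counts) - head - tail
    return head, medium, tail
```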
Acknowledgements
This research was funded by the National Natural Science Foundation of China (NSFC) under Grant No. 62276258, the Jiangsu Science and Technology Programme (Natural Science Foundation of Jiangsu Province) under Grant No. BE2020006-4, and Xi’an Jiaotong-Liverpool University’s Key Program Special Fund under Grant No. KSF-T-06.
Copyright information
© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG
About this paper
Cite this paper
Li, J., Dong, B., Wang, QF., Ding, L., Zhang, R., Huang, K. (2023). Decoupled Learning for Long-Tailed Oracle Character Recognition. In: Fink, G.A., Jain, R., Kise, K., Zanibbi, R. (eds) Document Analysis and Recognition - ICDAR 2023. ICDAR 2023. Lecture Notes in Computer Science, vol 14190. Springer, Cham. https://doi.org/10.1007/978-3-031-41685-9_11
DOI: https://doi.org/10.1007/978-3-031-41685-9_11
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-41684-2
Online ISBN: 978-3-031-41685-9
eBook Packages: Computer Science, Computer Science (R0)