
Medical Named Entity Recognition Based on Multi Feature Fusion of BERT

Published: 27 December 2021

Abstract

Traditional word vectors struggle to capture contextual semantics, and traditional models rely on a single feature-extraction strategy. To address both problems, a multi-feature fusion model for Named Entity Recognition, BERT-BiLSTM-IDCNN-Attention-CRF, is proposed. It uses BERT to model the contextual semantic relationships of word vectors and fuses the context features extracted by BiLSTM with the local features extracted by IDCNN. The proposed model is tested on the Chinese Electronic Medical Record (EMR) dataset issued by the China Conference on Knowledge Graph and Semantic Computing 2020 (CCKS2020). Compared with baseline models such as BiLSTM-CRF, BERT-BiLSTM-IDCNN-Attention-CRF improves F1 by 1.27% on the CCKS2020 data. The experimental results show that the proposed model better identifies medical entities in EMRs.
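The fusion pipeline described above can be sketched as a small PyTorch module. This is a minimal illustration, not the authors' implementation: the hidden size, dilation schedule, attention-head count, and tag count are hypothetical, and the CRF decoding layer is omitted (the final linear layer produces the emission scores a CRF would consume).

```python
import torch
import torch.nn as nn

class FusionNER(nn.Module):
    """Sketch of BERT-BiLSTM-IDCNN-Attention fusion; CRF layer omitted."""

    def __init__(self, emb_dim=768, hidden=128, num_tags=9):
        super().__init__()
        # Context features: bidirectional LSTM over BERT embeddings.
        self.bilstm = nn.LSTM(emb_dim, hidden, batch_first=True,
                              bidirectional=True)
        # Local features: stacked dilated 1-D convolutions (IDCNN-style);
        # padding chosen so sequence length is preserved.
        self.idcnn = nn.Sequential(
            nn.Conv1d(emb_dim, 2 * hidden, 3, padding=1, dilation=1),
            nn.ReLU(),
            nn.Conv1d(2 * hidden, 2 * hidden, 3, padding=2, dilation=2),
            nn.ReLU(),
        )
        # Self-attention over the concatenated (fused) features.
        self.attn = nn.MultiheadAttention(4 * hidden, num_heads=4,
                                          batch_first=True)
        # Emission scores per tag; a CRF would decode these.
        self.emit = nn.Linear(4 * hidden, num_tags)

    def forward(self, bert_emb):
        # bert_emb: (batch, seq_len, emb_dim) contextual embeddings from BERT
        ctx, _ = self.bilstm(bert_emb)                 # (B, T, 2*hidden)
        loc = self.idcnn(bert_emb.transpose(1, 2))     # (B, 2*hidden, T)
        loc = loc.transpose(1, 2)                      # (B, T, 2*hidden)
        fused = torch.cat([ctx, loc], dim=-1)          # (B, T, 4*hidden)
        fused, _ = self.attn(fused, fused, fused)      # self-attention
        return self.emit(fused)                        # (B, T, num_tags)

model = FusionNER()
scores = model(torch.randn(2, 5, 768))  # 2 sentences, 5 tokens each
```

In this sketch the BiLSTM and IDCNN branches run in parallel on the same BERT embeddings and are fused by concatenation before attention, which matches the paper's description of combining context features with local features.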


Cited By

  • (2023) MFF-CNER: A Multi-feature Fusion Model for Chinese Named Entity Recognition in Finance Securities. Academic Journal of Science and Technology, 7(3), 40-49. DOI: 10.54097/ajst.v7i3.12715. Online publication date: 27 Oct 2023.
  • (2023) End-to-End Transformer-Based Models in Textual-Based NLP. AI, 4(1), 54-110. DOI: 10.3390/ai4010004. Online publication date: 5 Jan 2023.
  • (2023) A Method for Extracting Electronic Medical Record Entities by Fusing Multichannel Self-Attention Mechanism with Location Relationship Features. Data Science, 13-30. DOI: 10.1007/978-981-99-5971-6_2. Online publication date: 15 Sep 2023.
  • (2022) Research on Dual-channel Text Feature Extraction Method Based on Neural Network. 2022 4th International Conference on Artificial Intelligence and Advanced Manufacturing (AIAM), 405-410. DOI: 10.1109/AIAM57466.2022.00083. Online publication date: Oct 2022.


      Published In

      ICBDT '21: Proceedings of the 4th International Conference on Big Data Technologies
      September 2021
      189 pages
      ISBN:9781450385091
      DOI:10.1145/3490322
      Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

      Publisher

      Association for Computing Machinery

      New York, NY, United States



      Author Tags

      1. BERT
      2. BiLSTM
      3. CCKS2020
      4. IDCNN
      5. Named Entity Recognition
      6. multi feature fusion

      Qualifiers

      • Research-article
      • Research
      • Refereed limited

      Conference

      ICBDT 2021

