S-NER: A Concise and Efficient Span-Based Model for Named Entity Recognition.
Keywords
BERT
named entity recognition
span-based model
Journal
Sensors (Basel, Switzerland)
ISSN: 1424-8220
Abbreviated title: Sensors (Basel)
Country: Switzerland
NLM ID: 101204366
Publication information
Publication date: 08 Apr 2022
History:
received: 21 Feb 2022
revised: 30 Mar 2022
accepted: 31 Mar 2022
entrez: 23 Apr 2022
pubmed: 24 Apr 2022
medline: 27 Apr 2022
Status: epublish
Abstract
Named entity recognition (NER) is a task that seeks to recognize entities in raw texts and is a precondition for a series of downstream NLP tasks. Traditionally, prior NER models use the sequence-labeling mechanism, which relies on label dependencies captured by conditional random fields (CRFs). However, these models are prone to cascading label misclassifications: a misclassified label corrupts the label dependency, so subsequent labels may also be misclassified. To address this issue, we propose S-NER, a span-based NER model. Specifically, S-NER first splits raw texts into text spans and regards them as candidate entities; it then directly obtains the types of spans by conducting entity-type classification on span semantic representations, which eliminates the need for label dependencies. Moreover, S-NER has a concise neural architecture: it directly uses BERT as its encoder and a feed-forward network as its decoder. We evaluate S-NER on several benchmark datasets across three domains. Experimental results demonstrate that S-NER consistently outperforms the strongest baselines in terms of F1-score. Extensive analyses further confirm the efficacy of S-NER.
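The span-based mechanism described in the abstract can be illustrated with a minimal sketch: enumerate all candidate spans up to a maximum length, then classify each span's entity type directly, with no label-dependency chain between decisions. The toy lexicon and the `classify_span` function below are stand-ins invented for illustration; S-NER itself derives span representations from BERT and classifies them with a feed-forward network.

```python
# Sketch of span-based NER: enumerate candidate spans, classify each
# independently. The lexicon-based classifier is a hypothetical stand-in
# for S-NER's BERT encoder + feed-forward decoder.
from typing import List, Tuple

def enumerate_spans(tokens: List[str], max_len: int = 4) -> List[Tuple[int, int]]:
    """All (start, end) spans up to max_len tokens, end exclusive."""
    spans = []
    for start in range(len(tokens)):
        for end in range(start + 1, min(start + max_len, len(tokens)) + 1):
            spans.append((start, end))
    return spans

# Toy lexicon standing in for learned span representations (assumption).
LEXICON = {("john",): "PER", ("new", "york"): "LOC"}

def classify_span(tokens: List[str], span: Tuple[int, int]) -> str:
    """Return an entity type for the span, or "O" for non-entities."""
    key = tuple(t.lower() for t in tokens[span[0]:span[1]])
    return LEXICON.get(key, "O")

def predict(tokens: List[str]) -> List[Tuple[Tuple[int, int], str]]:
    """Keep only spans classified as real entity types."""
    return [(s, classify_span(tokens, s))
            for s in enumerate_spans(tokens)
            if classify_span(tokens, s) != "O"]

tokens = "John lives in New York".split()
print(predict(tokens))  # [((0, 1), 'PER'), ((3, 5), 'LOC')]
```

Because each span is classified independently, a wrong decision on one span cannot propagate to the others, which is the failure mode the abstract attributes to CRF-style sequence labeling.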
Identifiers
pubmed: 35458837
pii: s22082852
doi: 10.3390/s22082852
pmc: PMC9030542
Publication types
Journal Article
Languages
eng
Citation subsets
IM
Grants
Agency: National Key Research and Development Program
ID: 2018YFB1004502