Lightweight transformers for clinical natural language processing

ISARIC Clinical Characterisation Group

Research output: Contribution to journal › Article › peer-review

2 Citations (Scopus)

Abstract

Specialised pre-trained language models are becoming more frequent in Natural Language Processing (NLP) since they can potentially outperform models trained on generic texts. BioBERT (Lee et al., BioBERT: a pre-trained biomedical language representation model for biomedical text mining. Bioinformatics, 2020) and BioClinicalBERT (Alsentzer et al., Publicly available clinical BERT embeddings. In Proceedings of the 2nd Clinical Natural Language Processing Workshop, pp. 72–78, 2019) are two examples of such models that have shown promise in medical NLP tasks. Many of these models are overparametrised and resource-intensive, but thanks to techniques like knowledge distillation, it is possible to create smaller versions that perform almost as well as their larger counterparts. In this work, we specifically focus on the development of compact language models for processing clinical texts (e.g. progress notes, discharge summaries). We developed a number of efficient lightweight clinical transformers using knowledge distillation and continual learning, with the number of parameters ranging from 15 million to 65 million. These models performed comparably to larger models such as BioBERT and BioClinicalBERT and significantly outperformed other compact models trained on general or biomedical data. Our extensive evaluation was done across several standard datasets and covered a wide range of clinical text-mining tasks, including natural language inference, relation extraction, named entity recognition and sequence classification. To our knowledge, this is the first comprehensive study specifically focused on creating efficient and compact transformers for clinical NLP tasks. The models and code used in this study can be found on our Huggingface profile at https://huggingface.co/nlpie and Github page at https://github.com/nlpieresearch/Lightweight-Clinical-Transformers, respectively, promoting reproducibility of our results.
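The compact models described above were trained with knowledge distillation. As a minimal, generic illustration of the soft-target objective typically used in such work (Hinton-style temperature-scaled KL divergence between teacher and student distributions; this is a sketch, not the authors' actual training code, and the temperature value is an assumed example):

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of logits."""
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """KL(teacher || student) on temperature-softened distributions,
    scaled by T^2 so gradients stay comparable across temperatures."""
    p = softmax(teacher_logits, temperature)  # soft teacher targets
    q = softmax(student_logits, temperature)  # student predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return temperature ** 2 * kl

# A student that matches the teacher exactly incurs zero loss;
# diverging predictions incur a positive penalty.
```

In practice this soft-target term is usually combined with the ordinary cross-entropy loss on gold labels, with a weighting hyperparameter balancing the two.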

Original language: English
Pages (from–to): 887–914
Number of pages: 28
Journal: Natural Language Engineering
Volume: 30
Issue: 5
DOI
Status: Published - 2024

Bibliographical note

Publisher Copyright:
© The Author(s), 2024.

Funding

Funders and funder numbers:
Institute for Clinical Research
University of Cape Town
European Clinical Research Alliance on Infectious Diseases
Kementerian Kesihatan Malaysia
Instituto de Salud Carlos III
Institut national de la santé et de la recherche médicale
Artificial Intelligence for Pandemics
Groote Schuur Hospital Covid ICU
Irish Critical Care- Clinical Trials Group
RAEng Research Chair
Common Good
National Institutes of Health
South Eastern Norway Health Authority and the Research Council of Norway
National Institute for Health Research Health Protection Research Unit
Ministero della Salute
University of Queensland
Manipal Hospital Whitefield
Foundation Bevordering Onderzoek Franciscus
University of Liverpool
Manchester Biomedical Research Centre
InnoHK
University of Oxford
Centre for Cerebro-cardiovascular Health Engineering
Foreign, Commonwealth and Development Office
UK Research and Innovation
Bill and Melinda Gates Foundation: OPP1209135
Medical Research Council: MR/W01761X/, MC_PC_19059
Public Health England: 200907
National Institute for Health and Care Research: CO-CIN-01
Wellcome Trust: 220757, 220757/Z/20/Z, 222048/Z/20/Z, 225288, 215091/Z/18/Z, 225288/Z/22/Z, 222410/Z/21/Z, 222410, 205228/Z/16/Z, 215091
Imperial’s Health Protection Research Unit: NIHR201385
Horizon 2020 Framework Programme: 965313, 101003589
Canadian Institutes of Health Research: OV2170359
Seventh Framework Programme: APCOV22BGM
Australian Department of Health: 3273191
Liverpool Experimental Cancer Medicine Centre: C18616/A25153
Norges Forskningsråd: 312780
Firland Foundation: NCT04262921
COCHE: MR/W01761X/
French Ministry of Health: PHRC n20-0424
Innovative Medicines Initiative: 115523
Imperial College London: 200927
Australian Research Council: CE170100009
Ministerio de Ciencia: 303953/2018-7
Health Research Board: CTN-2014-12
HPRU: NIHR200907
Oxford University COVID-19 Research Response: 0009109
NIHR Biomedical Research Centre at Imperial College London: ISBRC-1215-20013
