Abstract
Specialised pre-trained language models are becoming more frequent in Natural Language Processing (NLP) since they can potentially outperform models trained on generic texts. BioBERT (Lee et al., BioBERT: a pre-trained biomedical language representation model for biomedical text mining. Bioinformatics, 36(4), 1234–1240, 2020) and BioClinicalBERT (Alsentzer et al., Publicly available clinical BERT embeddings. In Proceedings of the 2nd Clinical Natural Language Processing Workshop, pp. 72–78, 2019) are two examples of such models that have shown promise in medical NLP tasks. Many of these models are overparametrised and resource-intensive, but thanks to techniques like knowledge distillation, it is possible to create smaller versions that perform almost as well as their larger counterparts. In this work, we specifically focus on the development of compact language models for processing clinical texts (e.g., progress notes and discharge summaries). We developed a number of efficient lightweight clinical transformers using knowledge distillation and continual learning, with the number of parameters ranging from 15 million to 65 million. These models performed comparably to larger models such as BioBERT and BioClinicalBERT and significantly outperformed other compact models trained on general or biomedical data. Our extensive evaluation was done across several standard datasets and covered a wide range of clinical text-mining tasks, including natural language inference, relation extraction, named entity recognition and sequence classification. To our knowledge, this is the first comprehensive study specifically focused on creating efficient and compact transformers for clinical NLP tasks. The models and code used in this study can be found on our Hugging Face profile at https://huggingface.co/nlpie and GitHub page at https://github.com/nlpieresearch/Lightweight-Clinical-Transformers, respectively, promoting reproducibility of our results.
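The knowledge distillation the abstract refers to trains a small student model to match the temperature-softened output distribution of a large teacher. The authors' actual training code is in the linked repository; the following is only an illustrative sketch of the standard temperature-scaled distillation objective (Hinton et al.'s formulation), with hypothetical function names, written in pure Python for clarity.

```python
import math


def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax; higher temperature flattens the distribution."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]


def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """KL divergence from the softened teacher distribution to the softened
    student distribution, scaled by T^2 so gradients keep a comparable
    magnitude across temperatures."""
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * (math.log(pi) - math.log(qi)) for pi, qi in zip(p, q))
    return temperature ** 2 * kl
```

In practice this soft-label loss is combined with the ordinary cross-entropy on gold labels (and, in DistilBERT-style setups, a hidden-state alignment loss); the weighting between the terms is a tuning choice, not something fixed by the method.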
Original language | English |
---|---|
Pages (from-to) | 887-914 |
Number of pages | 28 |
Journal | Natural Language Engineering |
Volume | 30 |
Issue | 5 |
DOI | |
Status | Published - 2024 |
Bibliographical note
Publisher Copyright: © The Author(s), 2024.
Funding
Funders | Funder number |
---|---|
Institute for Clinical Research | |
University of Cape Town | |
European Clinical Research Alliance on Infectious Diseases | |
Kementerian Kesihatan Malaysia | |
Instituto de Salud Carlos III | |
Institut national de la santé et de la recherche médicale | |
Artificial Intelligence for Pandemics | |
Groote Schuur Hospital Covid ICU | |
Irish Critical Care- Clinical Trials Group | |
RAEng Research Chair | |
Common Good | |
National Institutes of Health | |
South Eastern Norway Health Authority and the Research Council of Norway | |
National Institute for Health Research Health Protection Research Unit | |
Ministero della Salute | |
University of Queensland | |
Manipal Hospital Whitefield | |
Foundation Bevordering Onderzoek Franciscus | |
University of Liverpool | |
Manchester Biomedical Research Centre | |
InnoHK | |
University of Oxford | |
Centre for Cerebro-cardiovascular Health Engineering | |
Foreign, Commonwealth and Development Office | |
UK Research and Innovation | |
Bill and Melinda Gates Foundation | OPP1209135 |
Medical Research Council | MR/W01761X/, MC_PC_19059 |
Public Health England | 200907 |
National Institute for Health and Care Research | CO-CIN-01 |
Wellcome Trust | 220757, 220757/Z/20/Z, 222048/Z/20/Z, 225288, 215091/Z/18/Z, 225288/Z/22/Z, 222410/Z/21/Z, 222410, 205228/Z/16/Z, 215091 |
Imperial’s Health Protection Research Unit | NIHR201385 |
Horizon 2020 Framework Programme | 965313, 101003589 |
Canadian Institutes of Health Research | OV2170359 |
Seventh Framework Programme | APCOV22BGM |
Australian Department of Health | 3273191 |
Liverpool Experimental Cancer Medicine Centre | C18616/A25153 |
Norges Forskningsråd | 312780 |
Firland Foundation | NCT04262921 |
COCHE | MR/W01761X/ |
French Ministry of Health | PHRC n20-0424 |
Innovative Medicines Initiative | 115523 |
Imperial College London | 200927 |
Australian Research Council | CE170100009 |
Ministerio de Ciencia | 303953/2018-7 |
Health Research Board | CTN-2014-12 |
HPRU | NIHR200907 |
Oxford University COVID-19 Research Response | 0009109 |
NIHR Biomedical Research Centre at Imperial College London | ISBRC-1215-20013 |