E2EMERN
The source code for the ACL 2021 paper:
[Title] An End-to-End Progressive Multi-Task Learning Framework for Medical Named Entity Recognition and Normalization
[Authors] Baohang Zhou, Xiangrui Cai, Ying Zhang, Xiaojie Yuan
Preparation
- Clone this repo to your local machine.
- Install Python 3.6.5.
- Download the pre-trained BioBERT models from this link. We use BioBERT-Large in our experiments.
- Open a shell (or cmd on Windows) in the repo folder and run the following command to install the necessary packages; a consolidated sketch of the whole setup follows below.
pip install -r requirements.txt
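For reference, the whole preparation sequence might look like the sketch below. The clone URL is not given in this README, so <repo-url> is a placeholder, and the BioBERT unpacking step assumes the ./biobert_large folder that is later passed to main.py via --bert_path.
# Minimal setup sketch; <repo-url> is a placeholder, not an actual link.
git clone <repo-url> E2EMERN
cd E2EMERN
# Python 3.6.5 is assumed to be the active interpreter here.
pip install -r requirements.txt
# Unpack the downloaded BioBERT-Large model into ./biobert_large,
# the path expected by main.py.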
Experiments
- For Linux systems, shell scripts are provided to run the training procedures; a sketch of what they presumably contain is shown after the command. Run one of the following:
./train.ncbi.sh or ./train.bc5cdr.sh
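These scripts are presumably thin wrappers around main.py with the dataset fixed; a sketch of train.ncbi.sh under that assumption, reusing the flags from the full command shown below:
#!/bin/bash
# Hypothetical contents of train.ncbi.sh: a thin wrapper around main.py
# with the NCBI-disease settings; the flags mirror the command given
# later in this README.
python main.py \
    --seed 11 \
    --epoch 12 \
    --LAMBDA 0.125 \
    --MU 0.1 \
    --dataset ncbi \
    --bert_path ./biobert_large \
    --save_pred_result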
- You can also train the model with the command below. Square brackets mark parameters that take one of several choices; the meaning of each parameter is given in the following table.
Parameter | Type | Description
---|---|---
seed | int | Random seed
epoch | int | Number of training epochs
LAMBDA | float | Hyper-parameter in the loss function
MU | float | Hyper-parameter in the loss function
dataset | str | Dataset to train on: ncbi or cdr
bert_path | str | Folder path of the pre-trained BERT model
save_pred_result | bool | Whether to save the prediction results
python main.py \
--seed 11 \
--epoch 12 \
--LAMBDA 0.125 \
--MU 0.1 \
--dataset [ncbi, cdr] \
--bert_path ./biobert_large \
--save_pred_result \
- After training, the test results are saved in the "results" folder and the model weights in the "weights" folder.
- We also provide trained model weights so you can reproduce the results in our paper. Download the weights file (extraction code: 1234), put it into the "weights" folder, and run one of the following commands (a sketch of what these scripts presumably contain follows):
./eval.ncbi.sh or ./eval.bc5cdr.sh
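The evaluation flags of main.py are not documented in this README, so the sketch below is hypothetical: --load_weights stands in for whatever switch the script actually uses to restore a checkpoint, and the checkpoint path is a placeholder.
#!/bin/bash
# Hypothetical contents of eval.ncbi.sh; --load_weights and the
# checkpoint path are placeholders, not documented flags.
python main.py \
    --dataset ncbi \
    --bert_path ./biobert_large \
    --load_weights ./weights/<ncbi-checkpoint> \
    --save_pred_result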
BibTeX:
@inproceedings{zhou-etal-2021-end,
title = "An End-to-End Progressive Multi-Task Learning Framework for Medical Named Entity Recognition and Normalization",
author = "Zhou, Baohang and
Cai, Xiangrui and
Zhang, Ying and
Yuan, Xiaojie",
booktitle = "Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)",
month = aug,
year = "2021",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2021.acl-long.485",
pages = "6214--6224",
}