SciFive
Hugging Face models do not work
Hi, thanks for your great contribution to the biomedical domain. I tried all of the models in the Hugging Face format, but I couldn't replicate any of the results or even get a reasonable output. Is there something wrong with the code or the models, or is something missing?
I ran the following code:
import torch
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

tokenizer = AutoTokenizer.from_pretrained("razent/SciFive-base-PMC")
model = AutoModelForSeq2SeqLM.from_pretrained("razent/SciFive-base-PMC")
model.to(device)

# Build the seq2seq NER prompt: task prefix + input sentence + explicit end-of-sequence token
sentence = "Identification of APC2 , a homologue of the adenomatous polyposis coli tumour suppressor ."
text = "ncbi_ner: " + sentence + " </s>"

encoding = tokenizer.encode_plus(text, padding="max_length", return_tensors="pt")
input_ids, attention_mask = encoding["input_ids"].to(device), encoding["attention_mask"].to(device)

outputs = model.generate(
    input_ids=input_ids,
    attention_mask=attention_mask,
    max_length=256,
    early_stopping=True,
)

for output in outputs:
    line = tokenizer.decode(output, skip_special_tokens=True, clean_up_tokenization_spaces=True)
    print(line)
And this is the output:
ncbi_ner: ncbi_ner: ncbi_ner:
The expected output (based on the paper) should be as follows:
Identification of APC2 , a homologue of the entity* adenomatous polyposis coli tumour *entity suppressor .
I swapped in all the other available models (large, base, PubMed, PMC, and PubMed+PMC, i.e. all six Hugging Face variants), but none of them produced a reasonable output.
Could you suggest a solution?
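As a side note on the markup above: assuming the entity* ... *entity format shown in the expected output, the tagged spans can be recovered from a decoded prediction with a short regular expression. The helper below is only an illustrative sketch, not part of the SciFive repository.

# Illustrative helper (not from the SciFive repo): recover the tagged spans
# from a decoded prediction that uses the "entity* ... *entity" markup.
import re

def extract_entities(decoded):
    return [span.strip() for span in re.findall(r"entity\*(.*?)\*entity", decoded)]

decoded = ("Identification of APC2 , a homologue of the entity* adenomatous "
           "polyposis coli tumour *entity suppressor .")
print(extract_entities(decoded))  # ['adenomatous polyposis coli tumour']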
Hi, sorry for the late reply. The models on Hugging Face are just pretrained weights; we didn't share the fine-tuned models. You can fine-tune them with Mesh TensorFlow using this script: https://github.com/justinphan3110/SciFive/blob/main/finetune/ner/scifive_fine_tune_ner.ipynb
Let me know if you still need the fine-tuned models, and I can upload them to Hugging Face for you.
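For reference, here is a minimal fine-tuning sketch that stays in PyTorch and uses the Hugging Face Seq2SeqTrainer instead of the Mesh TensorFlow notebook linked above. The TSV file names, the "source"/"target" column names, and the hyperparameters are assumptions for illustration only, not the authors' setup.

# Minimal fine-tuning sketch with the Hugging Face Seq2SeqTrainer (not the
# authors' Mesh TensorFlow pipeline). File names, column names ("source",
# "target"), and hyperparameters are illustrative assumptions.
from datasets import load_dataset
from transformers import (
    AutoTokenizer,
    AutoModelForSeq2SeqLM,
    DataCollatorForSeq2Seq,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
)

model_name = "razent/SciFive-base-PMC"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# Hypothetical TSV layout, one example per row:
#   source: "ncbi_ner: Identification of APC2 , a homologue of ..."
#   target: "Identification of APC2 , a homologue of the entity* ... *entity suppressor ."
dataset = load_dataset(
    "csv",
    data_files={"train": "ncbi_ner_train.tsv", "validation": "ncbi_ner_dev.tsv"},
    delimiter="\t",
)

def preprocess(batch):
    model_inputs = tokenizer(batch["source"], max_length=256, truncation=True)
    labels = tokenizer(text_target=batch["target"], max_length=256, truncation=True)
    model_inputs["labels"] = labels["input_ids"]
    return model_inputs

tokenized = dataset.map(preprocess, batched=True, remove_columns=dataset["train"].column_names)

training_args = Seq2SeqTrainingArguments(
    output_dir="scifive-ncbi-ner",
    learning_rate=1e-4,
    per_device_train_batch_size=8,
    num_train_epochs=10,
    predict_with_generate=True,
)

trainer = Seq2SeqTrainer(
    model=model,
    args=training_args,
    train_dataset=tokenized["train"],
    eval_dataset=tokenized["validation"],
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
    tokenizer=tokenizer,
)

trainer.train()

After training, trainer.save_model() writes a checkpoint that AutoModelForSeq2SeqLM.from_pretrained() can load and query with the inference snippet earlier in the thread.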
Hi Justin, could you please upload a fine-tuned model for biomedical QA (the one that outperformed BioBERT in your paper)? It would be great. Thanks!
Hi,
Thank you for publishing the models. I'm wondering whether the current models on Hugging Face are still the pretrained weights, as you said in March. I used the example code there (https://huggingface.co/razent/SciFive-base-Pubmed_PMC) but couldn't reproduce the output shown in your paper. Given the input "Identification of APC2 , a homologue of the adenomatous polyposis coli tumour suppressor .", I got the output ', a novel tumour suppressor, in human colon cancer............................:::::::::. :::.::.'
If the currently published models are not the fine-tuned ones, could you please publish the fine-tuned NER models?
Thank you in advance for your reply!