
A BERT model for scientific text.

Results: 62 scibert issues (sorted by recently updated)

Hi, loading the model using

```python
from transformers import AutoModel
model = AutoModel.from_pretrained('path/to/scibert_scivocab_uncased/directory')
```

doesn't work as expected. When starting to train the fine-tuned model, the following log shows up: ```log...
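If the local directory is incomplete, one alternative is to pull the weights straight from the Hugging Face hub. A minimal sketch, assuming the published model id `allenai/scibert_scivocab_uncased`:

```python
# Minimal sketch: load SciBERT directly from the Hugging Face hub instead of
# a local directory (assumes the model id allenai/scibert_scivocab_uncased).
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained('allenai/scibert_scivocab_uncased')
model = AutoModel.from_pretrained('allenai/scibert_scivocab_uncased')
```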

I am trying to rebuild the model using tf.official like so:

```python
import tensorflow as tf
import official
import json

config_dict = json.loads(tf.io.gfile.GFile('/content/drive/MyDrive/TREC-COVID/scibert_scivocab_uncased/bert_config.json').read())
bert_config = official.nlp.bert.configs.BertConfig.from_dict(config_dict)
bert_model = official.nlp.bert.bert_models.get_transformer_encoder(bert_config)
checkpoint...
```
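For the truncated restore step, the TF Model Garden tutorials use an object-based `tf.train.Checkpoint` along these lines. This is a hedged sketch only: the `model=` attribute name and the checkpoint path are assumptions, and since the original SciBERT release ships a TF1-style name-based `bert_model.ckpt`, an object-based restore may not match the variables without converting the checkpoint first.

```python
# Hedged sketch following the TF Model Garden BERT tutorial pattern; the
# attribute name (model=) and the checkpoint path are assumptions. SciBERT's
# released bert_model.ckpt is a TF1 name-based checkpoint, so this
# object-based restore may need a converted checkpoint to match.
checkpoint = tf.train.Checkpoint(model=bert_model)
status = checkpoint.restore(
    '/content/drive/MyDrive/TREC-COVID/scibert_scivocab_uncased/bert_model.ckpt')
status.expect_partial()  # tolerate variables not tracked by the Keras objects
```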

I know that SciBERT is pre-trained on the Semantic Scholar corpus. I also know that the Semantic Scholar corpus is not publicly available. I am wondering how many new papers...

It seems that SciBERT only supports Hugging Face with the PyTorch backend, not the TensorFlow backend. Can you also provide SciBERT support for the Hugging Face TensorFlow backend, or is there any...
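A possible workaround, assuming both PyTorch and TensorFlow are installed: `transformers` can load PyTorch weights into its TF model classes with `from_pt=True`. A minimal sketch:

```python
# Hedged workaround sketch: convert the PyTorch checkpoint into the TF class
# on the fly (requires both backends installed), then re-save as native TF.
from transformers import TFAutoModel

model = TFAutoModel.from_pretrained('allenai/scibert_scivocab_uncased', from_pt=True)
model.save_pretrained('scibert_tf')  # later loads no longer need from_pt
```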

Hi :) I was using the `scibert_scivocab_cased` model with the Hugging Face library, and I've found that `AutoTokenizer` doesn't automatically set the `do_lower_case` option to `False`.

```python
>>> from transformers import AutoTokenizer
>>>...
```
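Until that default is fixed, one workaround is to pass the flag explicitly; keyword arguments to `from_pretrained` are forwarded to the underlying tokenizer. A minimal sketch:

```python
# Hedged workaround sketch: set do_lower_case explicitly so the cased
# vocabulary is not lower-cased by a wrong default.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained(
    'allenai/scibert_scivocab_cased', do_lower_case=False)
```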

I'm a bit puzzled by something I encountered trying to encode sentences as embeddings. When I ran the sentences through the model one at a time, I got slightly different...
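For reproducing that comparison, here is a minimal sketch (the model id `allenai/scibert_scivocab_uncased` and the choice of the CLS-token embedding are assumptions, not the reporter's exact setup). Small numeric differences between single and batched runs are expected from padding and floating-point reduction order:

```python
# Hedged sketch: compare per-sentence vs. batched CLS embeddings.
import torch
from transformers import AutoModel, AutoTokenizer

tok = AutoTokenizer.from_pretrained('allenai/scibert_scivocab_uncased')
model = AutoModel.from_pretrained('allenai/scibert_scivocab_uncased').eval()

sents = ["An example sentence.", "A slightly longer example sentence."]
with torch.no_grad():
    # one sentence at a time: CLS embedding of each sequence
    single = torch.stack(
        [model(**tok(s, return_tensors='pt')).last_hidden_state[0, 0] for s in sents])
    # all sentences in one padded batch
    batch = tok(sents, padding=True, return_tensors='pt')
    batched = model(**batch).last_hidden_state[:, 0]

print((single - batched).abs().max())  # tiny but often nonzero
```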

Hi, I am trying to estimate the carbon footprint of SciBERT based on this [Google paper](https://arxiv.org/pdf/2104.10350.pdf), and many parameters actually depend on which Google data center the TPUs used come...
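For reference, the paper's recipe reduces to energy = chip-hours x chip power x PUE, and CO2e = energy x grid carbon intensity. A back-of-the-envelope sketch where every number is a placeholder, not SciBERT's actual training setup:

```python
# Hedged back-of-the-envelope following the cited paper's recipe.
# ALL numbers below are placeholders, NOT SciBERT's actual training details.
tpu_hours = 8 * 24 * 7     # placeholder: 8 TPU cores for one week
watts_per_tpu = 280.0      # placeholder per-chip power draw
pue = 1.10                 # placeholder data-center PUE
kg_co2_per_kwh = 0.4       # placeholder grid carbon intensity

kwh = tpu_hours * watts_per_tpu / 1000 * pue
print(f"{kwh:.0f} kWh, ~{kwh * kg_co2_per_kwh:.0f} kg CO2e")
```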

Hello, I'm currently in the midst of replicating the results for the relation extraction task on the ChemProt dataset using SciBERT, but so far I have been unsuccessful in achieving the F1...

Hi! I noticed that the `basevocab` SciBERT models aren't on the Hugging Face hub (https://huggingface.co/allenai). Would it be possible to add them? Thanks!

I have trained the NER model on the sciie dataset using the following config:

```
DATASET='sciie'
TASK='ner'
with_finetuning='_finetune'  # '_finetune', or '' for no fine-tuning
dataset_size=38124
export BERT_VOCAB=/home/tomaz/neo4j/scibert/model/vocab.txt
export BERT_WEIGHTS=/home/tomaz/neo4j/scibert/model/weights.tar.gz...
```