Kamal Raj Kanakarajan
Hi @loretoparisi, I didn't test the C++ inference on macOS. I will try running it with the modification you suggested. https://medium.com/@m.muizzsuddin_25037/error-ld-symbol-not-found-for-architecture-x86-64-a5e5b648ffc
@loretoparisi, which version of libtorch are you using? I am using 1.2.0.
@loretoparisi, Try using https://download.pytorch.org/libtorch/cpu/libtorch-macos-1.2.0.zip
https://huggingface.co/bert-large-uncased-whole-word-masking-finetuned-squad You can download the model from the above link. You will have to rename `config.json` to `bert_config.json`.
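The rename step above can be sketched like this (the model directory path is illustrative; here a temp directory with a dummy `config.json` stands in for the downloaded model folder):

```python
import os
import tempfile

# Stand-in for the downloaded model directory (illustrative only).
model_dir = tempfile.mkdtemp()
with open(os.path.join(model_dir, "config.json"), "w") as f:
    f.write("{}")

# Rename config.json -> bert_config.json, which this repo expects.
src = os.path.join(model_dir, "config.json")
dst = os.path.join(model_dir, "bert_config.json")
if os.path.exists(src) and not os.path.exists(dst):
    os.rename(src, dst)
```
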
Using

```python
model = QA('bert-large-uncased-whole-word-masking-finetuned-squad')
```

will also work.
@soumya997 The model will be downloaded and cached. It is downloaded only during the first run; from the second run onwards it is loaded from the cache.
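The caching behaviour described above follows the usual download-once pattern; here is a minimal sketch of the idea (the `fetch` helper and the fake downloader are illustrative, not the repo's actual code):

```python
import os
import tempfile

CACHE_DIR = tempfile.mkdtemp()  # stand-in for the real cache directory

def fetch(name, downloader):
    """Return a cached file path, invoking the downloader only on the first call."""
    path = os.path.join(CACHE_DIR, name)
    if not os.path.exists(path):  # first run: actually download
        downloader(path)
    return path                   # later runs: cache hit, no download

calls = []
def fake_download(path):
    calls.append(path)
    with open(path, "w") as f:
        f.write("weights")

fetch("model.bin", fake_download)
fetch("model.bin", fake_download)
print(len(calls))  # downloaded exactly once
```
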
@soumya997 Pass the `ckpt_path` to TrainerConfig. The model will be saved after each epoch.
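A minimal sketch of the save-per-epoch idea, using a simplified stand-in for `TrainerConfig` (the field names and the training loop here are illustrative, not the repo's exact API):

```python
import os
import tempfile
from dataclasses import dataclass

@dataclass
class TrainerConfig:   # simplified stand-in for the repo's TrainerConfig
    num_epochs: int
    ckpt_path: str     # where the checkpoint is written after each epoch

def train(config):
    for epoch in range(config.num_epochs):
        # ... one epoch of training would run here ...
        if config.ckpt_path:  # save a checkpoint after every epoch
            with open(config.ckpt_path, "wb") as f:
                f.write(b"model-state")  # placeholder for serialized weights

cfg = TrainerConfig(num_epochs=3,
                    ckpt_path=os.path.join(tempfile.mkdtemp(), "model.ckpt"))
train(cfg)
```
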
Hi @jasmoonli, You should be able to use this code to fine-tune the ELECTRA model with any dataset in the CoNLL-2003 format. https://github.com/huggingface/transformers/tree/master/examples/pytorch/token-classification
For similarity tasks it is better to use Sentence Transformers. You can fine-tune BioElectra or any transformer model from Hugging Face via https://www.sbert.net/
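Similarity between two sentence embeddings is typically scored with cosine similarity; here is the computation in plain Python (the toy 3-d vectors stand in for real embeddings produced by the encoder):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy 3-d "sentence embeddings" (real ones come from the model).
emb1 = [0.1, 0.9, 0.2]
emb2 = [0.1, 0.8, 0.3]
print(cosine_similarity(emb1, emb2))  # close to 1.0 for similar sentences
```
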
Error:

```
in _convert_index
    if index[pos] is not None:
IndexError: list index out of range
```