Laurel Orr
I noticed this too. One workaround is to call

```python
special_tokens_mask = tokenizer.get_special_tokens_mask(
    input_ids.tolist(), already_has_special_tokens=True
)
```

The above worked for me while `return_special_tokens_mask` did not.
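For context, here is a minimal end-to-end sketch of that workaround (assuming a standard Hugging Face tokenizer; `bert-base-uncased` is just an illustration):

```python
from transformers import AutoTokenizer

# Illustrative tokenizer; any HF tokenizer should behave the same way here.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

# Encode normally, with special tokens added.
encoding = tokenizer("hello world", return_tensors="pt")
input_ids = encoding["input_ids"][0]

# Recompute the special-tokens mask from the already-encoded ids.
special_tokens_mask = tokenizer.get_special_tokens_mask(
    input_ids.tolist(), already_has_special_tokens=True
)
print(special_tokens_mask)  # e.g. [1, 0, 0, 1] for [CLS] hello world [SEP]
```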
Thank you for letting me know and for providing an example input. Let me poke around and see. Bootleg was trained on individual sentences, so it's geared more towards short-text NED...
Hey. It would be possible for sure, e.g., by setting up some worker queue for the processing (see the sketch below). It's not on my immediate roadmap to solve (as we have offline extract_mentions and dump_preds...
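To make the worker-queue idea concrete, here is a rough sketch using Python's `multiprocessing`; the `annotate_text` function is a purely hypothetical placeholder for a per-document annotation call, not part of Bootleg's API:

```python
import multiprocessing as mp

def annotate_text(text):
    # Hypothetical stand-in for a per-document annotation call.
    return {"text": text, "mentions": []}

def worker(task_queue, result_queue):
    # Pull documents off the queue until a sentinel None arrives.
    for text in iter(task_queue.get, None):
        result_queue.put(annotate_text(text))

if __name__ == "__main__":
    tasks, results = mp.Queue(), mp.Queue()
    workers = [mp.Process(target=worker, args=(tasks, results)) for _ in range(4)]
    for w in workers:
        w.start()

    docs = ["Abraham Lincoln was born in Kentucky.", "Paris is the capital of France."]
    for doc in docs:
        tasks.put(doc)
    for _ in workers:
        tasks.put(None)  # one sentinel per worker so each one shuts down

    outputs = [results.get() for _ in docs]
    for w in workers:
        w.join()
    print(outputs)
```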
What if I wanted to finetune the 6B or 13B models? The Hugging Face implementation is not optimized for this, and without model parallelism, I'm not sure it would fit on a single GPU (even...
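As a rough illustration of the memory-sharding side of this (a minimal sketch assuming the `accelerate`-backed `device_map="auto"` path in `transformers`, with GPT-J-6B as a stand-in model; actually fine-tuning at this scale would still need something like DeepSpeed ZeRO or a parameter-efficient method on top):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "EleutherAI/gpt-j-6B"  # stand-in 6B model for illustration
tokenizer = AutoTokenizer.from_pretrained(model_name)

# device_map="auto" shards the weights across available GPUs (and CPU if needed),
# so the model does not have to fit on a single device.
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.float16,
    device_map="auto",
)

inputs = tokenizer("SELECT name FROM", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```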
Hey. I haven't seen this error before. Did you finetune minotaur, and are you now trying to load it from a saved checkpoint? My immediate guess is that the checkpoint is in a...
Did you just try passing `NumbersStation/nsql-llama-2-7B` as the `model_name_or_path`?
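If it helps, that Hub id can be loaded directly with the standard `transformers` auto classes (a minimal sketch; whether it plugs into your particular script's `model_name_or_path` argument depends on that script):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name_or_path = "NumbersStation/nsql-llama-2-7B"
tokenizer = AutoTokenizer.from_pretrained(model_name_or_path)
model = AutoModelForCausalLM.from_pretrained(model_name_or_path)

# Toy text-to-SQL style prompt, just to check the weights load and generate.
prompt = "-- List all customer names\nSELECT"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```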