llm
How do I use Hugging Face tokenization to load a model from the Hugging Face Hub via MODEL_PATH instead of from my local machine?
I don't think I understand your question completely, but if you want to use an external tokenizer you can simply pass either a local path or a Hub model name to the tokenizer loader. The library will resolve it: if the name is not a local directory, it downloads the tokenizer's config and vocabulary files from the Hugging Face Hub and caches them.
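A minimal sketch of that resolution behavior, using the `transformers` library's `AutoTokenizer.from_pretrained()`. Here `"bert-base-uncased"` is just an example Hub model id standing in for MODEL_PATH; the original question did not specify a model.

```python
from transformers import AutoTokenizer

# Either a Hub model id (downloaded and cached on first use)
# or a path to a local directory containing tokenizer files.
MODEL_PATH = "bert-base-uncased"  # example id, not from the original question

# from_pretrained() checks for a local directory first; otherwise it
# fetches the tokenizer config and vocab from the Hugging Face Hub.
tokenizer = AutoTokenizer.from_pretrained(MODEL_PATH)

encoded = tokenizer("Hello world")
print(encoded["input_ids"])
```

The same call works unchanged if MODEL_PATH points at a directory you saved earlier with `tokenizer.save_pretrained(...)`, which is what makes switching between local and Hub-hosted models a one-line change.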