tristan279
I fine-tuned MiniCPM-V-2_6 with this dataset format: `"id":str(i), "image":{"":f"path_upper","":f"path_lower"}, "conversations": [{"role": "user", "content": f"{prompt}"},{"role":"assistant","content":text}]` How do I invoke the model with the exact same prompt using the transformers library? prompt format...
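For reference, a minimal inference sketch using the base model's documented `chat()` interface (loaded with `trust_remote_code=True`). The checkpoint path, image paths, and prompt string are placeholders, and whether this exactly reproduces the fine-tuning prompt depends on how the two images were interleaved during training:

```python
import torch
from PIL import Image
from transformers import AutoModel, AutoTokenizer

# Placeholder path to the fine-tuned MiniCPM-V-2_6 checkpoint
model_path = "path/to/finetuned-MiniCPM-V-2_6"

model = AutoModel.from_pretrained(
    model_path,
    trust_remote_code=True,       # MiniCPM-V ships its own modeling/chat code
    torch_dtype=torch.bfloat16,
).eval().cuda()
tokenizer = AutoTokenizer.from_pretrained(model_path, trust_remote_code=True)

# Two images per sample, mirroring the upper/lower split in the training data
image_upper = Image.open("path_upper").convert("RGB")
image_lower = Image.open("path_lower").convert("RGB")
prompt = "..."  # the same prompt string used during fine-tuning

# MiniCPM-V-2_6 accepts interleaved images and text inside the message content
msgs = [{"role": "user", "content": [image_upper, image_lower, prompt]}]

answer = model.chat(image=None, msgs=msgs, tokenizer=tokenizer)
print(answer)
```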
A few days ago the exact same loading code worked properly; now I'm getting some kind of tokenizer issue: `Using base model: meta-llama/Meta-Llama-3.1-8B-Instruct ==((====))== Unsloth 2024.12.2: Fast Llama patching. Transformers:4.45.2. \\...
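For context, a minimal sketch of the kind of Unsloth loading call described; the sequence length and 4-bit flag below are assumed values, not the poster's actual settings:

```python
from unsloth import FastLanguageModel

# Typical Unsloth loading call for a Llama-3.1 base model; max_seq_length and
# load_in_4bit are assumptions for illustration, not taken from the post.
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="meta-llama/Meta-Llama-3.1-8B-Instruct",
    max_seq_length=2048,
    dtype=None,        # let Unsloth auto-detect bf16/fp16
    load_in_4bit=True,
)
```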