For the Llama 3 8B Instruct model, how do I use these format params? Can you share an example or any prompt-related documentation?
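For reference, a minimal sketch of how the Llama 3 8B Instruct chat format is typically applied through transformers, assuming the `meta-llama/Meta-Llama-3-8B-Instruct` checkpoint (the messages and decoding settings here are illustrative, not official documentation):

```python
# Sketch: build a Llama 3 Instruct prompt with transformers' chat template.
import torch
import transformers

model_id = "meta-llama/Meta-Llama-3-8B-Instruct"  # gated repo; access must be granted

pipeline = transformers.pipeline(
    "text-generation",
    model=model_id,
    model_kwargs={"torch_dtype": torch.bfloat16},
    device_map="auto",
)

messages = [
    {"role": "system", "content": "You extract fields from OCR text and reply in JSON."},
    {"role": "user", "content": "Patient Name: Pamela Wood\nDOB: 01/02/1980"},
]

# apply_chat_template wraps the messages in the <|start_header_id|>/<|eot_id|>
# special tokens the instruct model was trained on.
prompt = pipeline.tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)

terminators = [
    pipeline.tokenizer.eos_token_id,
    pipeline.tokenizer.convert_tokens_to_ids("<|eot_id|>"),
]

outputs = pipeline(prompt, max_new_tokens=256, eos_token_id=terminators)
print(outputs[0]["generated_text"][len(prompt):])
```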
Thanks a ton sir! I will check this.
Same prompt and same OCR text from the image, yet the LLM gives different results on each request. How can I keep the results consistent? Are there any options for this? I understand this...
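For reference, a rough sketch of the usual options for repeatable outputs: greedy decoding and, if sampling stays on, a fixed seed. This assumes the `pipeline` and `prompt` objects from the earlier transformers example; the exact flags depend on how your generate call is set up.

```python
# Sketch: make generation repeatable for the same prompt.
from transformers import set_seed

set_seed(42)  # fixes RNG state; only relevant if sampling remains enabled

outputs = pipeline(
    prompt,
    max_new_tokens=512,
    do_sample=False,  # greedy decoding: same prompt + same model -> same output
)
print(outputs[0]["generated_text"][len(prompt):])
```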
Getting the same result as before in spite of using `prompt = pipeline.tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True, format="JSON")`.
Thanks sir. Are there any options to identify which keys are present in a particular image? Say I had a resume, and the resume was converted into a fax document. The resume contains my...
Please look into this document. For example, patient name is a key and Pamela Wood is an answer.
Yes, I am using the official PaddleOCR model for English. That is a default model, so I can't finetune it. Can you please share some ideas or anything about...
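For context, a minimal sketch of how the stock English PaddleOCR model is run without any finetuning, assuming the 2.x API and an installed `paddleocr` package; the image path is a placeholder:

```python
# Sketch: baseline run of the default English PaddleOCR detection/recognition models.
from paddleocr import PaddleOCR

ocr = PaddleOCR(lang="en", use_angle_cls=True)  # downloads the default English models
result = ocr.ocr("fax_page.png", cls=True)      # placeholder path to a scanned page

# Each line is [bounding_box, (text, confidence)] in the 2.x result format.
for box, (text, confidence) in result[0]:
    print(f"{confidence:.2f}  {text}")
```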
> Having the same problem. Any update on this? Or any prompt hint?

You need to explicitly mention your JSON structure in the prompt. It's the only way to get...
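For example, a rough sketch of spelling the JSON structure out in the prompt itself; the keys are hypothetical, and `ocr_text` stands in for whatever string your OCR step returns:

```python
# Sketch: embed the exact JSON structure in the instructions, since there is no
# separate "format" argument that constrains the model's output.
schema = """{
  "patient_name": "",
  "date_of_birth": "",
  "address": ""
}"""  # hypothetical keys; replace with the fields you actually need

messages = [
    {
        "role": "system",
        "content": "Extract the fields from the OCR text below. "
                   "Respond with only valid JSON matching exactly this structure:\n" + schema,
    },
    {"role": "user", "content": ocr_text},  # ocr_text = string produced by your OCR step
]

prompt = pipeline.tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)
```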
Thanks a lot for the response, William.
> > @davedgd Oh so Unsloth is fine (the models or just finetuning with Unsloth?) but the Meta ones still don't work as expected?
>
> Correct, but to clarify,...