zairm21
Hi, I'm facing the same issue. I'm running the src/inference_wizardlm.py inference code with "WizardLM/WizardLM-7B-V1.0" as the base model and "data/WizardLM_testset.jsonl" as the input. {"id": 1, "instruction": "If a car travels 120...
> Hi, I get weird output when I try to invoke model.generate using the inference script, but the same prompt gives the expected output when the chat demo is used, and also it...
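For context, here is a minimal sketch of how `model.generate` is typically invoked for a causal LM with the Hugging Face transformers API. The prompt template, instruction text, and decoding parameters below are assumptions, not taken from this repo; check what src/inference_wizardlm.py and the chat demo actually use, since a mismatch in prompt template or sampling settings between the two code paths is a common cause of this kind of diverging output:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "WizardLM/WizardLM-7B-V1.0"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Hypothetical prompt template and instruction -- the real template lives
# in the repo's inference/demo code and must match the model's training.
instruction = "Explain what a context manager does in Python."
prompt = f"{instruction}\n\n### Response:"

inputs = tokenizer(prompt, return_tensors="pt")
output_ids = model.generate(
    **inputs,
    max_new_tokens=256,  # assumed value
    do_sample=False,     # greedy decoding; the chat demo may sample instead
)

# Decode only the newly generated tokens, not the echoed prompt --
# decoding the full sequence is another frequent source of "weird" output.
new_tokens = output_ids[0][inputs["input_ids"].shape[1]:]
print(tokenizer.decode(new_tokens, skip_special_tokens=True))
```

Comparing the exact prompt string and `generate` arguments produced by both code paths (script vs. demo) is usually enough to locate the divergence.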