Lakshmi Panguluri
Same issue with DPO:

max_seq_length = 1024  # Supports automatic RoPE scaling, so choose any number
compute_dtype = getattr(torch, "float16")

# Load model
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name = "unsloth/mistral-7b-v0.3",
    ...