dandingsky
Not sure if this is related to this issue, but I found that when applying LoRA to Llama-2 and including `target_modules=['lm_head', 'q_proj', 'v_proj']` in `LoraConfig`, the LoRA on `lm_head` will...
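A minimal sketch of the configuration the comment describes, assuming the Hugging Face `peft` library; the rank and scaling values (`r`, `lora_alpha`) and the `task_type` are illustrative fillers, not taken from the comment:

```python
# Sketch only: reproduces the target_modules list from the comment.
# r, lora_alpha, and task_type are assumed values for illustration.
from peft import LoraConfig

config = LoraConfig(
    r=8,                 # illustrative LoRA rank
    lora_alpha=16,       # illustrative scaling factor
    target_modules=["lm_head", "q_proj", "v_proj"],
    task_type="CAUSAL_LM",
)
```

Such a config would typically be applied with `get_peft_model(model, config)`, which wraps the named modules with LoRA adapters.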