
`small_model_filename` cannot be assigned during fine-tuning

[Open] rozeappletree opened this issue 2 years ago · 1 comment

The code below

https://github.com/anarchy-ai/LLM-VM/blob/7a5877b6adb2a71892203a71ac2095a01357bb1b/src/llm_vm/completion/optimize.py#L261

tries to extract `small_model_filename` from `kwargs`, which is passed down through client.complete(...) -> optimizer.complete(...) -> optimizer.complete_delay_train(...).

I think the intent is to allow saving the fine-tuned model to an arbitrary location via llm_vm.client, but it causes an error because the same `kwargs`, still containing `small_model_filename`, are forwarded to the chat_gpt model:

https://github.com/anarchy-ai/LLM-VM/blob/7a5877b6adb2a71892203a71ac2095a01357bb1b/src/llm_vm/completion/optimize.py#L254

resulting in:

`openai.error.InvalidRequestError: Unrecognized request argument supplied: small_model_filename`
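A minimal sketch of the kind of fix this implies: pop `small_model_filename` out of `kwargs` before forwarding the rest to the model call, so the unrecognized argument never reaches the OpenAI API. The function names below (`call_big`, `complete_delay_train`) mirror the shape of the optimizer code but are hypothetical stand-ins, not the actual LLM-VM implementation:

```python
def call_big(prompt, **kwargs):
    # Stand-in for the chat_gpt model call: like the OpenAI API, it
    # rejects any keyword argument it does not recognize.
    allowed = {"max_tokens", "temperature"}
    unknown = set(kwargs) - allowed
    if unknown:
        raise ValueError(
            "Unrecognized request argument supplied: " + ", ".join(sorted(unknown))
        )
    return f"completion for: {prompt}"

def complete_delay_train(prompt, **kwargs):
    # kwargs.pop both reads and REMOVES the fine-tuning save path,
    # so the remaining kwargs are safe to forward to the model call.
    small_model_filename = kwargs.pop("small_model_filename", None)
    response = call_big(prompt, **kwargs)
    return response, small_model_filename

resp, path = complete_delay_train(
    "hi", small_model_filename="model.pt", temperature=0.0
)
```

With a plain `kwargs.get(...)` (as in the linked line), the key stays in `kwargs` and the downstream call fails; `kwargs.pop(...)` removes it in the same step.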

rozeappletree commented Aug 25 '23

A more apt name would be `small_model_filepath` instead of `small_model_filename`.

rozeappletree commented Aug 25 '23