`small_model_filename` cannot be assigned during fine-tuning
The code below
https://github.com/anarchy-ai/LLM-VM/blob/7a5877b6adb2a71892203a71ac2095a01357bb1b/src/llm_vm/completion/optimize.py#L261
tries to extract `small_model_filename` from `kwargs`, which is passed down through `client.complete(...)` -> `optimizer.complete(...)` -> `optimizer.complete_delay_train(...)`.
I think the main purpose of this is to let `llm_vm.client` save the fine-tuned model to any location, but it causes an error because the same `kwargs` (still containing `small_model_filename`) is also passed to the `chat_gpt` model here:
https://github.com/anarchy-ai/LLM-VM/blob/7a5877b6adb2a71892203a71ac2095a01357bb1b/src/llm_vm/completion/optimize.py#L254
Resulting in:

```
openai.error.InvalidRequestError: Unrecognized request argument supplied: small_model_filename
```
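A minimal sketch of one possible fix: pop `small_model_filename` out of `kwargs` before the remaining arguments are forwarded to the model call, so the OpenAI API never sees the unknown argument. The function names below (`call_model`, the simplified `complete_delay_train`) are stand-ins for illustration, not the actual LLM-VM code:

```python
def call_model(prompt, **kwargs):
    # Stand-in for the OpenAI API call: like the real API, it rejects
    # any request argument it does not recognize.
    allowed = {"max_tokens", "temperature"}
    unknown = set(kwargs) - allowed
    if unknown:
        raise ValueError(
            f"Unrecognized request argument supplied: {', '.join(sorted(unknown))}"
        )
    return f"completion for: {prompt}"


def complete_delay_train(prompt, **kwargs):
    # Pop the custom argument so it never reaches the model call.
    small_model_filename = kwargs.pop("small_model_filename", None)

    # ... here the fine-tuned model would be saved to small_model_filename ...

    # The remaining kwargs are now safe to forward.
    return call_model(prompt, **kwargs)
```

With this change, `complete_delay_train("hi", small_model_filename="model.bin", temperature=0.7)` succeeds, whereas forwarding the unfiltered `kwargs` would raise the error shown above.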
Separately, a more apt name would be `small_model_filepath` rather than `small_model_filename`, since the value is a path, not just a filename.