openai-cookbook
Inference Hyperparameter Tuning
Is there any interest in tuning inference hyperparameters such as model, prompt, max_tokens, n, temperature, and top_p? I've been studying how to get the best results from inference by tuning these hyperparameters under a budget constraint. I'd be happy to contribute here if there is common interest.
Link to a notebook example: https://github.com/microsoft/FLAML/blob/main/notebook/integrate_openai.ipynb
Related research paper: https://arxiv.org/abs/2303.04673
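As a minimal sketch of the idea (not FLAML's actual API — its real interface is shown in the linked notebook), budget-constrained tuning can be framed as a search over configurations of these hyperparameters, stopping once the accumulated inference cost would exceed the budget. The search space values and the `evaluate` callable below are hypothetical placeholders.

```python
import random

# Hypothetical search space over the inference hyperparameters discussed above.
SEARCH_SPACE = {
    "model": ["text-ada-001", "text-davinci-003"],
    "max_tokens": [50, 100, 200],
    "n": [1, 3, 5],
    "temperature": [0.0, 0.5, 1.0],
    "top_p": [0.8, 1.0],
}

def sample_config(rng):
    """Draw one random configuration from the search space."""
    return {name: rng.choice(values) for name, values in SEARCH_SPACE.items()}

def tune(evaluate, budget, seed=0):
    """Random search under a budget constraint.

    `evaluate` is a user-supplied callable mapping a config to
    (score, cost); the search stops once spending the next trial's
    cost would exceed `budget`.
    """
    rng = random.Random(seed)
    spent, best_score, best_config = 0.0, float("-inf"), None
    while True:
        config = sample_config(rng)
        score, cost = evaluate(config)
        if spent + cost > budget:
            break  # budget exhausted; discard this trial
        spent += cost
        if score > best_score:
            best_score, best_config = score, config
    return best_config, best_score, spent
```

In practice `evaluate` would call the OpenAI API on a validation set and return a quality metric plus the dollar cost of the tokens consumed; smarter search strategies (e.g. the EcoOptiGen method in the linked paper) replace the random sampling.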
Interesting!
I've been thinking of adding a section to the main README page that links to other tools & examples around the web. I think FLAML could be a good candidate. Would it make sense to you if I added a link there (but not a notebook to the repo)?
Sure, that'll be great. We also have an example of tuning ChatGPT (both GPT-3.5 and GPT-4): https://github.com/microsoft/FLAML/blob/main/notebook/integrate_chatgpt.ipynb
Other suggestions are appreciated as well.
@ted-at-openai The notebook links are updated:
https://github.com/microsoft/FLAML/blob/main/notebook/autogen_openai.ipynb
https://github.com/microsoft/FLAML/blob/main/notebook/autogen_chatgpt.ipynb
And a documentation webpage has been added: https://microsoft.github.io/FLAML/docs/Use-Cases/Auto-Generation
Thanks! Added link to the README in this PR: https://github.com/openai/openai-cookbook/pull/442
Thank you! Please let me know if there is anything I can help with in the future.