Ricky Loynd
Some of the keys are invalid now with openai>=1.0. `request_timeout` should be removed. It's replaced by `timeout`. Same with `seed`, which was replaced by `cache_seed`. These details are from the...
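For reference, a minimal sketch of that key migration in an `llm_config` dict (the model name, API key, and values below are just placeholders, not recommendations):

```python
# Sketch of the llm_config key renames for openai>=1.0 / pyautogen 0.2.x.
# Old keys (no longer valid):
#   "request_timeout": 120
#   "seed": 42
# New keys:
llm_config = {
    "config_list": [{"model": "gpt-4", "api_key": "sk-..."}],  # placeholder key
    "timeout": 120,      # replaces request_timeout
    "cache_seed": 42,    # replaces seed
}
```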
Did you modify simple_chat.py in any way, other than llm_config? What's your openai version? openai==1.6.1 should work.
@ekzhu I'm wondering: why does simple_chat.py work for me with no model specified, using the latest pyautogen==0.2.6? Oh I see, it's the model key in llm_config. I had that in...
@thisismygitrepo Instead of trying to pass the model to `get_config_list`, or dynamically adding it later, you just add a model item to your config_list like this: `config_list = [{"model": "gpt-4",...
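Roughly, a complete version of that config might look like the sketch below (the api_key is a placeholder, and the agent setup is just one illustrative way to use it):

```python
from autogen import AssistantAgent, UserProxyAgent

# Each entry in config_list names the model directly; the api_key is a placeholder.
config_list = [{"model": "gpt-4", "api_key": "sk-..."}]

assistant = AssistantAgent("assistant", llm_config={"config_list": config_list})
user_proxy = UserProxyAgent("user_proxy", code_execution_config=False)
user_proxy.initiate_chat(assistant, message="Hello!")
```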
Does your local LLM support function calling? @kevin666aa
@victordibia
@bitnom Thank you for the contribution! Can you give a couple examples of the types of changes that PEP-621 and PEP-517 will necessitate in our codebase?
> Only the changes already included in the PR

Are you saying that this PR won't cause any other lines of code in the repo to be flagged or disallowed...
@BeibinLi fyi