Jacob Bao
It may also be necessary to provide a way to inject the tool definition, as a JSON string, into the prompt template.
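A minimal sketch of what that injection could look like, assuming a hypothetical tool schema and a plain template (the real schema and template API depend on the framework):

```python
import json
from string import Template

# Hypothetical tool definition; the actual schema depends on the framework.
tool = {
    "name": "calculator",
    "description": "Evaluate arithmetic expressions",
    "parameters": {
        "type": "object",
        "properties": {"expression": {"type": "string"}},
    },
}

# Serialize the tool definition to JSON and splice it into the prompt.
template = Template("You can use this tool:\n$tool_json\n\nUser: $question")
prompt = template.substitute(
    tool_json=json.dumps(tool, indent=2),
    question="What is 2 + 2?",
)
print(prompt)
```

The point is only that the template exposes a slot for the serialized definition, so the caller controls how the tool is described to the model.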
@tmc Can this PR be merged? There may be more code conflicts in the future if we wait too long.
This pull request was submitted a long time ago and was never reviewed, so it has accumulated conflicts that now seem difficult to resolve.
I will try later.
I'm not very familiar with SAML, so I haven't added support for it yet. I will do it later.
Perhaps we can export the environment variable `OPENAI_BASE_URL`, so that projects such as llama-cpp-python can be used to serve OSS LLMs?
Here, `OPENAI_BASE_URL` works well. Just set this env var; no need to modify code.
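As a sketch of how a client could honor that env var (the fallback default and helper name here are illustrative, not the actual implementation):

```python
import os

# Official endpoint used when the env var is unset (illustrative default).
DEFAULT_BASE_URL = "https://api.openai.com/v1"

def resolve_base_url() -> str:
    # Prefer OPENAI_BASE_URL when set; otherwise fall back to the default.
    return os.environ.get("OPENAI_BASE_URL", DEFAULT_BASE_URL)

# Point the client at a local OpenAI-compatible server,
# e.g. one started by llama-cpp-python.
os.environ["OPENAI_BASE_URL"] = "http://localhost:8000/v1"
print(resolve_base_url())  # → http://localhost:8000/v1
```

With this pattern, switching between the hosted API and a local OSS model is purely a deployment-time configuration change.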
> > Here, `OPENAI_BASE_URL` works well. Just set this env var; no need to modify code.
>
> What if one wants to run gpt4all etc? The majority of...
I want to add the params `max_tokens`, `temperature`, and `memory_size` to the agent predict API. What do you think @homanp? If OK, I will make a PR soon.