Victor Dibia
Thanks for flagging this. Will take a look ASAP! (Replying to Harwinder Singh, Oct 3, 2023: [victordibia/llmx] llmx.generators.text.hf_textgen.hftextgenerator() got multiple values for keyword argument 'provider'...)
Happy to hear how this is moving along, whenever convenient.
It's a good idea! In general, I think we need a structured design/approach to supporting _any_ model client. Some of this might already have been discussed in a...
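To make "supporting any model client" concrete, here is a minimal sketch of what a structured client interface could look like. The `TextGenerator` protocol, `EchoGenerator`, and `generate` signature below are hypothetical illustrations, not the actual llmx API.

```python
from typing import Dict, List, Protocol


class TextGenerator(Protocol):
    """Hypothetical interface that any model client (OpenAI, Hugging Face,
    a local server, etc.) could implement."""

    def generate(self, messages: List[Dict[str, str]], **kwargs) -> str:
        """Return a completion for a list of chat-style messages."""
        ...


class EchoGenerator:
    """Toy implementation, only to show the protocol is satisfiable."""

    def generate(self, messages: List[Dict[str, str]], **kwargs) -> str:
        return messages[-1]["content"]


def run(generator: TextGenerator) -> str:
    # Any client that satisfies the protocol can be swapped in here.
    return generator.generate([{"role": "user", "content": "Hello"}])


print(run(EchoGenerator()))
```

The point of the sketch is that provider-specific details stay inside each client, so adding a new backend means implementing one interface rather than touching call sites.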
Following up here ... a user just asked for this. > Can you also make the running of Autogen Agent in the web also from directly from the code... Not...
@ekzhu, thanks for the explanation; it makes sense. We will learn more from usage and refactor if needed. In the meantime, one final thing is to ensure we guide...
Hi @SingTeng, Ollama lets you spin up an OpenAI-compatible endpoint. You should be able to put that into AutoGen Studio as a model and test it. https://ollama.com/blog/openai-compatibility
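For anyone landing here, a quick sanity check of the endpoint before wiring it into AutoGen Studio could look like the sketch below; the `llama3` model tag is just an example (use whatever model you have pulled locally), and the base URL/port are Ollama's defaults. The same base URL, model name, and placeholder API key are what you would enter in the AutoGen Studio model form.

```python
# pip install openai
from openai import OpenAI

# Ollama serves an OpenAI-compatible API at this base URL by default;
# the API key can be any non-empty string since Ollama ignores it.
client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")

response = client.chat.completions.create(
    model="llama3",  # example tag; assumes `ollama pull llama3` has been run
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(response.choices[0].message.content)
```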
Good idea. It aligns with how I have been thinking about this. - Implicit LLM-based termination (similar to what we do with teams, where orchestrators emit a termination signal) -...
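For context, here is a minimal sketch of the explicit termination signals teams use today, assuming the 0.4-style `autogen_agentchat` / `autogen_ext` APIs (the model name and task are placeholders, and the imports may differ across versions). An implicit LLM-based condition would slot in alongside these.

```python
# pip install autogen-agentchat "autogen-ext[openai]"
import asyncio

from autogen_agentchat.agents import AssistantAgent
from autogen_agentchat.conditions import MaxMessageTermination, TextMentionTermination
from autogen_agentchat.teams import RoundRobinGroupChat
from autogen_ext.models.openai import OpenAIChatCompletionClient


async def main() -> None:
    model_client = OpenAIChatCompletionClient(model="gpt-4o-mini")  # needs OPENAI_API_KEY
    agent = AssistantAgent("assistant", model_client=model_client)

    # Stop when the agent says TERMINATE or after 10 messages, whichever comes first.
    termination = TextMentionTermination("TERMINATE") | MaxMessageTermination(max_messages=10)
    team = RoundRobinGroupChat([agent], termination_condition=termination)

    result = await team.run(task="Write a haiku about termination, then say TERMINATE.")
    print(result.stop_reason)


asyncio.run(main())
```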
Thanks for the note! I actually agree with you! A lot of effort is being put into ensuring the focus is on core concepts with longevity as opposed to a...
Fixed in #6191
Thanks for posting, @usag1e. Any chance you want to take a swing at a fix? I am happy to help along the way!