Tal
well, guess it is a bot
@MarkRx This PR cannot be merged as is. These are the needed changes: (1) Remove any modification whatsoever to the `context`, or anything related to it. It should not be...
@MarkRx I will not approve these changes in the default mode, without an `if` that enables them only when a flag is set. They are also too verbose, and...
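The requested pattern can be sketched roughly as follows. This is a minimal illustration of flag-gated behavior, not pr-agent's actual API; the function and setting names are hypothetical.

```python
# Hypothetical sketch: optional behavior gated behind a config flag so the
# default code path is completely untouched. Names are illustrative only.
def apply_optional_changes(settings: dict, data: str) -> str:
    # Default mode: flag absent or off, return the input unchanged.
    if not settings.get("enable_experimental_edits", False):
        return data
    # Flag explicitly on: only then does the intrusive behavior run.
    return data + " [modified]"

print(apply_optional_changes({}, "diff text"))
print(apply_optional_changes({"enable_experimental_edits": True}, "diff text"))
```

With this shape, users who never set the flag see identical behavior to before the change.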
@MarkRx - This PR is still too intrusive - you are editing every tool just for a specific option that won't be used by 99.9% of pr-agent users. It's not...
Looks good. And from now on you can easily edit the `add_litellm_callbacks` function and update its contents whenever you think it necessary.
You are welcome to change it. Models do *support* a large context, but the quality may degrade. We limit the number of tokens (and enable multiple calls) to try to increase **quality**. But...
This is what we support: https://pr-agent-docs.codium.ai/usage-guide/changing_a_model/
Models need to be **hosted**; just pointing at local model weights is not supported (anywhere). It's not a real way to deploy. CodeLlama-34b can be deployed locally via Ollama, for example: https://ollama.com/library...
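A hosted local model would then be wired in through configuration rather than raw weight files. A rough sketch of what that could look like, assuming an Ollama server on its default port; the exact keys and model tag should be checked against the changing_a_model docs linked above and Ollama's library:

```toml
[config]
model = "ollama/codellama:34b"   # assumed litellm-style "ollama/" prefix

[ollama]
api_base = "http://localhost:11434"  # default Ollama server address
```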
OK. You are welcome to open a PR to update the documentation with this trick if you think it's necessary: https://pr-agent-docs.codium.ai/installation/azure/#azure-devops-webhook
@taisyalobanov @okotek