smartgpt
add proxy configuration in yml
I'd like to be able to add a proxy configuration in the yml file.
Can you elaborate on what you mean by this?
I'll try to implement this after my new config work. Proxies can allow for higher security.
@ODomWang what type of protocols would you look for? Of course HTTP Proxy and SOCKS5 will be implemented, but I'm wondering what else you'd like to see.
That sounds good to me, ideally we should have some sort of global HTTP client settings that applies to all HTTP clients initialized in the project.
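As a sketch of what that global section could look like, here is one possible shape for the yml (every key name here is hypothetical, not an existing smartgpt option):

```yaml
# Hypothetical global network settings; key names are illustrative only.
network:
  proxy:
    # Either an HTTP proxy or a SOCKS5 proxy URL.
    url: "socks5://127.0.0.1:1080"   # or "http://127.0.0.1:7890"
    # Hosts that should bypass the proxy.
    no_proxy:
      - "localhost"
      - "127.0.0.1"
```

A single section like this would let every HTTP client in the project read one shared setting instead of each client being configured separately.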
Because OpenAI and some other APIs can't be accessed directly from my region, I need to route the HTTP requests through a proxy to communicate with them.
Thanks for clarifying. We're planning to do a configuration overhaul soon, and we'll make sure that you can include a global proxy for requests during that update.
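Under the hood, applying one configured proxy to all outgoing requests could look like this sketch (Python stdlib for illustration; the function name and proxy URL are hypothetical, and SOCKS5 would need a third-party handler such as PySocks):

```python
import urllib.request

def make_global_opener(proxy_url=None):
    """Build a urllib opener that sends HTTP(S) traffic through proxy_url.

    proxy_url is a hypothetical config value, e.g. "http://127.0.0.1:7890".
    """
    if proxy_url is None:
        # No proxy configured: fall back to urllib's defaults
        # (which honor the http_proxy / https_proxy env vars).
        return urllib.request.build_opener()
    handler = urllib.request.ProxyHandler({"http": proxy_url, "https": proxy_url})
    return urllib.request.build_opener(handler)

# One shared opener that every HTTP call in the project would use.
opener = make_global_opener("http://127.0.0.1:7890")
```

The key design point is the one shared opener: individual call sites never touch proxy settings themselves.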
In the meantime, support for local LLMs is almost done and will be released shortly. Once it lands, you can optionally try loading local models (although I can't guarantee the results will be as good as GPT-3.5's).
Thank you for your patient response. I'm looking forward to the next version.