H Lohaus
Hey, do you have curl_cffi installed? If not, you can try installing it with "pip install curl_cffi -U".
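If you are not sure whether it is installed, a quick check like this should tell you (just a minimal sketch using the standard library):
```
# Check whether curl_cffi is installed; if not, run: pip install curl_cffi -U
from importlib.metadata import version, PackageNotFoundError

try:
    print("curl_cffi", version("curl_cffi"))
except PackageNotFoundError:
    print("curl_cffi is not installed")
```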
You can only use the models that the provider supports. If you want to run a provider with its default model, you can pass g4f.models.default.
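For example, a minimal sketch (the provider here is only an example, swap in whichever one you are using):
```
import g4f

# Pass g4f.models.default to let the provider pick its own default model
response = g4f.ChatCompletion.create(
    model=g4f.models.default,
    provider=g4f.Provider.DeepInfra,  # example provider
    messages=[{"role": "user", "content": "Hello"}],
)
print(response)
```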
Instead of:
```
if not '/' in model:
    models = {
        'dbrx-instruct': 'databricks/dbrx-instruct',
    }
    model = models.get(model, model)
```
you could use model_aliases in DeepInfra. Thank you
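A rough sketch of what that could look like on the DeepInfra provider class (only the model_aliases attribute is the point here; the get_model helper and everything else is simplified for illustration):
```
# Sketch only: the real DeepInfra provider lives in g4f's provider package
# and inherits from the usual provider base classes, which are omitted here.
class DeepInfra:
    # model_aliases maps short model names to the full ids the API expects,
    # so the manual dict lookup from the diff above is no longer needed.
    model_aliases = {
        'dbrx-instruct': 'databricks/dbrx-instruct',
    }

    @classmethod
    def get_model(cls, model: str) -> str:
        # providers resolve aliases roughly like this
        return cls.model_aliases.get(model, model)
```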
Please resolve conflicts. Thank you
Hey there! It seems like the free GPT model might not be accessible in your region. Have you tried using the Llama model? Let me know if that works for you!
Hey, DeepInfra shut down their free service. You should try using the Llama provider instead.
Hey, try changing the port in your command. Make sure you quit the previously started instances first, then add --port=<a free port number> to your command.
Hey, did you get an error message in your terminal? Are you able to see You.com being opened in the web terminal?
This error message indicates that the browser cannot be opened. Potential causes include an already open browser or insufficient memory. Please verify if you are using Docker or if there...
I fixed the uvloop issue.