LibreChat
Enhancement: Select Vision Model from Client or Config file for Custom Endpoint
What happened?
Hello everyone,
I have connected the gemini-pro-vision model via openrouter.ai, but I always get the following error message within LibreChat. I've tested it with different images and file types (png, jpg, ...).
Something went wrong. Here's the specific error message we encountered: Error: { "error": { "code": 400, "message": "Provided image is not valid.", "status": "INVALID_ARGUMENT" } }
Am I doing something wrong, do I need to set an option?
Thanks for your help!
Steps to Reproduce
- Create openrouter.ai account
- Get API key
- Edit librechat.yaml
- Restart the Docker container
- Test gemini-pro-vision
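For reference, the `librechat.yaml` custom-endpoint block for OpenRouter looks roughly like the sketch below. Treat this as illustrative: the model ID and field values are assumptions based on LibreChat's documented custom-endpoint schema, so verify them against the official configuration docs.

```yaml
version: 1.0.5
endpoints:
  custom:
    - name: "OpenRouter"
      # Reads the key from the OPENROUTER_KEY environment variable
      apiKey: "${OPENROUTER_KEY}"
      baseURL: "https://openrouter.ai/api/v1"
      models:
        # OpenRouter model IDs are prefixed with the provider name
        default: ["google/gemini-pro-vision"]
        fetch: true
      titleConvo: true
      titleModel: "gpt-3.5-turbo"
      modelDisplayLabel: "OpenRouter"
```

After editing the file, restart the container so the configuration is reloaded.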
What browsers are you seeing the problem on?
Firefox
Relevant log output
2024-01-25 14:17:52 error: [MeiliMongooseModel.findOneAndUpdate] Convo not found in MeiliSearch and will index b716275d-6e10-4a5b-a4ff-8a5a7a7b20d0 Document `b716275d-6e10-4a5b-a4ff-8a5a7a7b20d0` not found.
2024-01-25 14:17:54 warn: [OpenAIClient.chatCompletion][stream] API error
2024-01-25 14:17:54 warn: [OpenAIClient.chatCompletion][finalChatCompletion] API error
2024-01-25 14:17:54 error: [OpenAIClient.chatCompletion] Unhandled error type Error: {
"error": {
"code": 400,
"message": "Provided image is not valid.",
"status": "INVALID_ARGUMENT"
}
}
2024-01-25 14:17:54 error: [handleAbortError] AI response error; aborting request: Error: {
"error": {
"code": 400,
"message": "Provided image is not valid.",
"status": "INVALID_ARGUMENT"
}
}
Screenshots
Code of Conduct
- [X] I agree to follow this project's Code of Conduct
Thanks for your report. I'll have to test to be sure, but this is likely because gpt-4-vision is being prioritized regardless of selecting Gemini, since the custom endpoint follows OpenAI specs, possibly on top of some other incompatibility.
I'm using this issue to track the underlying need: users would benefit from being able to select the vision model outright.
For now, I also recommend using the Google endpoint, as vision is fully supported for Gemini there. If you are region-locked, you could use a VPN to access it. https://docs.librechat.ai/install/configuration/ai_setup.html#generative-language-api-gemini
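For the workaround, enabling the Google endpoint comes down to providing a Gemini API key in `.env`. A minimal sketch, assuming the `GOOGLE_KEY` variable name from LibreChat's example env file (verify against the linked setup docs):

```
# .env — Gemini API key for the Google endpoint
GOOGLE_KEY=your_gemini_api_key
```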
As a workaround, I'm now using the Google endpoint with a VPN, and it's working there.