New Feature: Adding watsonx.ai LLM Platform support
## Pull Request Type
- [x] ✨ feat
- [ ] 🐛 fix
- [ ] ♻️ refactor
- [ ] 💄 style
- [ ] 🔨 chore
- [ ] 📝 docs
## What is in this change?
Adds watsonx.ai support as an LLM platform.
## Additional Information
A new LLM backend supporting a variety of LLMs is added. Running with watsonx.ai requires the following environment variables:
- WATSONX_AI_ENDPOINT
- WATSONX_AI_APIKEY
- WATSONX_AI_PROJECT_ID
- WATSONX_AI_MODEL (e.g. meta-llama/llama-2-70b-chat)
- WATSONX_EMBEDDING_MODEL_PREF (e.g. baai/bge-large-en-v1)

In addition, this change introduces AI Guardrails for both user input and LLM output.
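As a minimal sketch, a `.env` entry using the variables above might look like the following; the endpoint and credential values are placeholders, not real settings:

```shell
# watsonx.ai LLM backend (placeholder values — substitute your own)
WATSONX_AI_ENDPOINT="https://us-south.ml.cloud.ibm.com"
WATSONX_AI_APIKEY="your-ibm-cloud-api-key"
WATSONX_AI_PROJECT_ID="your-watsonx-project-id"
WATSONX_AI_MODEL="meta-llama/llama-2-70b-chat"
WATSONX_EMBEDDING_MODEL_PREF="baai/bge-large-en-v1"
```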
## Developer Validations
- [x] I ran `yarn lint` from the root of the repo & committed changes
- [x] Relevant documentation has been updated
- [x] I have tested my code functionality
- [x] Docker build succeeds locally