mem0
Add support for Anyscale endpoints
🚀 The feature
Requested by a user on Discord:
- Discord link: https://discord.com/channels/1121119078191480945/1125758905310519327/1195257282821369867
- Anyscale blog: https://www.anyscale.com/blog/anyscale-endpoints-json-mode-function-calling-new-models-llama-guard-and-mistral-7b-openorca
Motivation, pitch
Anyscale is now used by many of our users, and we would like to support them.
Thanks for creating this issue! Can support for pplx-api also be added?
The implementation should be straightforward, given that both Anyscale and pplx-api expose OpenAI-compatible APIs.
Can't this be done through the Custom Endpoint Config? I tried with this:

```yaml
llm:
  provider: huggingface
  config:
    endpoint: https://api.endpoints.anyscale.com/v1
    model_kwargs:
      model: mistralai/Mixtral-8x7B-Instruct-v0.1
```
Starting the app with:

```python
app = App.from_config("anyscale.yaml")
```

gives me the error:
```
Exception: Error occurred while validating the config. Error: Key 'llm' error:
Key 'config' error:
Wrong key 'model_kwargs' in {'endpoint': 'https://api.endpoints.anyscale.com/v1', 'model_kwargs': {'model': 'mistralai/Mixtral-8x7B-Instruct-v0.1'}}
```
The provider should be "openai", since Anyscale offers OpenAI-compatible endpoints.
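A minimal sketch of that approach, assuming the `openai` provider accepts a custom endpoint override (the exact key name is an assumption here; depending on your version it may be `base_url`, `api_base`, or `endpoint`, so check the config reference for your release):

```yaml
llm:
  provider: openai
  config:
    # Assumed key for pointing the OpenAI-compatible client at Anyscale;
    # verify the exact key name against your installed version.
    base_url: https://api.endpoints.anyscale.com/v1
    model: mistralai/Mixtral-8x7B-Instruct-v0.1
```

With the API key environment variable set to an Anyscale key, the same `App.from_config("anyscale.yaml")` call should then route requests to Anyscale instead of OpenAI.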