[Feature]: add litellm embedding to langchain
The Feature
I noticed that LangChain has a LiteLLM backend for chat models but not for embeddings.
Motivation, pitch
I'm developing DocToolsLLM and would like to add support for MistralAI instead of OpenAI. But I would preferably use LiteLLM for maximum user customizability (switching the LLM model with a simple argument change).
Also, MistralAI does not yet have embeddings support in LangChain, and I would prefer to use Mistral via LiteLLM rather than directly.
Bump: I would still love this feature.
Hi @thiswillbeyourgithub - thanks for being an active user / contributor to LiteLLM. I'd love to get on a call and learn how we can improve LiteLLM for you and prioritize your issues. What's the best email to send an invite to?
Here's my cal if that's easier: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
My linkedin if you want to DM : https://www.linkedin.com/in/reffajnaahsi/
Done
Any updates on this? @thiswillbeyourgithub / @ishaan-jaff
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs.
I ended up doing my own litellm embeddings class and it seems to be working fine: https://github.com/thiswillbeyourgithub/wdoc/blob/main/wdoc/utils/customs/litellm_embeddings.py
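For anyone landing here, a custom class like that can be sketched roughly as follows. This is a minimal duck-typed sketch, not the actual wdoc implementation: it assumes the LiteLLM-style model name `"mistral/mistral-embed"` and OpenAI-style dict access on the response (`response["data"][i]["embedding"]`), and the `embed_fn` hook is an illustrative addition to allow offline testing with a stub.

```python
from typing import Callable, List, Optional


class LiteLLMEmbeddings:
    """Minimal embeddings wrapper around litellm.embedding().

    Duck-types LangChain's Embeddings interface (embed_documents /
    embed_query), so instances can be passed anywhere LangChain
    expects an embeddings object.
    """

    def __init__(
        self,
        model: str = "mistral/mistral-embed",  # assumed LiteLLM model name
        embed_fn: Optional[Callable] = None,
    ):
        self.model = model
        if embed_fn is None:
            # Lazy import so the class can be unit-tested with a stub
            # without litellm installed.
            import litellm

            embed_fn = litellm.embedding
        self._embed_fn = embed_fn

    def embed_documents(self, texts: List[str]) -> List[List[float]]:
        response = self._embed_fn(model=self.model, input=texts)
        # Assumes an OpenAI-style payload: {"data": [{"embedding": [...]}, ...]}
        return [item["embedding"] for item in response["data"]]

    def embed_query(self, text: str) -> List[float]:
        return self.embed_documents([text])[0]


# Offline usage example with a stub in place of the real API call:
def _fake_embed(model: str, input: List[str]) -> dict:
    return {"data": [{"embedding": [0.0, 1.0, 2.0]} for _ in input]}


emb = LiteLLMEmbeddings(embed_fn=_fake_embed)
vectors = emb.embed_documents(["hello", "world"])
```

Because LangChain only calls `embed_documents` and `embed_query`, subclassing is not strictly required; the real class in wdoc handles more details (batching, caching, error handling).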