Feat: EmbeddingModel for Backend
Is your feature request related to a problem? Please describe.
This is a follow-up to the backend work that added the ChatModel.
This issue is to continue that backend work and add the EmbeddingModel, similar to what is currently implemented in the TS framework (and following the Python ChatModel implementation), e.g.,
`EmbeddingModel.from_name('watsonx:text-embedding-3-small')`
`WatsonxEmbeddingModel('text-embedding-3-small')`
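For illustration only, a minimal self-contained sketch of the provider-prefixed factory pattern the two calls above imply (this is not the framework's actual implementation; everything beyond the `from_name` signature and the `provider:model` string format is an assumption):

```python
from dataclasses import dataclass

# Hypothetical registry mapping provider prefixes (e.g. "watsonx")
# to their concrete EmbeddingModel classes.
_PROVIDERS: dict[str, type["EmbeddingModel"]] = {}


@dataclass
class EmbeddingModel:
    model_id: str

    @classmethod
    def from_name(cls, name: str) -> "EmbeddingModel":
        # Split "provider:model" into its two parts and dispatch
        # to the registered provider class.
        provider, _, model_id = name.partition(":")
        return _PROVIDERS[provider](model_id)


class WatsonxEmbeddingModel(EmbeddingModel):
    pass


_PROVIDERS["watsonx"] = WatsonxEmbeddingModel

model = EmbeddingModel.from_name("watsonx:text-embedding-3-small")
print(type(model).__name__, model.model_id)
```

Both entry points end up constructing the same provider class, which is why the chat-side pattern should carry over directly.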
Providers
Add embedding support for all providers listed in https://i-am-bee.github.io/beeai-framework/#/python/backend?id=supported-providers. Implementations for the other providers should be based on the LiteLLMEmbeddingModel.
See the reference implementations for WatsonxEmbeddingModel and OllamaEmbeddingModel. Other providers should be implemented in a similar fashion (it is always best to check how it was done for the chat version of the given provider).
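As a hedged sketch of that pattern (class attributes, the model name, and the `litellm_model` helper are assumptions; only the idea of provider subclasses delegating to a shared LiteLLM-backed base comes from the issue):

```python
class LiteLLMEmbeddingModel:
    """Assumed shared base: holds the model id and builds the
    provider-qualified model string that LiteLLM routes on."""

    provider_id: str  # set by each provider subclass

    def __init__(self, model_id: str) -> None:
        self.model_id = model_id

    @property
    def litellm_model(self) -> str:
        # LiteLLM identifies backends with "provider/model" strings.
        return f"{self.provider_id}/{self.model_id}"


class OllamaEmbeddingModel(LiteLLMEmbeddingModel):
    # A new provider then only needs to declare its prefix
    # (plus any provider-specific settings).
    provider_id = "ollama"


m = OllamaEmbeddingModel("nomic-embed-text")
print(m.litellm_model)
```

Under this pattern, adding the remaining providers is mostly boilerplate: one small subclass per provider, mirroring its existing chat counterpart.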
To dos
- [x] Ollama @Tomas2D to review and merge #814
- [x] Watsonx @Tomas2D to review and merge #814
- [ ] Add remaining providers: @araujof to hand off to Yair
Blocking
- #771
Is there a reason this was unassigned from me? I have an open PR on this (#814) waiting for review and merge. (cc @Tomas2D)
We got that merged. Yair and the team will add the remaining providers. I also updated the scope.
All core providers were implemented. We can close.