feature(dspy): added compatibility with ANY llamaindex LLM
Hello, and thank you for the lovely tool!

I've been struggling to get it working with different LLM providers like Anthropic, as well as some unorthodox LLMs that need the Hugging Face `transformers` modules. DSPy implements most of these but not all, while llama-index already has everything I've ever needed implemented.

This code implements a wrapper around the `llama_index` library to emulate a DSPy LM, so that any llama-index LLM can be used inside the DSPy framework, which currently has limited LLM support.
This code is a slightly modified copy of `dspy/dsp/modules/azure_openai.py`. It works by creating a dummy OpenAI client that wraps any llama-index LLM object and implements `llm.complete` and `llm.chat`.
Tested with Python 3.12 and:
`dspy==0.1.4 dspy-ai==2.4.9 llama-index==0.10.35 llama-index-llms-openai==0.1.18`
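To illustrate the idea, here's a minimal, self-contained sketch of the adapter pattern the PR describes: a DSPy-style callable that delegates to any object exposing llama-index's `complete(prompt)` interface (which returns a response with a `.text` attribute). The class and method names below are illustrative only, not the PR's actual code, and the `EchoLLM` stand-in is a hypothetical dummy for demonstration.

```python
from types import SimpleNamespace


class LlamaIndexLM:
    """Illustrative adapter: a DSPy-style LM wrapping a llama-index-style LLM.

    Assumes the wrapped object exposes `complete(prompt)` returning a
    response with a `.text` attribute, as llama-index LLMs do.
    """

    def __init__(self, llm):
        self.llm = llm      # any llama-index LLM instance
        self.history = []   # DSPy LMs keep a prompt/response history

    def basic_request(self, prompt, **kwargs):
        response = self.llm.complete(prompt, **kwargs)
        self.history.append({"prompt": prompt, "response": response.text})
        return response.text

    def __call__(self, prompt, **kwargs):
        # DSPy expects a list of completion strings
        return [self.basic_request(prompt, **kwargs)]


class EchoLLM:
    """Hypothetical stand-in for a llama-index LLM, for demonstration only."""

    def complete(self, prompt, **kwargs):
        return SimpleNamespace(text=f"echo: {prompt}")


lm = LlamaIndexLM(EchoLLM())
print(lm("hello"))  # → ['echo: hello']
```

In the real PR, `self.llm` would be any concrete llama-index LLM (Anthropic, a local Hugging Face model, etc.), so the provider choice is entirely delegated to llama-index.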
@jerryjliu I'm sure this could use a little polishing, but what are your thoughts on this?
Hi @FarisHijazi, I left some comments here. I mainly want to clarify whether this PR supports all LLM integrations with LlamaIndex as specified, or is limited to OpenAI only? It seems like one could just use `DSPy.OpenAI` instead?

Additionally, can you add the comments and example code to the documentation for this LM, as done for the other LMs in our documentation here?
This is dope! Passing it over internally to help take a look very soon!
I've responded to this in one of the comments: it supports all LlamaIndex LLMs, not just OpenAI.

I'll get onto addressing the comments within a few days, hopefully. Mind the messy code, it was a quick hack, but now that I have your initial approvals I can invest time in polishing (also waiting for @jerryjliu's feedback before I start coding).
Hey, has anything happened with this? I just want to know whether I should keep working on it, or if it's no longer needed/already implemented. @jerryjliu @arnavsinghvi11
@okhat was this closed because the feature is no longer needed, or because it went stale? I'm still happy to work on it, I just need a bit of direction.
@okhat @jerryjliu poke
Anything I can do to make this happen?