
feature(dspy): added compatibility with ANY llamaindex LLM

Open FarisHijazi opened this issue 1 year ago • 4 comments

Hello, and thank you for the lovely tool!

I've been struggling to get it to work with different LLM providers like Anthropic, or with unorthodox LLMs that need the Hugging Face transformers modules. DSPy implements most of these but not all of them, while llama-index already has everything I've ever needed.

This code implements a wrapper around the llama_index library that emulates a DSPy LM.

It allows any llama_index LLM to be used in the DSPy framework, since DSPy itself has limited LLM support.

The code is a slightly modified copy of dspy/dsp/modules/azure_openai.py.

It works by creating a dummy OpenAI client that wraps any llama_index LLM object and implements llm.complete and llm.chat.

Tested with Python 3.12 and the following package versions:

dspy==0.1.4, dspy-ai==2.4.9, llama-index==0.10.35, llama-index-llms-openai==0.1.18
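To illustrate the idea, here is a minimal, self-contained sketch of the dummy-client approach. The `FakeLlamaIndexLLM` class below is a stand-in for any llama_index LLM (e.g. Anthropic) exposing a `complete` method; all class and method names here are illustrative, not DSPy's or llama_index's actual API.

```python
class FakeLlamaIndexLLM:
    """Stand-in for any llama_index-style LLM exposing complete()."""
    def complete(self, prompt: str):
        # Real llama_index LLMs return a response object with a .text field;
        # this stub just echoes the prompt back.
        return type("CompletionResponse", (), {"text": f"echo: {prompt}"})()

class DummyOpenAIClient:
    """Wraps a llama_index-style LLM behind an OpenAI-completions-shaped call,
    so code expecting an OpenAI-style {"choices": [...]} response keeps working."""
    def __init__(self, llm):
        self.llm = llm

    def create(self, prompt: str, n: int = 1, **kwargs):
        # Call the underlying LLM once per requested choice and repackage
        # the responses in the shape an OpenAI client would return.
        choices = [{"text": self.llm.complete(prompt).text} for _ in range(n)]
        return {"choices": choices, "usage": {}}

client = DummyOpenAIClient(FakeLlamaIndexLLM())
resp = client.create("hello", n=2)
print(resp["choices"][0]["text"])  # -> "echo: hello"
```

Because the wrapper only repackages responses, any provider already integrated into llama_index comes along for free, with no per-provider code in DSPy.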

FarisHijazi avatar Jul 01 '24 10:07 FarisHijazi

@jerryjliu I'm sure this could use a little polishing, but what are your thoughts on this?

FarisHijazi avatar Jul 01 '24 10:07 FarisHijazi

Hi @FarisHijazi, I left some comments here. I mainly want to clarify: does this PR support all LLM integrations with LlamaIndex as specified, or is it limited to OpenAI? It seems like one could just use DSPy.OpenAI instead.

Additionally, can you add the comments and example code to the documentation for this LM, as done for the other LMs in our documentation here?

arnavsinghvi11 avatar Jul 08 '24 20:07 arnavsinghvi11

This is dope! Passing it over internally to help take a look very soon.

jerryjliu avatar Jul 08 '24 20:07 jerryjliu

> Hi @FarisHijazi, I left some comments here. I mainly want to clarify: does this PR support all LLM integrations with LlamaIndex as specified, or is it limited to OpenAI? It seems like one could just use DSPy.OpenAI instead.
>
> Additionally, can you add the comments and example code to the documentation for this LM, as done for the other LMs in our documentation here?

I've responded to this in one of the comments: it supports all LlamaIndex LLMs, not just OpenAI.

I'll get to addressing the comments within a few days, hopefully. Mind the messy code; it was a quick hack. Now that I have your initial approvals I can invest time in polishing (I'm also waiting for @jerryjliu's feedback before I start coding).

FarisHijazi avatar Jul 08 '24 21:07 FarisHijazi

Hey guys, has anything happened with this? I just want to know if I should keep working on it, or if it's no longer needed / already implemented. @jerryjliu @arnavsinghvi11

FarisHijazi avatar Oct 09 '24 07:10 FarisHijazi

@okhat was this closed because the feature is no longer needed, or because it went stale? I'm still happy to work on it; I just need a bit of direction.

FarisHijazi avatar Jan 21 '25 17:01 FarisHijazi

@okhat @jerryjliu poke

Anything I can do to make this happen?

FarisHijazi avatar Feb 18 '25 21:02 FarisHijazi