
AttributeError: 'ChatGPTLLMPredictor' object has no attribute '_llm'

Open timurka opened this issue 3 years ago • 8 comments

Hi! I get this error: "AttributeError: 'ChatGPTLLMPredictor' object has no attribute '_llm'". What is the problem?

My test code:

# My OpenAI Key
import os
os.environ['OPENAI_API_KEY'] = ""

from llama_index import GPTSimpleVectorIndex, SimpleDirectoryReader, LLMPredictor, GPTTreeIndex
from langchain import OpenAI
from IPython.display import Markdown, display
from llama_index.langchain_helpers.chatgpt import ChatGPTLLMPredictor

# define the LLM
llm_predictor = ChatGPTLLMPredictor(
    prepend_messages=[{"role": "system", "content": "You are a lawyer."}],
    model_name="gpt-3.5-turbo",
    # temperature=0,
    max_tokens=4096,
)

documents = SimpleDirectoryReader('/content/test').load_data()
index = GPTSimpleVectorIndex(documents, llm_predictor=llm_predictor)

# save to disk
index.save_to_disk('index.json')

# load from disk
index = GPTSimpleVectorIndex.load_from_disk('index.json', llm_predictor=llm_predictor)

index.query("Give me main idea")

timurka avatar Mar 12 '23 06:03 timurka

I have the same problem.

K-D-K-6 avatar Mar 12 '23 15:03 K-D-K-6

Which langchain version are you on? Try updating langchain to at least langchain==0.0.105.

Kav-K avatar Mar 13 '23 00:03 Kav-K
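A quick way to check which langchain version you actually have before upgrading is to query package metadata from the standard library. This is a stdlib-only sketch; the 0.0.105 threshold is just the one suggested in the comment above, and the version comparison is deliberately naive (it only looks at leading numeric components):

```python
from importlib.metadata import version, PackageNotFoundError

def langchain_version_report(minimum: str = "0.0.105") -> str:
    """Report the installed langchain version against a minimum."""
    try:
        installed = version("langchain")
    except PackageNotFoundError:
        return "langchain is not installed"

    def as_tuple(v: str):
        # Keep only leading purely-numeric dot-separated parts,
        # which is enough for langchain's 0.0.x scheme.
        parts = []
        for p in v.split("."):
            if p.isdigit():
                parts.append(int(p))
            else:
                break
        return tuple(parts)

    if as_tuple(installed) < as_tuple(minimum):
        return (f"langchain {installed} is older than {minimum}; "
                f"run: pip install -U langchain")
    return f"langchain {installed} meets the minimum ({minimum})"

print(langchain_version_report())
```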

I've got the same problem and am on langchain==0.0.106.

handrew avatar Mar 13 '23 01:03 handrew

Hi, sorry about this - will push out a fix soon!

In the meantime, the "official" way of adding ChatGPT support is to use langchain's LLM wrapper: https://github.com/jerryjliu/llama_index/blob/main/examples/vector_indices/SimpleIndexDemo-ChatGPT.ipynb

The ChatGPTLLMPredictor is deprecated anyway.

jerryjliu avatar Mar 13 '23 06:03 jerryjliu

Hi @jerryjliu, if ChatGPTLLMPredictor is being deprecated, how can we customize the ChatGPT prompt using langchain's LLM wrapper? Could you give a simple demo? Thanks!

madawei2699 avatar Mar 13 '23 11:03 madawei2699

> Hi @jerryjliu, if ChatGPTLLMPredictor is being deprecated, how can we customize the ChatGPT prompt using langchain's LLM wrapper? Could you give a simple demo? Thanks!

I have the same problem! @jerryjliu

WangRongsheng avatar Mar 14 '23 04:03 WangRongsheng

Yes! Take a look at CHAT_REFINE_PROMPT in gpt_index/prompts/chat_prompts.py as an example. It is used by default when a ChatGPT model is specified, and it should give you a sense of how to define a custom refine prompt.

jerryjliu avatar Mar 14 '23 04:03 jerryjliu
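The pattern behind a chat-style refine prompt can be illustrated without any library at all: it is a list of role-tagged messages with template variables for the existing answer and the new context. Below is a stdlib-only sketch; the placeholder names existing_answer and context_msg mirror the ones used in llama_index's refine templates, but everything else (the message wording, the helper function) is purely illustrative:

```python
# A chat-style refine template: role-tagged messages with placeholders.
CUSTOM_REFINE_MESSAGES = [
    {"role": "system", "content": "You are a lawyer."},
    {
        "role": "user",
        "content": (
            "We have an existing answer: {existing_answer}\n"
            "Refine it (only if needed) with this new context:\n"
            "{context_msg}"
        ),
    },
]

def format_messages(template, **kwargs):
    """Fill each message's placeholders, leaving roles untouched."""
    return [
        {"role": m["role"], "content": m["content"].format(**kwargs)}
        for m in template
    ]

messages = format_messages(
    CUSTOM_REFINE_MESSAGES,
    existing_answer="The contract is binding.",
    context_msg="Clause 7 allows termination with 30 days' notice.",
)
```

The resulting list is the shape a chat-completion endpoint expects; a real refine prompt would be passed to the index machinery rather than formatted by hand like this.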

@jerryjliu @madawei2699 @timurka You can take a look at the chatgpt_refine_prompt examples: https://github.com/jerryjliu/llama_index/pull/733

WangRongsheng avatar Mar 14 '23 05:03 WangRongsheng

Hey @jerryjliu, thanks for providing the LLMPredictor example. Is it still possible to pass the prepend_messages parameter (to supply extra context) when using LLMPredictor? I don't see an equivalent parameter at the moment.

Thanks!

ctle-vn avatar Mar 20 '23 00:03 ctle-vn
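For reference, the effect prepend_messages had can be reproduced by hand: fixed context messages are placed ahead of the user message before the payload goes to the chat endpoint. A stdlib-only sketch (no llama_index API is assumed here; whether the langchain wrapper exposes an equivalent knob is exactly the open question above, though langchain's OpenAIChat wrapper had a prefix_messages parameter that served a similar purpose, worth checking against the langchain version in use):

```python
def with_prepended(prepend, user_content):
    """Build a chat payload with fixed context messages first."""
    return list(prepend) + [{"role": "user", "content": user_content}]

PREPEND = [{"role": "system", "content": "You are a lawyer."}]
payload = with_prepended(PREPEND, "Give me the main idea")
# payload is ready to be sent as the `messages` field of a
# chat-completion request.
```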