AttributeError: 'ChatGPTLLMPredictor' object has no attribute '_llm'
Hi! I get this error: "AttributeError: 'ChatGPTLLMPredictor' object has no attribute '_llm'". What is the problem?
My test code:

```python
# My OpenAI key
import os
os.environ['OPENAI_API_KEY'] = ""

from llama_index import GPTSimpleVectorIndex, SimpleDirectoryReader, LLMPredictor, GPTTreeIndex
from langchain import OpenAI
from IPython.display import Markdown, display
from llama_index.langchain_helpers.chatgpt import ChatGPTLLMPredictor

# define LLM
llm_predictor = ChatGPTLLMPredictor(
    prepend_messages=[{"role": "system", "content": "You are a lawyer."}],
    model_name="gpt-3.5-turbo",
    # temperature=0,
    max_tokens=4096,
)

documents = SimpleDirectoryReader('/content/test').load_data()
index = GPTSimpleVectorIndex(documents, llm_predictor=llm_predictor)

# save to disk
index.save_to_disk('index.json')
# load from disk
index = GPTSimpleVectorIndex.load_from_disk('index.json', llm_predictor=llm_predictor)

index.query("Give me main idea")
```
I have the same problem.
What langchain version are you on? Try updating to at least langchain==0.0.105.
I've got the same problem and am on langchain==0.0.106.
Hi, sorry about this - will push out a fix soon!
In the meantime, the "official" way of adding ChatGPT is to use langchain's LLM wrapper: https://github.com/jerryjliu/llama_index/blob/main/examples/vector_indices/SimpleIndexDemo-ChatGPT.ipynb
The ChatGPTLLMPredictor is deprecated anyway.
Hi @jerryjliu If the ChatGPTLLMPredictor will be deprecated, how can we customize the ChatGPT prompt using langchain's LLM wrapper? Could you give a simple demo to show? Thanks!
I have the same problem! @jerryjliu
Yes! Take a look at CHAT_REFINE_PROMPT in gpt_index/prompts/chat_prompts.py as an example. It is used by default when a ChatGPT model is specified, and it should give you a sense of how to define a custom refine prompt.
@jerryjliu @madawei2699 @timurka Maybe you can take a look at the chatgpt_refine_prompt examples: https://github.com/jerryjliu/llama_index/pull/733
Hey @jerryjliu, thanks for providing the LLMPredictor example. Is it still possible to pass the prepend_messages parameter to provide more context when using LLMPredictor? I don't see an equivalent parameter as of now.
Thanks!