
restarting chat with history

Open eladrave opened this issue 1 year ago • 0 comments

If you ever save the chat history to a DB, it eventually grows too large: the prompt will exceed the model's token limit, or the request will simply fail.
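Until embeddings are wired in, a common stopgap is to keep only the most recent messages that fit a token budget. A minimal sketch (the function name is hypothetical, and token count is approximated by whitespace-split words; a real implementation would use a tokenizer such as tiktoken):

```python
def trim_history(messages, max_tokens=3000):
    """Keep the most recent messages whose rough token total fits the budget.

    `messages` is a list of {"role": ..., "content": ...} dicts.
    Token count is approximated by word count (assumption; swap in a
    real tokenizer for production use).
    """
    kept, total = [], 0
    # Walk from newest to oldest, stopping once the budget is exhausted
    for msg in reversed(messages):
        cost = len(msg["content"].split())
        if total + cost > max_tokens:
            break
        kept.append(msg)
        total += cost
    # Restore chronological order
    return list(reversed(kept))

history = [{"role": "user", "content": "word " * 10}] * 5
print(len(trim_history(history, max_tokens=25)))  # → 2
```

The obvious drawback, as noted above, is that older context is lost entirely, which is what the embedding approach below avoids.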

Another option is to use embeddings.

I was researching this and found https://llamahub.ai/, which essentially takes data from a source and creates embeddings for it. Here is a simple snippet of what I did:

from llama_index import download_loader, GPTSimpleVectorIndex
import os
from pathlib import Path

os.environ["OPENAI_API_KEY"] = 'MY OPEN AI KEY'

# Download the LlamaHub PDF loader and read the file into Document objects
PDFReader = download_loader("PDFReader")
loader = PDFReader()
documents = loader.load_data(file=Path('/Users/eladrave/Downloads/datafile.pdf'))

# Build the vector index (this calls the OpenAI embeddings API) and persist it
index = GPTSimpleVectorIndex(documents)
index.save_to_disk("documents.json")

response = index.query("What are terms?")
print(response)

(This just reads a PDF.)

We can save chat.get_conversation() to a text file when a session "ends" (timeout, etc.) and then index it like this:

from llama_index import download_loader, GPTSimpleVectorIndex
import os
from pathlib import Path

os.environ["OPENAI_API_KEY"] = 'MY OPEN AI KEY'

# UnstructuredReader handles plain-text files such as a saved conversation
UnstructuredReader = download_loader("UnstructuredReader")
loader = UnstructuredReader()
documents = loader.load_data(file=Path('memfilewithconversation.txt'))

index = GPTSimpleVectorIndex(documents)

# Retrieve the parts of past conversations relevant to the query
response = index.query("What are terms?")
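The step that writes the conversation to memfilewithconversation.txt could look something like this. I'm assuming chat.get_conversation() returns a list of role/content dicts; the helper name is hypothetical and should be adapted to the real return type:

```python
import tempfile
from pathlib import Path

def dump_conversation(conversation, path):
    """Write a finished conversation to a plain-text file so the
    UnstructuredReader snippet above can index it.

    `conversation` is assumed to be a list of {"role": ..., "content": ...}
    dicts, roughly what chat.get_conversation() would return (assumption).
    """
    lines = [f"{m['role']}: {m['content']}" for m in conversation]
    Path(path).write_text("\n".join(lines), encoding="utf-8")

convo = [
    {"role": "user", "content": "What are the terms?"},
    {"role": "assistant", "content": "Net 30, payable monthly."},
]
out = Path(tempfile.gettempdir()) / "memfilewithconversation.txt"
dump_conversation(convo, out)
print(out.read_text(encoding="utf-8"))
```

One role-prefixed line per message keeps the file readable by UnstructuredReader while preserving who said what.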

And then when you send a prompt to the chat model, you wrap the retrieved text in something like:

I will ask you questions based on the following context: — Start of Context —

The_Response_from_above

— End of Context — My question is: “How much wood would a woodchuck chew?”
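Assembling that template is just string formatting. A minimal sketch (function name hypothetical; the exact delimiter characters are arbitrary):

```python
def build_prompt(context, question):
    """Wrap retrieved index output and the user's question into the
    context-prompt template described above."""
    return (
        "I will ask you questions based on the following context:\n"
        "--- Start of Context ---\n"
        f"{context}\n"
        "--- End of Context ---\n"
        f'My question is: "{question}"'
    )

prompt = build_prompt(
    "Terms are net 30.",  # stand-in for the index.query() response above
    "How much wood would a woodchuck chew?",
)
print(prompt)
```

The resulting string would then be sent as the user message in the normal completion call, so the model answers from the retrieved conversation history rather than the full raw log.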

eladrave avatar Mar 12 '23 03:03 eladrave