
Support chunking files

[Open] krisquigley opened this issue 1 year ago · 3 comments

Fantastic work so far guys!

I'm currently trying to pass it a template for document generation but I am getting the following response:

openai.error.InvalidRequestError: This model's maximum context length is 4097 tokens. However, your messages resulted in 59431 tokens. Please reduce the length of the messages.

Do you plan on supporting longer files? If so, I can have a go at creating a PR if you could point me in the right direction? (I mostly use JS and Ruby, my Python is a bit rusty)

krisquigley avatar Apr 19 '23 17:04 krisquigley

Hey @krisquigley, thanks for raising the issue. We're looking into this, can you tell us more about how you got this response so we can reproduce it on our end?

FayazRahman avatar Apr 19 '23 19:04 FayazRahman

Sure thing, @FayazRahman. Thanks for the quick response!

  1. Set up an agent to generate a document based on a template:

     agent.name = "RubyContractorGPT"
     agent.description = (
         "an AI assistant that produces contracts for Ruby Developers from a template"
     )
     agent.goals = [
         "Write a contract for a Ruby Developer",
     ]

  2. At some point the agent will try to read from a file on its own, or you can give it feedback telling it to read one immediately: read_from_file, Args: {'file': './training-data/short-form-single-company.rtf'}

  3. Ensure the file is over 4097 tokens and you should see an error along the lines of:

openai.error.InvalidRequestError: This model's maximum context length is 4097 tokens. However, your messages resulted in 59431 tokens. Please reduce the length of the messages.

krisquigley avatar Apr 19 '23 19:04 krisquigley

Thanks for the update @krisquigley! We will implement chunking soon.

FayazRahman avatar Apr 20 '23 19:04 FayazRahman
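
For reference, the chunking mentioned above could look something like the sketch below. This is not the maintainers' implementation; `chunk_text` and its parameters are hypothetical. The idea is to split the file into overlapping pieces that each stay comfortably under the model's 4097-token context limit (approximating tokens as ~4 characters each), so the agent can process the file piece by piece instead of sending it all at once:

```python
def chunk_text(text: str, max_chars: int = 12000, overlap: int = 500) -> list[str]:
    """Split text into overlapping chunks under a character budget.

    Roughly 4 characters per token, so 12000 chars is about 3000 tokens,
    leaving headroom below the 4097-token context limit. The overlap
    preserves some context across chunk boundaries.
    """
    chunks = []
    start = 0
    while start < len(text):
        end = min(start + max_chars, len(text))
        chunks.append(text[start:end])
        if end == len(text):
            break
        # Step back by `overlap` so adjacent chunks share a margin of text.
        start = end - overlap
    return chunks
```

A token-aware version would count real tokens with a tokenizer such as tiktoken instead of estimating by character count, but the splitting logic stays the same.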