
Real-life use

StrangeTcy opened this issue 2 years ago · 1 comment

I've tested this approach on a single-language (English) LLaMA, and it worked well, except that:

  1. it didn't get the LinkedIn layoff answer right
  2. it didn't output any spaces between words

But what I wonder about is real-world use: when you ask an LLM a question, you don't normally supply the context along with it. Is there a way to provide it anyway? Also, is there a specific fine-tuning procedure that would make the model better at using this approach?

StrangeTcy avatar Jul 14 '23 13:07 StrangeTcy

You can use the nearby text as the query and divide the distant text into multiple shorter contexts with a sliding window.
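A minimal sketch of the sliding-window splitting described above (the function name and the character-based chunking are my own assumptions; in practice you would split on tokens, and NBCE would then combine the model's predictions over the resulting contexts):

```python
def split_into_contexts(text, window_size=512, stride=256):
    """Split long distant text into overlapping shorter contexts
    via a sliding window. Character-based for simplicity; a
    token-based split works the same way."""
    contexts = []
    for start in range(0, max(len(text) - window_size, 0) + 1, stride):
        contexts.append(text[start:start + window_size])
    # Ensure the tail of the text is covered when the stride
    # does not land exactly on the end.
    if len(text) > window_size and contexts[-1] != text[-window_size:]:
        contexts.append(text[-window_size:])
    return contexts or [text]


# Example: each chunk overlaps its neighbor by (window_size - stride).
chunks = split_into_contexts("abcdefghij", window_size=4, stride=2)
print(chunks)  # ['abcd', 'cdef', 'efgh', 'ghij']
```

Each chunk would then be prepended to the query independently, and the per-chunk outputs merged by the NBCE combination rule.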

bojone avatar Aug 10 '23 03:08 bojone