NBCE
Real-life use
I've tested this approach on a single-language (English) LLaMA, and it worked well, except:
- it didn't get the LinkedIn layoff answer right
- it didn't output any spaces between words
But what I wonder about is real-life use: when you ask an LLM a question, you don't normally supply the context alongside it. Is there a way to provide it anyway? Also, is there a specific finetuning procedure that would make the model better at using this approach?
You can use the nearby text as a query and divide the distant text into multiple shorter contexts through a sliding window.
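To illustrate, here is a minimal sketch of that idea: split the long text into overlapping windows, score the next token under each window plus the query, and combine the per-window distributions with naive-Bayes pooling. The function names, the `beta` value, and the min-entropy choice of window are illustrative assumptions, not the exact implementation from the post.

```python
import numpy as np

def sliding_window_contexts(tokens, window_size, stride):
    """Split a long token list into overlapping shorter contexts."""
    if len(tokens) <= window_size:
        return [tokens]
    return [tokens[i:i + window_size]
            for i in range(0, len(tokens) - window_size + 1, stride)]

def nbce_combine(context_logprobs, prior_logprobs, beta=0.25):
    """Naive-Bayes pooling of per-context next-token log-probs (sketch).

    context_logprobs: (n_contexts, vocab) log p(token | context_i, query)
    prior_logprobs:   (vocab,)            log p(token | query alone)

    Picks the most confident (lowest-entropy) context, then down-weights
    the query-only prior; beta and the pooling rule are assumptions here.
    """
    # entropy of each context's predictive distribution
    entropies = -(np.exp(context_logprobs) * context_logprobs).sum(axis=-1)
    best = context_logprobs[np.argmin(entropies)]
    scores = (1 + beta) * best - beta * prior_logprobs
    # renormalize back to a proper log-distribution
    return scores - np.logaddexp.reduce(scores)
```

At each decoding step you would feed every window (plus the query) through the model, pool the resulting logits with `nbce_combine`, sample one token, and append it to the query before the next step.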