text-generation-webui
Option to inject current system time into context
Description
Idea obtained from #403. In chat mode, injecting the current system time before each of the bot's replies (and into the context) could potentially enable LLMs to gain time awareness and thus perform better in long-term dialogues. Although the model's memory is currently largely restricted by the 2048-token context size, it would still be an interesting (and easy-to-implement) feature. When implementing this feature, please make it optional and toggled by a button. When disabled, timestamps should still be stored in the persistent JSON, but should be ignored when building the context for inference.
Additional Context
Potential format of injected time:
[2023/04/13, 14:45:59], BotName: asdfasdf asfdasdf
[2023/04/13, 14:46:03], User: asdf asdf qer zxcv
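The toggle described above could be sketched roughly like this: timestamps are always kept with the history, and the flag only controls whether they are rendered into the prompt. The helper names (`build_context`, `inject_time`) are hypothetical, not the project's actual API.

```python
from datetime import datetime

def build_context(history, inject_time=True):
    """Render (timestamp, speaker, text) triples into a prompt string.

    Timestamps are always present in `history` (so they can be persisted
    in the JSON log); `inject_time=False` simply omits them from the prompt.
    """
    lines = []
    for ts, name, text in history:
        prefix = f"[{ts.strftime('%Y/%m/%d, %H:%M:%S')}], " if inject_time else ""
        lines.append(f"{prefix}{name}: {text}")
    return "\n".join(lines)
```

With the flag off, the same stored history renders as plain `Name: text` lines, so nothing is lost by disabling the feature.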
I played around with this idea, but instead of injecting the absolute time, I show how long ago each message was sent, which keeps the token count lower.
15m ago User: asdf
15m ago Bot: reply
50s ago User: test
...
now User: Hi
This works by using a message class that stores the creation time, together with the `humanize` library's `naturaldelta` function, to produce a "time ago" label that is recomputed each time the history is requested for generation.
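A minimal sketch of that approach, using only the standard library (the commenter used `humanize.naturaldelta` for the label; the `time_ago` helper below is a crude stand-in to avoid the dependency, and the class/function names are assumptions, not the actual code):

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class Message:
    name: str
    text: str
    created: datetime = field(default_factory=datetime.now)

def time_ago(created, now):
    # Coarse relative-time label; humanize.naturaldelta produces nicer
    # output ("15 minutes"), this just approximates the idea.
    secs = int((now - created).total_seconds())
    if secs < 5:
        return "now"
    if secs < 60:
        return f"{secs}s ago"
    if secs < 3600:
        return f"{secs // 60}m ago"
    return f"{secs // 3600}h ago"

def render_history(messages, now=None):
    # Recomputed on every generation request, so the deltas stay fresh.
    now = now or datetime.now()
    return "\n".join(
        f"{time_ago(m.created, now)} {m.name}: {m.text}" for m in messages
    )
```

Because only the rendered prompt changes between requests, the stored history itself never needs rewriting.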
I haven't seen much improvement in time awareness though, but that's likely because I'm using a 7B model most of the time. Asking "how long have we been talking?" often gives random answers. Maybe fine-tuning on data like that, or on the current time, would work better; maybe a mix of both.
I'm thinking of trying different formats, like how Discord shows time:
message
message
-- 14th April 2023 --
message
-- Yesterday --
...
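The Discord-style format above (a separator line only when the day changes, with "Yesterday" for the previous day) could be sketched like this; the function names are hypothetical, and `strftime` is used instead of Discord's ordinal dates ("14 April 2023" rather than "14th April 2023"):

```python
from datetime import date, timedelta

def date_label(d, today):
    # No separator for today's messages, "Yesterday" for the previous day,
    # otherwise a full date line.
    if d == today:
        return None
    if d == today - timedelta(days=1):
        return "-- Yesterday --"
    return f"-- {d.strftime('%d %B %Y')} --"

def render_with_separators(messages, today):
    # messages: (date, text) pairs in chronological order; emit a separator
    # line whenever the day changes.
    out, last = [], None
    for d, text in messages:
        if d != last:
            label = date_label(d, today)
            if label:
                out.append(label)
            last = d
        out.append(text)
    return "\n".join(out)
```

This keeps the per-message overhead at zero tokens within a day, paying only one separator line per day boundary.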
This issue has been closed due to inactivity for 30 days. If you believe it is still relevant, please leave a comment below.
I have implemented this using a grammar file and some fancy timestamp stuff, but either my code is wrong or she doesn't know how to do time conversions well...