Dynamic cache resize
In the ChatSampler, it would be good if users could resize the cache, like:
```python
sampler = gm.text.ChatSampler(cache_length=1024)
sampler.resize_cache(2048)
```
On Colab, when the cache fills up, this would make it possible to keep running prompts without having to restart from scratch.
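For illustration, a minimal sketch of what resizing could do internally, assuming a simplified KV-cache layout of `{"k": [batch, seq, heads, dim], "v": ...}` arrays (the real Gemma cache structure and the `resize_kv_cache` name are hypothetical here, not the library's actual API):

```python
import numpy as np


def resize_kv_cache(cache, new_length):
    """Grow (or shrink) a KV cache along its sequence axis.

    Assumes a simplified layout: a dict of arrays shaped
    [batch, seq, heads, dim]. Existing entries are preserved
    up to min(old_length, new_length); new slots are zeroed.
    """
    resized = {}
    for name, arr in cache.items():
        batch, old_len, heads, dim = arr.shape
        new = np.zeros((batch, new_length, heads, dim), dtype=arr.dtype)
        keep = min(old_len, new_length)
        new[:, :keep] = arr[:, :keep]  # carry over the existing state
        resized[name] = new
    return resized
```

Growing in place like this would let the sampler keep the conversation state, rather than reinstantiating and losing the chat history.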
If this isn't resolved yet, can I work on it?
Hey @Conchylicultor, I've noticed that the docs folders are empty as of now. Would it help if I coded out example notebooks for the Finetuning and LoRA sections? I'm really keen on contributing and didn't know where else to contact you. Thanks!
I’d like to work on this issue, @Conchylicultor . Could you assign it to me?
Before proceeding, I have a question: when changing the cache length, do we need to preserve the current state of the chat, or can we simply reinstantiate the sampler?
Hi @Conchylicultor, adding an auto_resize option that grows the cache when it's full could make things more convenient for users. Happy to help if you'd like!
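A rough sketch of how such an auto-grow policy could work, doubling the cache length whenever the next prompt wouldn't fit (the class name, the `cache_length`/`used` fields, and `ensure_capacity` are all hypothetical, not the actual ChatSampler internals):

```python
class AutoGrowCache:
    """Toy model of a cache that doubles its capacity on demand."""

    def __init__(self, cache_length):
        self.cache_length = cache_length  # current capacity in tokens
        self.used = 0                     # tokens already consumed

    def ensure_capacity(self, extra_tokens):
        """Double the cache length until `used + extra_tokens` fits.

        Doubling gives amortized O(1) growth, like a dynamic array,
        instead of resizing on every prompt.
        """
        needed = self.used + extra_tokens
        while self.cache_length < needed:
            self.cache_length *= 2
        return self.cache_length
```

For example, a cache of length 1024 with 1000 tokens used would jump to 2048 when a 100-token prompt arrives.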
Hi @staru09 , @amritanshuvivek , @theprashasst , @Vidhu-sri ,
Thank you so much to all of you for your interest in contributing to the Gemma open source models. I'm really glad to see all your enthusiasm for contributing. The projects related to the Gemma models are open source, and you can keep contributing to take the models to the next level.
However, there is already a PR open for the above issue. You are all welcome to contribute to the Gemma models; if you feel anything needs to be improved, please feel free to open a PR. Your continued interest is really appreciated.
Thanks.