Support for Gemma 7B
Hi!
Are you considering adding support for Gemma 7B? It seems that it would be a great addition to the set of available models.
We're definitely interested in adding more models!
As I understand it, the 2B and 7B architectures are roughly the same (just different sizes for the parameters). If you'd be interested in adding the model yourself, we'd gladly stamp it :) It should be fairly simple, just adding:
- A new `gemma_7b` function in the `gemma/_model_builders.py` file with the appropriate sizes
- A screenshot or attached W&B log showing that the model learns from a simple alpaca fine-tuning
- An update to our models docs to show we now support it!
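For context, the first item might look something like the sketch below. The shared `gemma` component builder is stubbed out here so the example runs standalone; in the actual `gemma/_model_builders.py` the new function would call the same builder the existing 2B one uses, just with larger sizes. The hyperparameter names and values shown are illustrative assumptions, not the project's confirmed config.

```python
def gemma(**model_args):
    # Stand-in for the shared Gemma component builder. In the real file
    # this would construct and return the actual transformer decoder;
    # here it just echoes the config so the sketch is self-contained.
    return model_args


def gemma_7b():
    """Builder for a Gemma 7B-sized model (sizes are illustrative)."""
    return gemma(
        vocab_size=256_000,
        num_layers=28,
        num_heads=16,
        head_dim=256,
        num_kv_heads=16,
        embed_dim=3072,
        intermediate_dim=24_576,
        max_seq_len=8192,
    )
```

The point is that, since the 2B and 7B share an architecture, the new builder is mostly a matter of passing different sizes to the existing component builder.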
LMK if you have any questions or if this is something you'd like to take on.
I am on it!
I only now noticed that I didn't close this after merging. Thanks guys! You're doing a great job with this project!