Adam Łucek

3 comments by Adam Łucek

https://huggingface.co/AdamLucek/gemma2-2b-it-chinese-german Also found this happening with the Model Stock merge method and Gemma 2 2B.

Beat me to it. The same thing is happening here with lm_head.weight for the 2B model; it looks like it's likely related to how the tokenizer source is handled.

@piotr25691 Remove the entry for it from your index.json with whatever code editor you like, and then for the model itself you can edit the file directly with the safetensors package. Here's a...
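
A minimal sketch of that approach (not the original snippet, which is cut off above). The filenames, the single-shard layout, and the `total_size` adjustment are assumptions; adapt them to your checkpoint.

```python
# Sketch: drop lm_head.weight from a safetensors checkpoint and its index.
# Assumes a single shard named "model.safetensors" and an index file named
# "model.safetensors.index.json" (both names are assumptions).
import json

from safetensors.torch import load_file, save_file

MODEL_FILE = "model.safetensors"
INDEX_FILE = "model.safetensors.index.json"
TENSOR_NAME = "lm_head.weight"

# 1. Remove the tensor from the weights file itself.
tensors = load_file(MODEL_FILE)            # dict of name -> torch.Tensor
removed = tensors.pop(TENSOR_NAME, None)   # drop lm_head.weight if present
save_file(tensors, MODEL_FILE, metadata={"format": "pt"})

# 2. Remove the matching entry from the index so loaders don't look for it.
with open(INDEX_FILE) as f:
    index = json.load(f)
index["weight_map"].pop(TENSOR_NAME, None)
if removed is not None and "metadata" in index:
    # Keep total_size roughly consistent after removing the tensor.
    index["metadata"]["total_size"] -= removed.numel() * removed.element_size()
with open(INDEX_FILE, "w") as f:
    json.dump(index, f, indent=2)
```

For a sharded checkpoint you would instead look up which shard holds lm_head.weight via the index's weight_map and rewrite only that shard.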