unsloth
Resize embeddings, tokenizers - adding new tokens doesn't work
From Twitter - adding new tokens to Qwen doesn't work?
special_tokens = ["<custom_token_1>", "<custom_token_2>"]  # example placeholder tokens
# Add special tokens to the tokenizer
num_added_tokens = tokenizer.add_special_tokens({"additional_special_tokens": special_tokens})
# Resize the model's token embedding matrix to match the enlarged vocabulary
model.resize_token_embeddings(len(tokenizer))
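The resize step matters because the embedding matrix has one row per vocabulary entry; after adding tokens, the tokenizer can emit IDs beyond the matrix's last row, which causes index errors unless the matrix grows. As a minimal sketch of what `resize_token_embeddings` does under the hood (assumption: new rows are initialized to the mean of the existing embeddings, mirroring the mean-resizing strategy in recent transformers releases; the function and token counts here are illustrative, not Unsloth's actual implementation):

```python
import numpy as np

def resize_embeddings(emb: np.ndarray, new_vocab_size: int) -> np.ndarray:
    """Grow (or shrink) an embedding matrix to new_vocab_size rows.

    New rows are filled with the mean of the existing embeddings,
    an illustrative stand-in for model.resize_token_embeddings.
    """
    old_vocab_size, dim = emb.shape
    if new_vocab_size <= old_vocab_size:
        # Shrinking: simply truncate the extra rows
        return emb[:new_vocab_size].copy()
    # Growing: append (new - old) rows, each set to the column-wise mean
    new_rows = np.tile(emb.mean(axis=0), (new_vocab_size - old_vocab_size, 1))
    return np.vstack([emb, new_rows])

# Toy example: 4-token vocab with 3-dim embeddings, then 2 tokens added
emb = np.arange(12, dtype=np.float64).reshape(4, 3)
resized = resize_embeddings(emb, 6)
print(resized.shape)  # (6, 3)
```

If the resize is skipped, any sequence containing a new token ID greater than or equal to the old vocabulary size would index past the end of the embedding table, which is one common way "adding new tokens" appears broken.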