unsloth
Use Alpaca template with Phi-3
Hello!
Is there a way to use the Alpaca template with Phi-3?
I'm struggling a bit to understand the documentation, and I was wondering if anyone could help me understand how to use it here and define the proper mapping:
from unsloth.chat_templates import get_chat_template

tokenizer = get_chat_template(
    tokenizer,
    chat_template = "phi-3", # Supports zephyr, chatml, mistral, llama, alpaca, vicuna, vicuna_old, unsloth
    mapping = {"role" : "from", "content" : "value", "user" : "human", "assistant" : "gpt"}, # ShareGPT style
)
def formatting_prompts_func(examples):
    convos = examples["conversations"]
    texts = [tokenizer.apply_chat_template(convo, tokenize = False, add_generation_prompt = False) for convo in convos]
    return { "text" : texts, }
pass
from datasets import load_dataset

dataset = load_dataset("philschmid/guanaco-sharegpt-style", split = "train")
dataset = dataset.map(formatting_prompts_func, batched = True,)
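For context, the transformation being asked about can be sketched in plain Python without unsloth. This is a hypothetical illustration, not unsloth's actual implementation: it renders one ShareGPT-style conversation (the "from"/"value", "human"/"gpt" fields from the mapping above) into the classic Alpaca prompt layout.

```python
# Hypothetical sketch (not unsloth's code): format a ShareGPT-style
# conversation with the classic Alpaca prompt layout.
ALPACA_TEMPLATE = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n{instruction}\n\n"
    "### Response:\n{response}"
)

def sharegpt_to_alpaca(convo):
    # Map ShareGPT roles ("human"/"gpt") onto Alpaca's
    # instruction/response slots.
    instruction = next(t["value"] for t in convo if t["from"] == "human")
    response = next(t["value"] for t in convo if t["from"] == "gpt")
    return ALPACA_TEMPLATE.format(instruction=instruction, response=response)

convo = [
    {"from": "human", "value": "What is 2 + 2?"},
    {"from": "gpt", "value": "4"},
]
print(sharegpt_to_alpaca(convo))
```

In unsloth itself, passing chat_template = "alpaca" to get_chat_template should select a template along these lines, with the mapping argument telling it which dataset keys hold the roles and message text.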
Also, a few days ago the Phi-3 Colab notebook was different and seemed to be using the Alpaca template. Why did it change?