
template differences?

TuuSiwei opened this issue · 2 comments

Are there any differences in the _make_masks function across different LLM models? Don't they all compute loss only for the response part? What causes the variations among them?

TuuSiwei, Jun 11 '24

Different models use different tokenizers, so the same conversation text is split into different token sequences; as a result, the positions that need to be masked in the labels differ from model to model.
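For illustration only, here is a minimal sketch (not the repository's actual `_make_masks` implementation) of how a masking routine can ignore the prompt tokens and keep the loss only on the response tokens. The helper name `make_labels`, the plain prompt/response split, and the `-100` ignore index are assumptions for this example.

```python
IGNORE_INDEX = -100  # standard "ignore" value for PyTorch cross-entropy loss

def make_labels(tokenizer, prompt: str, response: str):
    """Tokenize a (prompt, response) pair and mask the prompt tokens.

    Because different tokenizers split the same text into different numbers
    of tokens, len(prompt_ids) -- and therefore the masked span -- differs
    between LLM backbones even though the masking rule itself is the same.
    """
    prompt_ids = tokenizer.encode(prompt, add_special_tokens=False)
    response_ids = tokenizer.encode(response, add_special_tokens=False)

    input_ids = prompt_ids + response_ids
    labels = [IGNORE_INDEX] * len(prompt_ids) + list(response_ids)
    return input_ids, labels
```

Under this scheme every backbone computes the loss only on the response part; what varies across templates is where the prompt/response boundary falls after tokenization, since each tokenizer produces a different number of prompt tokens (and different special tokens).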

jiajunlong, Jun 12 '24

@jiajunlong Hi. Do llama2, tinyllama, and vicuna share the same template? Sorry to bother you.

exiawsh, Jul 27 '24