Musab Gultekin
Hey, sorry, because I'm a full-time engineer now I don't have much time to maintain this library. Sorry for the inconvenience.
Okay, I was banging my head against the wall over this issue today. I used the `torchtune.datasets.chat_dataset` component, but it required the `chat_format` positional argument. So I had to...
I think we should also document using a local data file as a dataset, because it took me half an hour to configure it properly. This project is extremely promising. Keep...
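For anyone who hits the same thing, here is a minimal sketch of what "local data file as a dataset" boils down to. It assumes the torchtune dataset builders (e.g. `torchtune.datasets.chat_dataset`) forward `source` plus any extra keyword arguments to Hugging Face's `load_dataset`; the exact torchtune field names may differ by version, so treat this as illustrative only.

```python
# Standard Hugging Face call for a local file: the "json" loader plus a
# data_files path. This is the call the torchtune builders are assumed to
# delegate to when you set source/data_files in the dataset config.
from datasets import load_dataset

ds = load_dataset("json", data_files="data/my_chats.json", split="train")

# Each row is then expected to hold one conversation, e.g. ShareGPT style:
# {"conversations": [{"from": "human", "value": "..."},
#                    {"from": "gpt",   "value": "..."}]}
print(ds[0])
```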
The new doc page looks really great, thank you! It's much clearer now. It's also great that the `chat_format` argument issue was fixed. Thanks @ebsmothers!
That's set for llama2. We probably need to add a config option, something like `stop_tokens` (since llama3 instruct has two).
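To make the suggestion concrete, here is a rough sketch of the behaviour such a `stop_tokens` option would need. The option name and the `next_token` helper are hypothetical, not existing torchtune API; the two token ids are Llama 3 Instruct's `<|end_of_text|>` (128001) and `<|eot_id|>` (128009), which is why a single eos id is not enough.

```python
# Sketch only: decoding should stop as soon as ANY configured stop token is
# produced, so the option needs to accept a set of ids rather than one.
LLAMA3_INSTRUCT_STOP_TOKENS = frozenset({128001, 128009})

def generate(next_token, prompt_ids, max_new_tokens=256,
             stop_tokens=LLAMA3_INSTRUCT_STOP_TOKENS):
    # next_token is a hypothetical single-step decode callable standing in
    # for the real model: it takes the token ids so far and returns one id.
    tokens = list(prompt_ids)
    for _ in range(max_new_tokens):
        token_id = next_token(tokens)
        tokens.append(token_id)
        if token_id in stop_tokens:  # stop on either Llama 3 stop token
            break
    return tokens
```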