Teknium
> @teknium1 how would one do so? I am a bit of a newbie here, is the process easy? If not, I may find help elsewhere. That's why I asked because...
I'm hearing you can just raise the max sequence length and fine-tune it on longer prompts.
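Rough sketch of what I mean, assuming an alpaca-lora-style tokenization step (the `cutoff_len` value and model id here are placeholders, not the repo's actual settings, and `LlamaTokenizer` needs the GitHub build of Transformers):

```python
from transformers import LlamaTokenizer

# Placeholder: raise the cap from the usual ~256 tokens to cover longer prompts.
cutoff_len = 512

# Placeholder model id for illustration only.
tokenizer = LlamaTokenizer.from_pretrained("decapoda-research/llama-7b-hf")

def tokenize(prompt: str):
    # Nothing else in training has to change; the model just needs to see
    # examples truncated/padded at the new, longer maximum length.
    return tokenizer(
        prompt,
        truncation=True,
        max_length=cutoff_len,
        padding="max_length",
    )
```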
> Is there any parameter that needs to be optimized for the maximum length? It should just be that the training data has not seen a longer one, so the...
Any chance we could publish binaries for Windows?
Would the dataset benefit from multiple prompt:response chains rather than just a single prompt:response pair? e.g. Question:Answer:FollowupQ:FollowupA, as sketched below.
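Rough sketch of what a chained record could look like, just to make the question concrete (the `followups` field and the flattening step are hypothetical, not an existing format in the repo):

```python
# Hypothetical multi-turn record; single-turn data would simply omit "followups".
example = {
    "instruction": "What does LoRA change during fine-tuning?",
    "output": "Only small low-rank adapter matrices; the base weights stay frozen.",
    "followups": [
        {
            "instruction": "So can the adapters be merged back in afterwards?",
            "output": "Yes, the adapter weights can be folded into the base matrices.",
        }
    ],
}

def flatten_chain(record: dict) -> str:
    """Concatenate the whole Q/A chain into one training example so the model
    sees follow-up questions with the earlier turns as context."""
    turns = [f"Question: {record['instruction']}\nAnswer: {record['output']}"]
    for turn in record.get("followups", []):
        turns.append(f"Question: {turn['instruction']}\nAnswer: {turn['output']}")
    return "\n".join(turns)

print(flatten_chain(example))
```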
For prompts it seems like a good idea to keep the typos.
I see the same, but fixing the capitalization didn't fix it for me.
I'm using transformers 4.27.1, is it in a different version?
Yeah, you have to install Transformers from GitHub. I had thought that since it was merged it would be in an updated pip package, but it's not yet. `pip install git+https://github.com/huggingface/transformers.git` works...
> Maybe tangentially related, but @tloen curious why you might want to leave typos in the dataset (per [#32 (comment)](https://github.com/tloen/alpaca-lora/pull/32#issuecomment-1474454667))

Not my place to respond, but I would say leaving...