araleza

Results: 18 comments by araleza

Thanks for looking into this. I've been suspicious of these `\n`s in llama.cpp since I noticed that when I added `\n\n` for Llama 3's prompt, the continuation would usually...
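For reference, a minimal sketch of the Llama 3 Instruct prompt layout (assuming the standard header tokens; this is not the author's code): the `\n\n` after each `<|end_header_id|>` is the newline pair in question.

```python
# Sketch of the assumed Llama 3 Instruct prompt format; the "\n\n" after each
# <|end_header_id|> is the newline pair discussed in the comment above.
def llama3_prompt(system: str, user: str) -> str:
    return (
        "<|begin_of_text|>"
        "<|start_header_id|>system<|end_header_id|>\n\n" + system + "<|eot_id|>"
        "<|start_header_id|>user<|end_header_id|>\n\n" + user + "<|eot_id|>"
        "<|start_header_id|>assistant<|end_header_id|>\n\n"
    )

print(llama3_prompt("You are a helpful assistant.", "Hello!"))
```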

I tried adding ``` [...] and self.quantization_config is not None: ``` to the end of that line (and made similar additions in two other places that came up), and it...
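To show the shape of that change: the original condition is elided as `[...]` in the comment, so the class and condition below are hypothetical stand-ins, not the real library code.

```python
class LoaderSketch:
    """Hypothetical stand-in for the class being patched; not the real library code."""

    def __init__(self, quantization_config=None):
        self.quantization_config = quantization_config

    def take_quantized_path(self, existing_condition: bool) -> bool:
        # The edit described above: append `and self.quantization_config is not None`
        # to the end of an existing condition (represented here by `existing_condition`).
        return existing_condition and self.quantization_config is not None
```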

Yeah, it finally 'loaded', but then it said some weights of the model checkpoint were not used when initializing LlamaForCausalLM, and it printed a long list of those weights, which I'm...

> For now, is it possible to use a non-GPTQ quantized model? I don't know, actually... I've only done LoRA training with oobabooga's Training tab, and it can only...

Sounds good. I think you've got two groups of people who want to use your software: 1) people who have a big model and a big training set, and want the...

I can confirm what dream said. The default model file in the example code is: ``` "ckpt_name": "v1-5-pruned-emaonly.ckpt" ``` but you might not have the -emaonly version, and models tend to...
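A rough sketch of swapping that default for a checkpoint you actually have, assuming an API-format workflow saved as `workflow_api.json` and that node `"4"` is the checkpoint loader node (both are assumptions about your export):

```python
import json

# Sketch only: adjust the node id and filename to match your own workflow export.
with open("workflow_api.json") as f:
    prompt = json.load(f)

# Point the checkpoint loader at a model that exists in ComfyUI/models/checkpoints/.
prompt["4"]["inputs"]["ckpt_name"] = "v1-5-pruned.ckpt"

with open("workflow_api.json", "w") as f:
    json.dump(prompt, f, indent=2)
```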

I tried out --enable_wildcard for multiline captions. It seems to work: I tried adding a second caption to most of my face images. And it works well for training on...
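As a sketch of what adding a second caption looks like (assuming the usual layout where each image has a same-named `.txt` caption file; paths and captions below are just examples):

```python
from pathlib import Path

# Sketch only: with the multi-line caption behaviour described above, each
# non-empty line in the caption file acts as an alternative caption for the
# matching image (here, train_images/face_001.png).
captions = [
    "a photo of sks person, close-up portrait",
    "sks person smiling outdoors, natural lighting",
]
Path("train_images/face_001.txt").write_text("\n".join(captions) + "\n")
```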

> Hello again. What is the latest status of this? I want to use this multi-line feature today. What do I need to do? Write each line a different...