agpro
> You might want to wait. I think I'm still dragging more changes out of the huggingface/meta guys. So frustrating, Jesus Christ, how long do they need to get...
They already know about it on Reddit, but I guess not many pay attention to it and they just use what's already available.
problem solved

```
huggingface-cli download \
  NikolayKozloff/Meta-Llama-3-8B-Instruct-bf16-correct-pre-tokenizer-and-EOS-token-Q8_0-Q6_k-Q4_K_M-GGUF \
  Meta-Llama-3-8B-Instruct-correct-pre-tokenizer-and-EOS-token-Q8_0.gguf \
  --local-dir downloads \
  --local-dir-use-symlinks False
```

Create the Modelfile and paste into it: `FROM ./downloads/Meta-Llama-3-8B-Instruct-correct-pre-tokenizer-and-EOS-token-Q8_0.gguf` then in terminal...
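For anyone following along, a minimal sketch of the usual terminal steps after that (assuming Ollama is installed and the Modelfile above sits in the current directory; the model name `llama3-8b-fixed` is just a placeholder I picked):

```
# register the local GGUF with Ollama using the Modelfile created above
ollama create llama3-8b-fixed -f ./Modelfile

# run it to check that generation now stops cleanly on the EOS token
ollama run llama3-8b-fixed
```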
+1 to this
> From what I see you need to train models: first a prior model that you will use in unclip model training. So far I'm in the process of training my own model so...
Gotcha, thank you! ATM I'm scraping the web for MJ images (oil painting and impasto), then I'll start the main training. Until then I need to test with just a...
Dude, it works. I was able to train with 13 images and it was so damn quick for both models. The outputs are very, very close to the training data... this...
Not yet, no. I only tried and tested it for style; I'll try tomorrow for a person. But I guess the same idea from Stable Diffusion applies here too, when it...
> I'm here to write words of support, I am interested in exploring what IBM + OLLAMA can do

+1 to this