breadbrowser
> > @scf4 Public figures working in ML like the authors of Riffusion should stick to royalty free datasets. The strategy for AI music will be to make fine-tuning really...
You are trying to use 20 GB from a 4 GB GPU.
> > You are trying to use 20 GB from a 4 GB GPU
>
> Please explain in more detail how it should be fixed (or link a guide) (i not...
> The Stable Diffusion model is very large and needs a GPU with a lot of VRAM. There are versions of SD that work with less VRAM, but I'm not...
> See #31, fine-tuning is very easy in Hugging Face Transformers; see [this comment](https://github.com/microsoft/BioGPT/issues/31#issuecomment-1422192114) for all the details.

I have already seen that. It was no help to me.
Please just use this: https://huggingface.co/spaces/stabilityai/stable-diffusion. Not a lot of people have a fucking 11/12 GB VRAM GPU.
> There are more options than that. This repo https://github.com/basujindal/stable-diffusion has an optimized version that uses less VRAM but takes longer. Apparently it runs on 4 GB, but I haven't tested it myself....
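For context on why stock Stable Diffusion overwhelms a 4 GB card, here is a rough back-of-envelope sketch. The ~1.07B parameter count for SD v1 is an approximation, and the `weights_gib` helper is purely illustrative, not part of any repo mentioned above:

```python
def weights_gib(n_params: int, bytes_per_param: int) -> float:
    """Approximate memory needed just to hold the weights, in GiB.

    Ignores activations, attention buffers, and CUDA overhead,
    which add several more GiB at inference time.
    """
    return n_params * bytes_per_param / 2**30

# Assumption: Stable Diffusion v1 is roughly 1.07B parameters
# (UNet + text encoder + VAE combined).
SD_PARAMS = 1_070_000_000

print(f"fp32 weights: ~{weights_gib(SD_PARAMS, 4):.1f} GiB")  # ~4.0 GiB
print(f"fp16 weights: ~{weights_gib(SD_PARAMS, 2):.1f} GiB")  # ~2.0 GiB
```

The weights alone in fp32 nearly fill a 4 GB card before any activations are allocated, which is why the low-VRAM forks lean on half precision and on loading submodules to the GPU one at a time.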
AI will never be perfect. Great examples are airplanes, cars, and every text-to-image model not knowing what a counter-ram is. I have disproved your point.
model 7b

```
The following is a conversation with a gigachad.
Human: Hello, who are you?
gigachad: I'm gigachad aka chad thundercock.
Human: how did you find...
```
> Thank you for that response!
> Some companies, like Gigabyte, now [support 48GB DDR5 modules](https://www.gigabyte.com/Press/News/2064) on their LGA1700 boards. Crucial currently has 192GB of DDR5 at 7000 MHz for $700. So it...