geoffsmithBK

Results: 4 comments by geoffsmithBK

Looks like llama.cpp's server now supports a fully native OpenAI-compatible API, exposing endpoints like /models, /v1/{completions, chat/completions}, etc. It essentially implements the old 'simple-proxy-for-tavern' functionality and so can be used...
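As a rough sketch of what "OpenAI-compatible" means here: a client just POSTs a standard Chat Completions JSON body to the server's /v1/chat/completions endpoint. The base URL and model name below are assumptions (llama.cpp's server commonly listens on port 8080, and typically serves whatever model it was launched with regardless of the "model" field); the payload shape follows the OpenAI Chat Completions format.

```python
import json

# Assumed local llama.cpp server address; adjust host/port to your setup.
BASE_URL = "http://localhost:8080"

def build_chat_request(base_url, messages, model="local-model"):
    """Build the URL and JSON body for an OpenAI-style chat completion call."""
    url = f"{base_url}/v1/chat/completions"
    payload = {
        "model": model,      # placeholder; llama.cpp serves its loaded model
        "messages": messages # standard OpenAI-style role/content message list
    }
    return url, json.dumps(payload)

url, body = build_chat_request(BASE_URL, [{"role": "user", "content": "Hello"}])
print(url)
```

Actually sending the request takes any HTTP client (e.g. `urllib.request`), or you can point an existing OpenAI client library at the server's base URL.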

> These params won't be sent unless added as a new Chat Completion or Text Completion source. Got it; seems straightforward. I gather the disinclination is the additional point of...

I've been getting this same error (in Upscale Latent By and UltimateSDUpscale) for a few weeks now running PyTorch nightlies (up to and including last night's 2.6.0.dev20241002). I looked back through my...

Thanks @jonasfabian, I spent a good half hour looking for where `.view` was being called from! Please make a PR to https://github.com/chaiNNer-org/spandrel