Ben Firshman
@hangtwenty sorry this got stuck. We'd still love to include this if you ever find the time again for a slimmed-down PR. Hope you're well. :)
Great idea! There is some prior art here where we tried running Cog in a container for the integration tests, but it expects the prediction server to be on `localhost`...
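For illustration, this is roughly the assumption those tests bake in (the input name and port here are placeholders, not the actual suite):

```python
# Hypothetical sketch of an integration test that assumes the prediction
# server is reachable on localhost -- the assumption that breaks once Cog
# itself runs inside a container.
import requests

def test_prediction():
    # Cog's HTTP server exposes POST /predictions; port 5000 is assumed here.
    resp = requests.post(
        "http://localhost:5000/predictions",
        json={"input": {"text": "hello"}},
        timeout=60,
    )
    resp.raise_for_status()
    assert "output" in resp.json()
```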
For clarity, #649 was a subtly different thing: the need there was to run predict multiple times without having to call `setup()` again. The need expressed in this issue is...
Super helpful data -- thanks @nicholascelestin. I wonder if there's a way to keep a Cog model running in the background, so that running `cog predict` against it would be...
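To make that concrete, something like this is what I'm picturing, assuming the model's server is already up on port 5000 (the input names are made up, and the exact CLI shape is still open):

```python
# Rough sketch: with the model kept running in the background, setup() only
# runs once and each prediction is just an HTTP round trip.
import requests

inputs = [{"prompt": "a cat"}, {"prompt": "a dog"}, {"prompt": "a fish"}]

for inp in inputs:
    # Assumes a Cog prediction server is already listening on localhost:5000.
    resp = requests.post("http://localhost:5000/predictions", json={"input": inp})
    resp.raise_for_status()
    print(resp.json().get("output"))
```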
Oh yeah, an interactive mode is a great idea. #666 😈 Agreed, a web UI would be much better, but it's big and complicated. Trying to figure out if we...
This is super helpful information, thanks @halr9000. Do you need to run this locally, or would it be helpful if there were some way of running Cog models in the...
Some data from @nicholascelestin on how to run it in WSL: https://canary.discord.com/channels/775512803439280149/852636181492793344/989652053397733376
Some work on this: https://github.com/replicate/cog/pull/681
This is fantastic, thank you @palp. We also intend to support WebSockets on Replicate, and we want to have the same API as Cog. So, just as a reminder to...
I wonder if there's something clever we can do with file handles, in a way that's also not confusing or memory-inefficient...
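For example (purely a sketch of the idea, not a design), outputs could be passed around as file objects and copied in chunks, so nothing is ever fully buffered in memory:

```python
# Sketch: move data between file handles in fixed-size chunks rather than
# reading it all into memory first. Paths are placeholders.
import shutil

with open("/tmp/prediction-output.png", "rb") as src, open("result.png", "wb") as dst:
    shutil.copyfileobj(src, dst, length=64 * 1024)  # copy in 64 KiB chunks
```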