Mattt

Results: 266 comments by Mattt

Hi @ElyasMoshirpanahi. Thanks for reporting. I can't think of anything that changed in Cog that could've caused that. Can you please share the command you ran and the full output?

@andreasjansson Do you recall what errors you saw? It'd be great to have a minimal example that reproduces this problem.

Digging into this some more, I found [this discussion](https://github.com/tiangolo/fastapi/discussions/9966) and [this issue](https://github.com/tiangolo/fastapi/issues/10360) about problems arising from FastAPI >= 0.100.0 attempting to load v2 Pydantic when the v1 shim is loaded....
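
If it helps anyone reproduce the symptom, here's a minimal sketch of the version mix those threads describe (the model and route names are hypothetical, not Cog's actual ones):

```python
# Sketch of the problematic mix described above: Pydantic 2.x provides the
# `pydantic.v1` compatibility shim, while FastAPI >= 0.100.0 builds its own
# request/response handling against Pydantic v2.
from fastapi import FastAPI
from pydantic import v1  # the v1 shim; only exists when Pydantic 2.x is installed


class Input(v1.BaseModel):  # hypothetical v1-style model, standing in for Cog's inputs
    prompt: str


app = FastAPI()


@app.post("/predictions")
def predict(body: Input) -> dict:
    # Mixing v1 shim models into a v2-native FastAPI app is the situation the
    # linked discussion and issue report as producing schema/validation errors.
    return {"echo": body.prompt}
```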

I got tests passing locally (aside from hypothesis timeouts and flakes), but can't resolve failures in CI.

```
=========================================================== short test summary info ============================================================
FAILED python/tests/server/test_worker.py::test_fatalworkerexception_from_irrecoverable_failures[killed_in_predict-payloads1] - hypothesis.errors.DeadlineExceeded: Test took...
```
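
For reference, hypothesis lets you relax or disable its per-example deadline via a settings profile; a minimal sketch (not taken from this PR, file location is an assumption) of ruling out slow CI runners as the cause looks like this:

```python
# Hedged sketch: relax hypothesis's per-example deadline only when running in CI,
# e.g. from a conftest.py, so local runs keep the default timing checks.
import os

from hypothesis import settings

settings.register_profile("ci", deadline=None)  # no DeadlineExceeded on slow runners
if os.environ.get("CI"):
    settings.load_profile("ci")
```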

Possibly related to https://github.com/replicate/cog/pull/1621

Closing in favor of #1687. I think we're close, but if we can't manage to get that working, this is probably the next best alternative.

Hi @LDMFD. We frequently use GitHub Actions to release models with Cog as part of our CI/CD process. If you're having trouble authenticating, take a look at [replicate/setup-cog](https://github.com/replicate/setup-cog).
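
For anyone landing here, a rough workflow sketch looks something like the following; the `token` input name and the image path are from memory, so check the setup-cog README and your model page for the exact values:

```yaml
# Hedged sketch of a CI job that builds and pushes a Cog model from GitHub Actions.
name: Push model to Replicate
on:
  push:
    branches: [main]
jobs:
  push:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: replicate/setup-cog@v2
        with:
          token: ${{ secrets.REPLICATE_API_TOKEN }}
      - run: cog push r8.im/<your-username>/<your-model>
```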

Hi @li-dennis. As you alluded to, the directory where Python libraries are installed isn't included in the `PATH`. If you want to do `cog run jupyter`, you'll need to install...
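
One hedged workaround in the meantime (assuming jupyter is already listed in `cog.yaml`'s `python_packages`, and that `cog run -p` is available to publish the port) is to bypass the missing console script and invoke the module directly:

```sh
# Workaround sketch: the `jupyter` entry point isn't on PATH inside the container,
# but the installed package can still be launched as a module.
cog run -p 8888 python -m jupyter notebook --allow-root --ip=0.0.0.0
```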

Hi @Mougatsu. Sorry for not responding sooner. Those example models don't work for me, either. Replicate has an official Cog model for VLLM here: https://github.com/replicate/cog-vllm. Please give that a try...

@bakermanbrian Thanks for sharing your suggestion. Do you have a specific use case in mind for a shutdown event?