Nguyen Pham
Full error attached
```
Traceback (most recent call last):
  File "/home/ubuntu/anaconda3/envs/ldm/lib/python3.8/site-packages/flask/app.py", line 2525, in wsgi_app
    response = self.full_dispatch_request()
  File "/home/ubuntu/anaconda3/envs/ldm/lib/python3.8/site-packages/flask/app.py", line 1822, in full_dispatch_request
    rv = self.handle_user_exception(e)
  File "/home/ubuntu/anaconda3/envs/ldm/lib/python3.8/site-packages/flask/app.py", line...
```
> I had the same error, but after I updated the transformers library to the latest version, the error went away. Please try installing transformers 4.25.1. This works for me.
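If it helps, here is a quick sketch (assuming a standard pip environment, nothing specific to this repo) to pin the version and confirm it is actually the one being imported:

```python
# Sketch: pin transformers and verify the version at runtime.
# Install first with: pip install transformers==4.25.1
import transformers

# The suggestion above is that 4.25.1 resolves the error; adjust if a newer
# release also works for you.
print(transformers.__version__)  # should print "4.25.1" after pinning
```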
Same question. @eran-sefirot which v2 model are you deploying?
Same here. Are there any hardware requirement specs for the x4 upscaler?
Last time I got this issue, I fixed it by upgrading diffusers, transformers, and friends: `pip install --upgrade git+https://github.com/huggingface/diffusers.git transformers accelerate scipy`
But I just removed that line in v2-inference.yaml and it works: `use_linear_in_transformer: True`
I have the same question: how do we make sure the pipeline generation is thread-safe?
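Not an official answer, but since a single pipeline instance isn't documented as thread-safe, one common workaround is to serialize access to it with a lock so concurrent Flask requests never call it at the same time. A rough sketch (the model id and prompt below are placeholders, not from this thread):

```python
# Rough sketch: guard one shared diffusers pipeline with a lock so that
# concurrent web requests run inference one at a time.
import threading

import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "stabilityai/stable-diffusion-2-1",  # placeholder model id
    torch_dtype=torch.float16,
).to("cuda")

pipe_lock = threading.Lock()

def generate(prompt: str):
    # Only one request drives the pipeline at a time; the rest wait on the lock.
    with pipe_lock:
        return pipe(prompt).images[0]
```

The alternative is to run one pipeline per worker process (e.g. multiple gunicorn workers) instead of sharing it across threads, at the cost of loading the weights once per worker.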