VAE encode error: TypeError: 'Stream' object does not support the context manager protocol
The error started after launching with `--disable-smart-memory --async-offload`.
```
  File "M:\ComfyUI\ComfyUI\execution.py", line 324, in get_output_data
    return_values = await _async_map_node_over_list(prompt_id, unique_id, obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb, hidden_inputs=hidden_inputs)
  File "M:\ComfyUI\ComfyUI\execution.py", line 298, in _async_map_node_over_list
    await process_inputs(input_dict, i)
  File "M:\ComfyUI\ComfyUI\execution.py", line 286, in process_inputs
    result = f(**inputs)
  File "M:\ComfyUI\ComfyUI\nodes.py", line 343, in encode
    t = vae.encode(pixels[:,:,:,:3])
  File "M:\ComfyUI\ComfyUI\comfy\sd.py", line 782, in encode
    out = self.first_stage_model.encode(pixels_in).to(self.output_device).float()
  File "M:\ComfyUI\ComfyUI\comfy\ldm\wan\vae.py", line 480, in encode
    out = self.encoder(
  File "M:\ComfyUI\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1739, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "M:\ComfyUI\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1750, in _call_impl
    return forward_call(*args, **kwargs)
  File "M:\ComfyUI\ComfyUI\comfy\ldm\wan\vae.py", line 289, in forward
    x = self.conv1(x, feat_cache[idx])
  File "M:\ComfyUI\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1739, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "M:\ComfyUI\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1750, in _call_impl
    return forward_call(*args, **kwargs)
  File "M:\ComfyUI\ComfyUI\comfy\ldm\wan\vae.py", line 40, in forward
    return super().forward(x)
  File "M:\ComfyUI\ComfyUI\comfy\ops.py", line 214, in forward
    return self.forward_comfy_cast_weights(*args, **kwargs)
  File "M:\ComfyUI\ComfyUI\comfy\ops.py", line 206, in forward_comfy_cast_weights
    weight, bias, offload_stream = cast_bias_weight(self, input, offloadable=True)
  File "M:\ComfyUI\python_embeded\Lib\site-packages\torch\_dynamo\eval_frame.py", line 745, in _fn
    return fn(*args, **kwargs)
  File "M:\ComfyUI\ComfyUI\comfy\ops.py", line 101, in cast_bias_weight
    bias = comfy.model_management.cast_to(s.bias, bias_dtype, device, non_blocking=non_blocking, copy=has_function, stream=offload_stream)
  File "M:\ComfyUI\ComfyUI\comfy\model_management.py", line 1073, in cast_to
    with stream:
TypeError: 'Stream' object does not support the context manager protocol
```
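For context, the failure mode at `with stream:` is plain Python: a `with` statement requires the object to implement `__enter__`/`__exit__`, and stream objects on older PyTorch builds do not. A minimal stand-in class (not the real torch class) reproduces the same kind of error:

```python
class Stream:
    """Stand-in for an object lacking __enter__/__exit__ (not torch's Stream)."""
    pass

err = None
try:
    with Stream():  # same shape as `with stream:` in model_management.py
        pass
except (TypeError, AttributeError) as e:
    # Python 3.11+ raises TypeError mentioning the context manager protocol;
    # older interpreters raise AttributeError for the missing __enter__.
    err = e

print(type(err).__name__, err)
```

So the bug is not in the VAE itself; it is triggered wherever the offload code path tries to enter a stream object that does not support the protocol.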
Steps to Reproduce
Latest ComfyUI; use the Wan VAE to encode a WebP image.
Debug Logs
TypeError: 'Stream' object does not support the context manager protocol
Could you check what version of PyTorch and Python you have? And if you can, post the results of running `pip freeze` inside the virtual environment.
If you have trouble doing the above, let me know and I can give some more specific steps!
PyTorch 2.6, Python 3.12
I also encountered the same issue. Restarting the server resolved it.
I have the same TypeError (not using the above-mentioned flags, though). I'm using the portable build. Whenever I start ComfyUI, I run the `update_comfyui.bat` file first. The last time I used it was this morning; I then started ComfyUI and everything was normal. Now, about 10 hours later in the evening, I wanted to use ComfyUI again, ran the update, and then started Comfy. After that, no generation whatsoever works: no image generation, no Wan generation. So I guess no ComfyUI for me for now, heh.

pytorch version: 2.4.0+cu121
Python version: 3.11.8
ComfyUI version: 0.3.75
ComfyUI frontend version: 1.32.9
xformers version: 0.0.27.post2
Restarting the server does not help.
Huh, it's working again. What I tried: downgrading the ComfyUI frontend from 1.32.9 to 1.30.6. Now everything's working again.
It appears to depend on whether or not the model being executed is being offloaded. So it depends on settings, and probably current VRAM usage as well.
`--disable-async-offload` fixes it for me. It's supposedly an issue with the PyTorch/CUDA combo being too old.
I'm having the same issue as well; I'm trying to run the Wan Animate native version. It was working yesterday, but after today's update (a couple of hours ago) it stopped working with the same message. I'm on CUDA 13, Python 3.12.7, and torch 2.9.1.
I used the `--disable-async-offload` flag like @BetaDoggo suggested and now everything works again.
Use this command to solve it:

```
python main.py --disable-async-offload
```
I second using the flag. Downgrading the frontend led to ComfyUI complaining that the frontend version is outdated.
Here are a couple of logs from Swarm users affected by this:
- https://paste.denizenscript.com/View/137641 (pytorch version: 2.4.1+cu124, weirdly old; Python version: 3.11.9)
- https://paste.denizenscript.com/View/137642 (pytorch version: 2.6.0+cu126, old but shouldn't be fatal; Python version: 3.11.6)