Mister K
```
0: 640x448 1 face, 15.9ms
Speed: 2.8ms preprocess, 15.9ms inference, 1.4ms postprocess per image at shape (1, 3, 640, 448)
Using pytorch attention in VAE
Using pytorch attention in...
```
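For reference, a quick way to see which scaled-dot-product attention backends this PyTorch build exposes, and what GPU it sees. This is just a sketch I'd run in the same venv, nothing ComfyUI-specific:

```python
# Sketch: report which fused SDPA backends this PyTorch build has enabled,
# plus basic GPU info. Purely informational; not part of ComfyUI itself.
import torch

print("torch:", torch.__version__, "CUDA build:", torch.version.cuda)
print("GPU:", torch.cuda.get_device_name(0),
      "capability:", torch.cuda.get_device_capability(0))

# These flags report whether the fused attention kernels are enabled in this build.
print("flash SDPA enabled:        ", torch.backends.cuda.flash_sdp_enabled())
print("mem-efficient SDPA enabled:", torch.backends.cuda.mem_efficient_sdp_enabled())
print("math SDPA enabled:         ", torch.backends.cuda.math_sdp_enabled())
```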
Hmmm, I git-cloned ComfyUI into another folder instead... and checked if it runs... which it did... so it's not related to any other dependencies. But when I ran micro...
But... since I technically have a Linux machine now, I can get Flash Attention 2 running, even though it is not supported in...
Okay, so the first issue that I encounter is the following:
```
[START] Security scan
DEPRECATION: Loading egg at /home/ks/comfy/venv/lib/python3.12/site-packages/flash_attn-2.6.3-py3.12-linux-x86_64.egg is deprecated. pip 24.3 will enforce this behaviour...
```
From the rest of the startup process... I can't actually tell if there's any difference... I just wish I could figure out how to get rid of all the warnings...
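One thing I can at least do is check, from inside the same venv, that the flash_attn egg actually imports and which version it reports. A minimal sketch:

```python
# Minimal sanity check: can this interpreter import flash_attn, and what
# version does it report? Run from the same venv that ComfyUI uses.
try:
    import flash_attn
    from flash_attn import flash_attn_func  # the core fused-attention entry point
    print("flash_attn version:", flash_attn.__version__)
    print("flash_attn_func available:", callable(flash_attn_func))
except ImportError as e:
    print("flash_attn is not importable from this interpreter:", e)
```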
Okay, I am in the process of installing Flash Attention on my Windows 11 PC:
```
ddd insightface ven
Sat Oct 26 17:29:25 2024
+-----------------------------------------------------------------------------------------+
| NVIDIA-SMI 560.35.03              Driver...
```
The other window is still installing flash-attn as I type this. Let's see if it makes any difference once it is installed... I am assuming once it is installed...
You are kidding me, right? That's all I had to do, and now my image generation based on my existing workflows... without any other special modification... from the normal...
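If I wanted to confirm the speedup really comes from the flash-attn kernels and not something else, a rough micro-benchmark on dummy half-precision tensors would be something like the sketch below. The shapes are made up, not ComfyUI's real ones:

```python
# Sketch: compare PyTorch's built-in scaled_dot_product_attention against
# flash_attn's fused kernel on dummy fp16 tensors, to confirm the fresh
# install actually runs faster on this GPU. Arbitrary stand-in shapes.
import time
import torch
import torch.nn.functional as F
from flash_attn import flash_attn_func

B, S, H, D = 2, 4096, 16, 64  # batch, sequence length, heads, head dim (made up)
q = torch.randn(B, S, H, D, device="cuda", dtype=torch.float16)
k = torch.randn_like(q)
v = torch.randn_like(q)

def bench(fn, iters=50):
    fn()  # warm-up
    torch.cuda.synchronize()
    t0 = time.time()
    for _ in range(iters):
        fn()
    torch.cuda.synchronize()
    return (time.time() - t0) / iters * 1000  # ms per call

# PyTorch SDPA wants (batch, heads, seq, dim); flash_attn_func wants (batch, seq, heads, dim).
sdpa_ms = bench(lambda: F.scaled_dot_product_attention(
    q.transpose(1, 2), k.transpose(1, 2), v.transpose(1, 2)))
flash_ms = bench(lambda: flash_attn_func(q, k, v))

print(f"pytorch SDPA : {sdpa_ms:.2f} ms")
print(f"flash_attn   : {flash_ms:.2f} ms")
```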
```
ddd insightface ven
Sat Oct 26 19:08:12 2024
+-----------------------------------------------------------------------------------------+
| NVIDIA-SMI 560.35.03              Driver Version: 565.90         CUDA Version: 12.7     |
|-----------------------------------------+------------------------+----------------------+
| GPU  Name                 Persistence-M | Bus-Id          Disp.A | Volatile...
```
On https://huggingface.co/microsoft/Phi-3-mini-4k-instruct, Microsoft's Hugging Face page briefly mentions **_flash_attn==2.5.8_**. Below I have updated to show my current flash-attn, built from source after installing via pip.
```
How to Use...
```
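For reference, this is roughly how that model gets loaded with FlashAttention-2 through transformers. The dtype and device arguments here are my own assumptions, and it needs flash-attn importable since `attn_implementation="flash_attention_2"` does not fall back on its own:

```python
# Sketch: load Phi-3-mini-4k-instruct with the FlashAttention-2 backend via
# transformers. Assumes transformers, accelerate, and flash-attn are installed;
# dtype/device choices below are my own, not taken from the model card verbatim.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "microsoft/Phi-3-mini-4k-instruct"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    attn_implementation="flash_attention_2",  # errors out if flash-attn is not importable
    trust_remote_code=True,
    device_map="cuda",
)

inputs = tokenizer("Hello, flash attention!", return_tensors="pt").to(model.device)
print(tokenizer.decode(model.generate(**inputs, max_new_tokens=20)[0]))
```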