Mister K

Results 31 comments of Mister K

```
0: 640x448 1 face, 15.9ms
Speed: 2.8ms preprocess, 15.9ms inference, 1.4ms postprocess per image at shape (1, 3, 640, 448)
Using pytorch attention in VAE
Using pytorch attention in...
```

Hmmm, I git cloned ComfyUI into another folder instead and checked if it runs, which it did, so it's not related to any other dependencies. But when I ran micro...

![Screenshot 2024-09-14 212805](https://github.com/user-attachments/assets/787cf0c0-bfbc-4fdc-96c2-6bf466321132) ![Screenshot 2024-09-14 213505](https://github.com/user-attachments/assets/f86ecdfb-5319-43ef-b12c-63528a764225) But... since I technically have a Linux machine now, I can get Flash Attention 2 running, even though it is not supported in...

![Screenshot 2024-09-14 215737](https://github.com/user-attachments/assets/307030ca-477c-4db9-bc93-a26264359e4a) Okay, so the first issue that I encounter is the following:

```
[START] Security scan
DEPRECATION: Loading egg at /home/ks/comfy/venv/lib/python3.12/site-packages/flash_attn-2.6.3-py3.12-linux-x86_64.egg is deprecated. pip 24.3 will enforce this behaviour...
```
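That DEPRECATION line means flash-attn was installed in the legacy `.egg` layout (typically from a `python setup.py install` build). A minimal sketch for spotting such installs in a venv — the `python3.12` path is taken from the log above, and `list_eggs` is a hypothetical helper, so adjust for your environment:

```shell
# Sketch: flag deprecated .egg installs in a virtualenv's site-packages.
list_eggs() {
  # List top-level *.egg entries; suppress errors if the directory is missing.
  find "$1" -maxdepth 1 -name '*.egg' 2>/dev/null
}

site_pkgs="${VIRTUAL_ENV:-$HOME/comfy/venv}/lib/python3.12/site-packages"
if [ -n "$(list_eggs "$site_pkgs")" ]; then
  echo "deprecated eggs found; reinstall those packages as wheels, e.g.:"
  echo "  pip uninstall -y flash-attn && pip install flash-attn"
else
  echo "no .egg installs found in $site_pkgs"
fi
```

Reinstalling through `pip install` builds a proper wheel, which is what makes the pip 24.3 warning go away.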

From the rest of the startup process I can't actually tell if there's any difference... I just wish I could figure out how to get rid of all the warnings...
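For the in-process warnings at least, a minimal sketch: Python's `warnings` filters can silence `DeprecationWarning`s at startup (note the pip DEPRECATION line above is printed by pip itself, so it needs the egg reinstalled rather than a filter):

```python
# Sketch: silence DeprecationWarnings raised through Python's warnings module.
import warnings

warnings.filterwarnings("ignore", category=DeprecationWarning)

warnings.warn("old API", DeprecationWarning)  # suppressed by the filter above
print("startup continues without the warning text")
```

The same effect without editing code: launch with `PYTHONWARNINGS="ignore::DeprecationWarning" python main.py`.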

![image](https://github.com/user-attachments/assets/39262011-7e8c-42f0-9e90-626724d777ec) Okay, I am in the process of installing flash attention on my Windows 11 PC:

```
ddd insightface ven Sat Oct 26 17:29:25 2024
+-----------------------------------------------------------------------------------------+
| NVIDIA-SMI 560.35.03              Driver...
```

![image](https://github.com/user-attachments/assets/895711bd-85ce-4158-bb57-9a1fc3e2034c) The other window is still installing flash-attn as I type this. Let's see if it makes any difference once it is installed... I am assuming once it is installed...
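Once the install finishes, a quick way to confirm it actually took is to check whether the package is importable and which version landed. A minimal sketch — `flash_attn_status` is a hypothetical helper, and the check deliberately avoids importing torch:

```python
# Sketch: report whether flash_attn is installed without importing it
# (importing flash_attn pulls in torch/CUDA, which we don't need here).
import importlib.util
from importlib import metadata

def flash_attn_status() -> str:
    """Return a one-line status for the flash_attn package."""
    if importlib.util.find_spec("flash_attn") is None:
        return "flash_attn: not installed"
    try:
        return f"flash_attn: {metadata.version('flash-attn')}"
    except metadata.PackageNotFoundError:
        # Importable but no distribution metadata (e.g. a stray egg/path entry).
        return "flash_attn: importable, version unknown"

print(flash_attn_status())
```

If this prints a version, the remaining question is only whether ComfyUI actually selects it over the default PyTorch attention.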

![image](https://github.com/user-attachments/assets/967d5bb7-2dda-4794-8226-b3a2e92b166b) You are kidding me, right? That's all I had to do, and now my image generation based on my existing workflows... without any other special modification... from the normal...

```
ddd insightface ven Sat Oct 26 19:08:12 2024
+-----------------------------------------------------------------------------------------+
| NVIDIA-SMI 560.35.03              Driver Version: 565.90       CUDA Version: 12.7       |
|-----------------------------------------+------------------------+----------------------+
| GPU  Name                 Persistence-M | Bus-Id          Disp.A | Volatile...
```

https://huggingface.co/microsoft/Phi-3-mini-4k-instruct — Microsoft's Hugging Face model card briefly mentions **_flash_attn==2.5.8_**. Below I have updated to show my current flash-attn, built from source after installing via pip.

```
How to Use...
```
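Treating the model card's `flash_attn==2.5.8` as a minimum, a quick sketch for checking a locally built version against that pin — `meets_minimum` is a hypothetical helper that handles plain dotted versions only (no pre-release suffixes):

```python
# Sketch: compare a dotted version string against the Phi-3 card's pin.
def meets_minimum(installed: str, required: str = "2.5.8") -> bool:
    """Numeric tuple comparison, so "2.10.0" correctly beats "2.9.1"."""
    to_tuple = lambda v: tuple(int(part) for part in v.split("."))
    return to_tuple(installed) >= to_tuple(required)

print(meets_minimum("2.6.3"))  # the version seen in the egg path earlier
```

The 2.6.3 build from the earlier log satisfies the pin; a naive string comparison would not be safe here, which is why the parts are compared as integers.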