Mika Laitio
@mritunjaymusale Sorry, I have not had much time yet to work with Stable Diffusion myself. @daniandtheweb Do you have any idea?
I tested ComfyUI by creating a venv from the ROCm SDK Python, then installing the .whl files that rocm_sdk_builder built, and finally ComfyUI itself....
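The steps above can be sketched roughly like this; the SDK install path and wheel directory are assumptions for illustration, not the exact ones used:

```shell
# Sketch of the ComfyUI test, assuming the ROCm SDK python lives under
# /opt/rocm_sdk_612 and the built wheels are in a local "wheels" directory.
/opt/rocm_sdk_612/bin/python3 -m venv comfyui_env   # venv from the SDK python
source comfyui_env/bin/activate
pip install wheels/*.whl                            # wheels built by rocm_sdk_builder
git clone https://github.com/comfyanonymous/ComfyUI.git
cd ComfyUI
pip install -r requirements.txt                     # remaining ComfyUI deps
python main.py                                      # launch the ComfyUI server
```

Installing the SDK-built wheels before ComfyUI's own requirements matters, so that pip does not pull in upstream torch wheels over the ROCm ones.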
- We can now build and launch the following extra tools with the latest rocm_sdk_builder. What is missing is better documentation for this so that people know about it. Not sure...
I have now done a pretty extensive update to README.md, spending 2 days reorganizing and rewriting it. Let me know if you think that something is still missing. If you...
Fixes https://github.com/ryujaehun/pytorch-gpu-benchmark/issues/29
Thanks for noticing this, I am fixing it now and also updating to the latest llama.cpp for now. I was planning to try llama-cli over the weekend with DeepSeek R1. Another...
This should now be fixed; I checked that in the CUDA files the code now takes the HIP-specific if blocks. Can you verify?
- LLAMA_CPP_LIB_PATH: I have not played with llama_cpp_python myself yet. Should I put LLAMA_CPP_LIB_PATH into env_rocm.sh so it would be enabled at runtime? Or/and into binfo/envsetup.sh so that babs.sh...
@karthikbabuks I integrated llama-cpp-python; you should be able to get it now with `./babs.sh -up` and `./babs.sh -b binfo/extra/ai_tools.blist`. I created a separate issue for instructlab with manual build instructions on...
I added LLAMA_CPP_LIB_PATH now to binfo/env/env_rocm_template.sh, which is used by binfo/core/001_rocm_core.binfo during build time to create /opt/rocm_sdk_612/bin/env_setup.sh. You should now be able to get it by `./babs.sh -up`...
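After updating and rebuilding, checking that the variable is wired through should look roughly like this (the env_setup.sh path is taken from the comment above; the value printed depends on your build):

```shell
# Source the generated SDK environment and confirm the variable is exported.
source /opt/rocm_sdk_612/bin/env_setup.sh
echo "LLAMA_CPP_LIB_PATH=$LLAMA_CPP_LIB_PATH"
```

If the echo prints an empty value, the template change has likely not been regenerated into env_setup.sh yet, i.e. the rocm_core build step needs to be rerun.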