LLaMA-Adapter
Mac M1 Pro GPU compatibility
I get errors when running inference on a Mac M1 Pro. Is it possible to adapt the code to run on an M1 Pro? If so, can someone suggest the changes?
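A minimal sketch of one likely fix, assuming the errors come from hard-coded CUDA calls (the exact error messages are not shown in the question, so this is an assumption): Apple Silicon has no CUDA, but recent PyTorch versions expose the GPU through the MPS (Metal) backend. The `pick_device` helper below is hypothetical, not part of LLaMA-Adapter:

```python
# Assumption: the inference script fails on calls like model.cuda() or
# torch.device("cuda"), which cannot work on an M1 Pro. A device-selection
# helper that prefers Apple's MPS backend, then CUDA, then CPU, is one
# common workaround.
import torch


def pick_device() -> torch.device:
    """Return the best available device: MPS on Apple Silicon, else CUDA, else CPU."""
    if torch.backends.mps.is_available():
        return torch.device("mps")
    if torch.cuda.is_available():
        return torch.device("cuda")
    return torch.device("cpu")


device = pick_device()
# Then replace hard-coded CUDA calls in the inference script, e.g.:
#   model.cuda()   ->  model.to(device)
#   tokens.cuda()  ->  tokens.to(device)
# Note: half-precision (fp16) support on MPS is incomplete in older
# PyTorch releases; casting the model to float32 (model.float()) may be
# necessary if you hit dtype errors.
print(device.type)
```

Any remaining CUDA-specific code paths (e.g. `torch.cuda.amp`) would also need MPS- or CPU-compatible equivalents.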