add opencl backend to ollama
Could we ~~update the ollama package to 0.9.0 and~~ add support for the llama.cpp OpenCL backend (perhaps in a subpackage named ollama-opencl)? The OpenCL backend of llama.cpp is optimized for Snapdragon 8 Gen 1/2/3, and it would be useful since it can use the GPU. Thanks.
ollama was updated to 0.9.0 automatically 4 days ago in commit https://github.com/termux/termux-packages/commit/d8a921d5ff4263ccc2bb8f8c78fbcc51ec5bc711.
If you want to enable the OpenCL backend, feel free to open a PR to do so.
ollama doesn't have OpenCL support, see ollama/ollama#4373. Termux's llama.cpp package has OpenCL support enabled IIRC, but it doesn't enable the Adreno-specific optimizations...
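For reference, here is a rough sketch of what configuring llama.cpp with the OpenCL backend and the Adreno-optimized kernels looks like. The flag names are taken from upstream ggml as I understand them; someone should verify them against the current llama.cpp source and the Termux build.sh before relying on this:

```shell
# Sketch only: configure llama.cpp with the OpenCL backend enabled
# and the Adreno-tuned kernels turned on. Flag names are assumptions
# from upstream ggml -- check the current CMake options before use.
cmake -B build \
  -DGGML_OPENCL=ON \
  -DGGML_OPENCL_USE_ADRENO_KERNELS=ON
cmake --build build --config Release
```

In a Termux package this would presumably go into `TERMUX_PKG_EXTRA_CONFIGURE_ARGS` in the package's build.sh rather than being run by hand.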