text-generation-webui
GROK-1 Support
Please consider adding support for GROK-1: https://x.ai/blog/grok-os and https://huggingface.co/xai-org/grok-1
I would really like that too.
+1 on that!
While waiting for official support, I was able to get grok-1.Q4_K_M.gguf from mradermacher/grok-1-GGUF working. To do so, I installed the latest text-generation-webui CPU version for Windows, opened cmd_windows.bat, and ran pip install git+https://github.com/abetlen/llama-cpp-python.git, since llama.cpp added Grok-1 support in https://github.com/ggerganov/llama.cpp/pull/6204 (as discussed in https://github.com/ggerganov/llama.cpp/issues/6120) and building llama-cpp-python from git pulls that change in.
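For reference, here are the steps above condensed into commands. Treat this as a sketch: the models\ folder and the exact filename/split layout in mradermacher/grok-1-GGUF are assumptions, so check that repo for how the quant is actually packaged.

```bat
:: Run from the text-generation-webui folder; this opens a shell with the
:: bundled Python environment active.
cmd_windows.bat

:: Inside that shell, build llama-cpp-python from source so it picks up the
:: Grok-1 support merged into llama.cpp in PR 6204.
pip install git+https://github.com/abetlen/llama-cpp-python.git

:: Then place grok-1.Q4_K_M.gguf (from mradermacher/grok-1-GGUF) under models\
:: and select it in the Model tab with the llama.cpp loader.
```

Note that installing from git compiles llama.cpp as part of the build, so a working C/C++ toolchain (e.g. Visual Studio Build Tools) is needed.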
For grok-1.Q4_K_M.gguf, 256 GB of RAM is easily enough; mradermacher lists 193.3 GB as the requirement. I'm really happy with Grok-1's output so far and am looking forward to the weighted/imatrix quants.
This issue has been closed due to 2 months of inactivity. If you believe it is still relevant, please leave a comment below. You can tag a developer in your comment.