书上的蜗牛
You can run llamafile (GGUF) on your PC; it supports CPU inference and exposes an OpenAI-compatible API. Modify this file (QAnything/tree/master/qanything_kernel/connector/llm/llm_for_online.py) to point to your local LLM server endpoint, as in the sketch below.
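A minimal sketch of what that change could look like, assuming a llamafile server running locally at its default address (http://localhost:8080/v1); the class structure of llm_for_online.py is not shown here, so this only illustrates redirecting an OpenAI-compatible client to the local endpoint:

```python
# Sketch only: redirect an OpenAI-compatible client to a local llamafile server.
# The base_url below assumes llamafile's default port (8080); adjust as needed.
import openai

client = openai.OpenAI(
    base_url="http://localhost:8080/v1",   # local llamafile endpoint (assumed default)
    api_key="sk-no-key-required",          # llamafile does not check the key
)

response = client.chat.completions.create(
    model="local-model",  # placeholder name; llamafile serves whatever GGUF it loaded
    messages=[{"role": "user", "content": "Hello from QAnything!"}],
)
print(response.choices[0].message.content)
```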
nice job