xlin0


Thanks for the quick response! Many inference servers, such as Ollama and LM Studio, expose OpenAI-compatible endpoints, and they can run DeepSeek, Qwen, GLM, and other models.
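To illustrate, here is a minimal sketch of how a client can target such a local server: it builds a standard OpenAI-style chat-completion request against a local base URL. The base URL `http://localhost:11434/v1` matches Ollama's default, and the model name `qwen2.5` is only an example; adjust both for your setup.

```python
# Minimal sketch: build an OpenAI-style chat completion request for a
# local inference server. Assumptions: server listens on
# localhost:11434 with an OpenAI-compatible /v1 API (Ollama's default);
# the model name "qwen2.5" is illustrative.
import json
import urllib.request

def build_chat_request(base_url: str, model: str, prompt: str) -> urllib.request.Request:
    """Construct a POST request to the /chat/completions endpoint."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("http://localhost:11434/v1", "qwen2.5", "Hello!")
print(req.full_url)

# Actually sending the request requires a running server:
# with urllib.request.urlopen(req) as resp:
#     reply = json.load(resp)["choices"][0]["message"]["content"]
```

Because the endpoint follows the OpenAI wire format, the official OpenAI client libraries can also be pointed at it by overriding their base URL.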