Open
f1shy-dev opened this issue 9 months ago · 0 comments
Can we get Groq as a supported provider for text inference? It shouldn't be much work: Groq's API is OpenAI-compatible, though it doesn't appear to expose the "models" endpoint and it doesn't support parallel function calling.
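A minimal sketch of how such a provider integration could look, assuming the two gaps mentioned above: the base URL, model ids, and function names below are illustrative assumptions, not confirmed details of this project or of Groq's current API.

```python
# Hypothetical provider shim for Groq (all names here are assumptions).
# Groq speaks the OpenAI wire format, so existing OpenAI client code can
# mostly be reused by pointing it at a different base URL.
GROQ_BASE_URL = "https://api.groq.com/openai/v1"

# Since there is no usable "models" discovery endpoint, an integration
# could fall back to a static, hand-maintained model list instead.
STATIC_GROQ_MODELS = [
    "llama3-70b-8192",     # example model id (assumption)
    "mixtral-8x7b-32768",  # example model id (assumption)
]


def list_models(provider: str) -> list[str]:
    """Return model ids; use the static list when discovery is unavailable."""
    if provider == "groq":
        return STATIC_GROQ_MODELS
    raise NotImplementedError(provider)


def supports_parallel_tool_calls(provider: str) -> bool:
    # Groq lacks parallel function calling, so a capability flag lets the
    # caller serialize tool calls rather than batching them in one turn.
    return provider != "groq"
```

A caller would then check `supports_parallel_tool_calls("groq")` before emitting multiple tool calls in a single response, and use `list_models("groq")` wherever the code would normally hit the provider's "models" endpoint.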