cortex.cpp
feat: Support Function Calling in llama3.1
Goal
- llama3.1 should support tool use in llama.cpp
- https://github.com/janhq/models/issues/16
Original post
Problem
AFAICS, the current implementation does not have OpenAI Function Calling support. This would be a fantastic, powerful, and much-needed feature.
Success Criteria
Any OAI client can be used with Nitro, even (and especially) those that use OAI Function Calling.
Reference:
- https://platform.openai.com/docs/guides/function-calling
- https://platform.openai.com/docs/api-reference/chat/create#chat-create-tools
It would be great to have this for the TensorRT backend as well, though I do not know whether it supports this.
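To make the success criteria concrete, below is a minimal sketch of the kind of request an OAI client would send once tool use is wired up, using the official `openai` Python client. The base URL, port, API key, and model name are placeholders for a hypothetical local OpenAI-compatible server, not confirmed project configuration.

```python
# Sketch only: base_url, api_key, and model name are placeholders for a local
# OpenAI-compatible server (e.g. cortex/Nitro); they are not confirmed values.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:3928/v1", api_key="not-needed")

# A single tool declared in the standard OpenAI "tools" format; the server
# would need to map this schema into the llama3.1 prompt template.
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "City name"},
            },
            "required": ["city"],
        },
    },
}]

response = client.chat.completions.create(
    model="llama3.1",  # placeholder model identifier
    messages=[{"role": "user", "content": "What's the weather in Hanoi?"}],
    tools=tools,
    tool_choice="auto",
)

# If the model decides to call the tool, the server should return a
# tool_calls entry rather than plain text, matching OpenAI's response shape.
message = response.choices[0].message
if message.tool_calls:
    for call in message.tool_calls:
        print(call.function.name, call.function.arguments)
else:
    print(message.content)
```

Round-tripping a request like this, including the tool_calls response shape, is essentially what the success criteria above ask for.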