langchaingo
Add groq completion example
Add usage example for groq
Groq offers the same API as OpenAI:
https://github.com/tmc/langchaingo/issues/797
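Since Groq exposes an OpenAI-compatible endpoint, the usage example this PR adds boils down to pointing langchaingo's existing OpenAI client at Groq's base URL. A minimal sketch (the model name, env var, and prompt are illustrative assumptions, not taken from the PR):

```go
package main

import (
	"context"
	"fmt"
	"log"
	"os"

	"github.com/tmc/langchaingo/llms"
	"github.com/tmc/langchaingo/llms/openai"
)

func main() {
	// Groq speaks the OpenAI chat-completions protocol, so the openai
	// client can be reused by overriding the base URL and token.
	llm, err := openai.New(
		openai.WithModel("llama3-8b-8192"), // assumed Groq model name
		openai.WithBaseURL("https://api.groq.com/openai/v1"),
		openai.WithToken(os.Getenv("GROQ_API_KEY")),
	)
	if err != nil {
		log.Fatal(err)
	}

	completion, err := llms.GenerateFromSinglePrompt(
		context.Background(), llm, "Say hello in one word.",
	)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(completion)
}
```

Running it requires a valid `GROQ_API_KEY`; everything else is the stock langchaingo OpenAI client.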
PR Checklist
- [x] Read the Contributing documentation.
- [x] Read the Code of conduct documentation.
- [x] Name your Pull Request title clearly, concisely, and prefixed with the name of the primarily affected package you changed, according to Good commit messages (such as `memory: add interfaces for X, Y` or `util: add whizzbang helpers`).
- [x] Check that there isn't already a PR that solves the problem the same way to avoid creating a duplicate.
- [x] Provide a description in this PR that addresses what the PR is solving, or reference the issue that it solves (e.g. Fixes #123).
- [x] Describes the source of new concepts.
- [x] References existing implementations as appropriate.
- [x] Contains test coverage for new functions.
- [x] Passes all golangci-lint checks.
@devalexandre thanks for approving it :)
@tmc can you merge it?
Using the GenerateContent method, this implementation doesn't work completely. The completions request model differs from the OpenAI defaults. My JSON request:
{
"model":"llama3-70b-8192",
"messages":[
{
"role":"system",
"content":[
{
"text":"some text...",
"type":"text"
},
{
"text":"some text...",
"type":"text"
}
]
},
{
"role":"user",
"content":[
{
"text":"hi",
"type":"text"
}
]
}
],
"temperature":0,
"max_tokens":300
}
Response:
API returned unexpected status code: 400: 'messages.0' : for 'role:system' the following must be satisfied[('messages.0.content' : value must be a string)]
@ImPedro29 Have you tried it with the latest version of langchaingo? I had the same error but it worked for me after upgrading it.
I updated, and after some corrections to my embedding engine, the same errors are appearing:
Using version v0.1.9
Error local: https://github.com/tmc/langchaingo/blob/3bb95e19f191fe570a857ff9d65768d68a499cf6/llms/openai/internal/openaiclient/chat.go#L338
POST https://api.groq.com/openai/v1/chat/completions
Body in my message above
It looks like Groq doesn't support the same models that OpenAI GPT supports... Maybe a future Groq update could add them.
Edit: I just tested the GenerateContent method.