Plans to support DeepSeek?
Reviewing the DeepSeek documentation, the example code uses the Python OpenAI library, which indicates API parity with OpenAI.
Is there a plan to support DeepSeek models and endpoints?
Example from the DeepSeek structured-output docs:

```python
from openai import OpenAI

client = OpenAI(
    api_key="<your api key>",
    base_url="https://api.deepseek.com",  # <--- DeepSeek API endpoint
)
```
Since the endpoint is OpenAI-compatible, you can point go-openai at it by overriding `BaseURL`:

```go
config := openai.DefaultConfig(os.Getenv("OPENAI_API_KEY"))
config.BaseURL = "https://api.deepseek.com"
client := openai.NewClientWithConfig(config)

resp, err := client.CreateChatCompletion(
	context.Background(),
	openai.ChatCompletionRequest{
		Model: "deepseek-chat",
		Messages: []openai.ChatCompletionMessage{
			{
				Role:    openai.ChatMessageRoleSystem,
				Content: MakeSystemPrompt(),
			},
			{
				Role:    openai.ChatMessageRoleUser,
				Content: prompt,
			},
		},
		ResponseFormat: &openai.ChatCompletionResponseFormat{
			Type: openai.ChatCompletionResponseFormatTypeJSONObject,
		},
	},
)
```
What about `deepseek-reasoner`? The same request fails with:

```
status code: 400, status: 400 Bad Request, message: %!s(), body: {"code":20024,"message":"Json mode is not supported for this model.","data":null}
```
Why are you asking it here? That error comes from the DeepSeek API, not from the go-openai library.
```go
// Wrapper types that embed the go-openai stream types and add the
// DeepSeek-specific reasoning_content field.
type ChatCompletionStreamChoiceDelta struct {
	openai.ChatCompletionStreamChoiceDelta
	ReasoningContent string `json:"reasoning_content,omitempty"`
}

type ChatCompletionStreamChoice struct {
	openai.ChatCompletionStreamChoice
	Delta ChatCompletionStreamChoiceDelta `json:"delta"`
}

type ChatCompletionStreamResponse struct {
	openai.ChatCompletionStreamResponse
	Choices []ChatCompletionStreamChoice `json:"choices"`
}
```
```go
response := ChatCompletionStreamResponse{}
rawLine, streamErr := stream.RecvRaw()
```
Rewrote it. OK
👍 @feng626 Adding the single `reasoning_content` field is enough to capture the deepseek-r1 model's streaming response.
How do I get `reasoning_content` from `resp`?
@wangriyu you can try to use this branch: https://github.com/panzhongxian/go-openai/tree/ds_r1