[Go] Streaming Tool Usage with Gemini returns an error
Describe the bug
When I run genkit.Generate using the googlegenai plugin with both streaming and tools enabled, it panics with the following error:
panic: invalid output: data did not match expected schema:
- message.content.0: Invalid type. Expected: object, given: null
goroutine 1 [running]:
main.main-range1(0x14000114480?, {0x1039ffe60?, 0x14000426680?})
/../dev/main.go:65 +0xfc
main.main.(*Flow[...]).Stream.func3(...)
/../golang/1.24.3/packages/pkg/mod/github.com/firebase/genkit/[email protected]/core/flow.go:139
main.main()
/../dev/main.go:63 +0x2c0
exit status 2
I have only been able to reproduce this bug with the googlegenai plugin. Tool calling appears to work with the Ollama plugin.
To Reproduce
Use this module to reproduce the error:
package main

import (
    "context"
    "fmt"

    "github.com/firebase/genkit/go/ai"
    "github.com/firebase/genkit/go/core"
    "github.com/firebase/genkit/go/genkit"
    "github.com/firebase/genkit/go/plugins/googlegenai"
)

type Input struct {
    Location string `json:"file" jsonschema_description:"The location to check weather"`
}

func main() {
    modelName := "googleai/gemini-2.0-flash"
    ctx := context.Background()

    g, err := genkit.Init(ctx,
        genkit.WithPlugins(&googlegenai.GoogleAI{}),
        genkit.WithDefaultModel(modelName),
    )
    if err != nil {
        panic(err)
    }

    weatherTool := genkit.DefineTool(g, "getWeather", "Get weather in a location",
        func(ctx *ai.ToolContext, i Input) (string, error) {
            return fmt.Sprintf("The weather in %s is 19 degrees and partly cloudy", i.Location), nil
        },
    )

    getWeatherFlow := genkit.DefineStreamingFlow(g, "getWeatherFlow",
        func(ctx context.Context, message string, callback core.StreamCallback[string]) (string, error) {
            resp, err := genkit.Generate(
                ctx,
                g,
                ai.WithTools(weatherTool),
                ai.WithSystem(`
You have access to a tool, getWeather that lets you get the current weather
in a location.
Always use tools if it is relevant to the user's request. Carefully
consider the user's input and determine if it contains inputs to your tools.
ALWAYS LET THE USER KNOW YOU ARE ABOUT TO USE TOOLS.
`),
                ai.WithPrompt(message),
                ai.WithStreaming(func(ctx context.Context, chunk *ai.ModelResponseChunk) error {
                    return callback(ctx, chunk.Text())
                }),
            )
            if err != nil {
                return "", err
            }
            return resp.Message.Text(), nil
        })

    streamCh := getWeatherFlow.Stream(ctx, "What is the weather in Vancouver?")
    for result, err := range streamCh {
        if err != nil {
            panic(err)
        }
        if result.Done {
            fmt.Printf("Final Message: %s \n", result.Output)
        } else {
            fmt.Printf("Received Token Chunk: %s \n", result.Stream)
        }
    }
}
Run it with go run (the googlegenai plugin needs a Gemini API key set in the environment).
Expected behavior
I would expect to see a stream of tokens followed by the complete output.
Runtime (please complete the following information):
- OS: macOS
- Version: 15.3.1
- Go version: 1.24.3
Additional context
This patch fixed the issue for me. I still need to figure out the best way to unit test the change (if anyone has tips, feel free to comment), but I can open a PR with the fix:
diff --git a/go/plugins/googlegenai/gemini.go b/go/plugins/googlegenai/gemini.go
index 89624e03..056ba94b 100644
--- a/go/plugins/googlegenai/gemini.go
+++ b/go/plugins/googlegenai/gemini.go
@@ -419,7 +419,7 @@ func generate(
 	// merge all streamed responses
 	var resp *genai.GenerateContentResponse
-	var chunks []string
+	var chunks []*genai.Part
 	for chunk, err := range iter {
 		// abort stream if error found in the iterator items
 		if err != nil {
@@ -434,7 +434,7 @@ func generate(
 				return nil, err
 			}
 			// stream only supports text
-			chunks = append(chunks, c.Content.Parts[i].Text)
+			chunks = append(chunks, c.Content.Parts[i])
 		}
 		// keep the last chunk for usage metadata
 		resp = chunk
@@ -445,7 +445,7 @@ func generate(
 	merged := []*genai.Candidate{
 		{
 			Content: &genai.Content{
-				Parts: []*genai.Part{genai.NewPartFromText(strings.Join(chunks, ""))},
+				Parts: chunks,
 			},
 		},
 	}
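A note on why the one-line type change matters: when the model decides to call a tool, the streamed part carries a function call rather than text, so its Text field is empty. Joining only the Text strings therefore produces a merged message whose first content part has no usable data, which is what the schema validator rejects. The snippet below is only an illustration of that difference, assuming the google.golang.org/genai types used in the diff; mergeTextOnly and mergeParts are hypothetical helpers, not plugin code.

package main

import (
    "fmt"
    "strings"

    "google.golang.org/genai" // assumed import path of the new go-genai SDK
)

// mergeTextOnly mirrors the pre-patch behavior: only the Text of each streamed
// part is kept, so a function-call part contributes an empty string and the
// tool request is lost from the merged response.
func mergeTextOnly(parts []*genai.Part) *genai.Content {
    var texts []string
    for _, p := range parts {
        texts = append(texts, p.Text)
    }
    return &genai.Content{
        Parts: []*genai.Part{genai.NewPartFromText(strings.Join(texts, ""))},
    }
}

// mergeParts mirrors the patched behavior: the streamed parts are kept as-is,
// so function-call parts survive the merge.
func mergeParts(parts []*genai.Part) *genai.Content {
    return &genai.Content{Parts: parts}
}

func main() {
    // A streamed chunk that contains a tool call instead of text (field names
    // are assumptions based on the go-genai SDK).
    parts := []*genai.Part{
        {FunctionCall: &genai.FunctionCall{Name: "getWeather", Args: map[string]any{"location": "Vancouver"}}},
    }
    fmt.Println("text-only merge keeps the call:", mergeTextOnly(parts).Parts[0].FunctionCall != nil) // false
    fmt.Println("part merge keeps the call:", mergeParts(parts).Parts[0].FunctionCall != nil)         // true
}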
Hello @quinlanjager.
Thanks for this finding. I was able to reproduce the issue you mentioned, applied your proposed patch to the codebase, and it works just fine.
When we first migrated to the new go-genai SDK it was very feature-limited, and it has been maturing since then. That's why we had the limitation of only streaming text parts.
For your PR: I'd suggest keeping only the patch you posted here in gemini.go and, if you want, moving the UTs into gemini_test.go. I have some pending changes on top of gemini.go that will require a refactor in the plugin and would remove the extra files you created.
Also, if you decide to move the UTs to gemini_test.go, make sure to avoid using assertion libraries, since we are trying to follow the internal Go style guide (https://g3doc.corp.google.com/go/g3doc/style/decisions.md?cl=head#assert).
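For reference, that recommendation boils down to plain comparisons plus t.Errorf/t.Fatalf instead of an assertion package. Here is a minimal, hypothetical table-driven example of that style; joinText is a stand-in helper to keep the example self-contained, not actual plugin code.

package googlegenai

import (
    "strings"
    "testing"
)

// joinText is a hypothetical helper standing in for whatever behavior ends up
// under test.
func joinText(chunks []string) string {
    return strings.Join(chunks, "")
}

func TestJoinText(t *testing.T) {
    tests := []struct {
        name   string
        chunks []string
        want   string
    }{
        {name: "empty", chunks: nil, want: ""},
        {name: "two chunks", chunks: []string{"foo", "bar"}, want: "foobar"},
    }
    for _, tc := range tests {
        t.Run(tc.name, func(t *testing.T) {
            // Plain comparison and t.Errorf rather than an assertion library.
            if got := joinText(tc.chunks); got != tc.want {
                t.Errorf("joinText(%q) = %q, want %q", tc.chunks, got, tc.want)
            }
        })
    }
}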
Hi @hugoaguirre,
Thank you for your feedback. I have updated my PR to just include the patch.
I'm going to omit the unit tests in my PR. To test this behaviour without splitting up the logic, I'd have to unit test generate, which I'm not particularly keen on, mostly because it would likely require an instance of genai.Client with a stubbed model. I'd rather take a pragmatic approach here and focus on getting the fix in.
Sounds good. I'll push a couple of tests on top of your PR in our live test suites for both the VertexAI and GoogleAI plugins to make sure we don't see this error again.
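For anyone following along, such a live test could look roughly like the sketch below, reusing the same genkit API as the reproduction above. The test name, model, and the GEMINI_API_KEY skip condition are assumptions; the tests actually added to the suites may differ.

package googlegenai_test

import (
    "context"
    "fmt"
    "os"
    "strings"
    "testing"

    "github.com/firebase/genkit/go/ai"
    "github.com/firebase/genkit/go/genkit"
    "github.com/firebase/genkit/go/plugins/googlegenai"
)

type weatherInput struct {
    Location string `json:"location" jsonschema_description:"The location to check weather"`
}

func TestStreamingWithTools(t *testing.T) {
    // Assumed env var; skip the live test when no API key is available.
    if os.Getenv("GEMINI_API_KEY") == "" {
        t.Skip("live test: GEMINI_API_KEY not set")
    }
    ctx := context.Background()
    g, err := genkit.Init(ctx,
        genkit.WithPlugins(&googlegenai.GoogleAI{}),
        genkit.WithDefaultModel("googleai/gemini-2.0-flash"),
    )
    if err != nil {
        t.Fatal(err)
    }
    weatherTool := genkit.DefineTool(g, "getWeather", "Get weather in a location",
        func(ctx *ai.ToolContext, in weatherInput) (string, error) {
            return fmt.Sprintf("The weather in %s is 19 degrees and partly cloudy", in.Location), nil
        },
    )
    var streamed strings.Builder
    resp, err := genkit.Generate(ctx, g,
        ai.WithTools(weatherTool),
        ai.WithPrompt("What is the weather in Vancouver?"),
        ai.WithStreaming(func(ctx context.Context, chunk *ai.ModelResponseChunk) error {
            streamed.WriteString(chunk.Text())
            return nil
        }),
    )
    if err != nil {
        // Before the patch, streaming with tools failed on this call.
        t.Fatalf("Generate with streaming and tools: %v", err)
    }
    if resp.Message.Text() == "" {
        t.Error("expected a non-empty final response")
    }
}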