Gemini 2.0 Flash Responses Can't Be Parsed

Open trajan0x opened this issue 8 months ago • 2 comments

This is a recent issue.

It probably first occurred yesterday or the day before (no updates on our end), and it is occurring very consistently (80%+ of the time).

Our client configuration is as follows:

// Configure Gemini as our replacement LLM provider
client<llm> Gemini2Flash {
  provider google-ai
  options {
    model "gemini-2.0-flash"
    api_key env.GEMINI_FLASH_API_KEY
  }
}

Most of the time that BAML uses Gemini Flash, we get the following error:

Error Message
Failed to parse into a response accepted by baml_runtime::internal::llm_client::primitive::google::types::GoogleResponse:

{
  "candidates": [
    {
      "content": { "role": "model" },
      "finishReason": "STOP"
    }
  ],
  "usageMetadata": {
    "promptTokenCount": 7826,
    "totalTokenCount": 7826,
    "promptTokensDetails": [
      { "modality": "TEXT", "tokenCount": 7826 }
    ]
  },
  "modelVersion": "gemini-2.0-flash"
}

Caused by:
    missing field `parts`
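
Note that candidates[0].content carries only "role" and no "parts" array, which matches serde's missing-field error: the runtime's response types evidently require parts. Here's a minimal sketch of the failing shape, using hypothetical type names modeled on the error path (the real definitions live inside baml_runtime); marking parts with #[serde(default)] would let the same payload deserialize to an empty candidate instead of erroring:

use serde::Deserialize;

// Hypothetical type names modeled on the error path; the real definitions
// live in baml_runtime::internal::llm_client::primitive::google::types.
#[derive(Debug, Deserialize)]
#[serde(rename_all = "camelCase")]
struct GoogleResponse {
    candidates: Vec<Candidate>,
}

#[derive(Debug, Deserialize)]
#[serde(rename_all = "camelCase")]
struct Candidate {
    content: Content,
    finish_reason: Option<String>,
}

#[derive(Debug, Deserialize)]
struct Content {
    role: String,
    // Without a default, serde fails with `missing field parts` on the
    // empty candidate above. With it, parsing succeeds and `parts` is
    // simply an empty vec.
    #[serde(default)]
    parts: Vec<Part>,
}

#[derive(Debug, Deserialize)]
struct Part {
    text: String,
}

fn main() -> Result<(), serde_json::Error> {
    // Trimmed-down copy of the payload from the error message.
    let raw = r#"{"candidates":[{"content":{"role":"model"},"finishReason":"STOP"}]}"#;
    let resp: GoogleResponse = serde_json::from_str(raw)?;
    // With #[serde(default)] this parses and parts is simply empty.
    assert!(resp.candidates[0].content.parts.is_empty());
    Ok(())
}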

~We are on version 0.77.0 (I'll post an update to let you know if this occurs on 0.83.0)~

Seeing this behavior on both 0.77.0 and 0.83.0.

Because there were no code changes on our end, we think this is probably a change on Gemini's side.

I'm also seeing several Gemini SDK changes being announced, so it seems likely that's the case:

https://x.com/btibor91/status/1909895821589458989

https://x.com/kimmonismus/status/1909903109876306192

trajan0x commented Apr 09 '25 10:04

Hmm, seems like it's related to Gemini triggering its safety filters: https://github.com/langchain-ai/langchainjs/issues/6371
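
If that's what's happening here, a tolerant extraction step would at least turn the serde failure into a readable error. A minimal sketch, reusing the hypothetical types from the sketch above (extract_text is illustrative, not a real baml_runtime function):

// Sketch only: treat an empty candidate as a model-side condition
// instead of a parse failure.
fn extract_text(resp: &GoogleResponse) -> Result<String, String> {
    let candidate = resp
        .candidates
        .first()
        .ok_or_else(|| "no candidates in response".to_string())?;
    if candidate.content.parts.is_empty() {
        // Per the linked langchainjs issue, empty candidates can show up
        // when safety filters trip, even with finishReason "STOP".
        return Err(format!(
            "Gemini returned an empty candidate (finishReason: {:?}); possibly blocked by safety filters",
            candidate.finish_reason
        ));
    }
    // Concatenate the text of all parts into the final response.
    Ok(candidate
        .content
        .parts
        .iter()
        .map(|p| p.text.as_str())
        .collect::<Vec<_>>()
        .join(""))
}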

aaronvg commented Apr 09 '25 16:04