Getting intermittent 500 internal error when using streamObject with Gemini and Google Generative AI Provider
Description
Hi all. I'm having a great time using the Vercel AI SDK with OpenAI. However, I'm running into a few blocking issues with Gemini and the Google Generative AI provider when using streamObject.
One of those issues is that the streamObject call fails about 90% of the time. This is the error I receive:
RetryError [AI_RetryError]: Failed after 3 attempts. Last error: An internal error has occurred. Please retry or report in https://developers.generativeai.google/guide/troubleshooting
at _retryWithExponentialBackoff (webpack-internal:///(action-browser)/./node_modules/.pnpm/[email protected][email protected][email protected][email protected][email protected][email protected]/node_modules/ai/dist/index.mjs:280:13)
at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
at async fn (webpack-internal:///(action-browser)/./node_modules/.pnpm/[email protected][email protected][email protected][email protected][email protected][email protected]/node_modules/ai/dist/index.mjs:1431:11)
at async eval (webpack-internal:///(action-browser)/./node_modules/.pnpm/[email protected][email protected][email protected][email protected][email protected][email protected]/node_modules/ai/dist/index.mjs:215:22)
at async eval (webpack-internal:///(action-browser)/./src/app/vercel/gemini/actions.tsx:50:45) {
reason: 'maxRetriesExceeded',
errors: [
APICallError [AI_APICallError]: An internal error has occurred. Please retry or report in https://developers.generativeai.google/guide/troubleshooting
at eval (webpack-internal:///(action-browser)/./node_modules/.pnpm/@[email protected][email protected]/node_modules/@ai-sdk/provider-utils/dist/index.mjs:437:14)
at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
at async postToApi (webpack-internal:///(action-browser)/./node_modules/.pnpm/@[email protected][email protected]/node_modules/@ai-sdk/provider-utils/dist/index.mjs:344:28)
at async GoogleGenerativeAILanguageModel.doStream (webpack-internal:///(action-browser)/./node_modules/.pnpm/@[email protected][email protected]/node_modules/@ai-sdk/google/dist/index.mjs:301:50)
at async fn (webpack-internal:///(action-browser)/./node_modules/.pnpm/[email protected][email protected][email protected][email protected][email protected][email protected]/node_modules/ai/dist/index.mjs:1444:23)
at async eval (webpack-internal:///(action-browser)/./node_modules/.pnpm/[email protected][email protected][email protected][email protected][email protected][email protected]/node_modules/ai/dist/index.mjs:215:22)
at async _retryWithExponentialBackoff (webpack-internal:///(action-browser)/./node_modules/.pnpm/[email protected][email protected][email protected][email protected][email protected][email protected]/node_modules/ai/dist/index.mjs:268:12)
at async fn (webpack-internal:///(action-browser)/./node_modules/.pnpm/[email protected][email protected][email protected][email protected][email protected][email protected]/node_modules/ai/dist/index.mjs:1431:11)
at async eval (webpack-internal:///(action-browser)/./node_modules/.pnpm/[email protected][email protected][email protected][email protected][email protected][email protected]/node_modules/ai/dist/index.mjs:215:22)
at async eval (webpack-internal:///(action-browser)/./src/app/vercel/gemini/actions.tsx:50:45) {
url: 'https://generativelanguage.googleapis.com/v1beta/models/gemini-1.5-pro-latest:streamGenerateContent?alt=sse',
requestBodyValues: [Object],
statusCode: 500,
responseHeaders: [Object],
responseBody: '{\n' +
' "error": {\n' +
' "code": 500,\n' +
' "message": "An internal error has occurred. Please retry or report in https://developers.generativeai.google/guide/troubleshooting",\n' +
' "status": "INTERNAL"\n' +
' }\n' +
'}\n',
cause: undefined,
isRetryable: true,
data: [Object]
}
]
Everything's fine when I use streamText.
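For comparison, here's roughly the streamText version that works for me (a minimal sketch; the function name and API key wiring are illustrative, not verbatim from my app):

import { createGoogleGenerativeAI } from '@ai-sdk/google';
import { streamText } from 'ai';

export async function generatePlainText() {
  'use server';
  // Same client and model as the failing streamObject example below.
  const client = createGoogleGenerativeAI({ apiKey: process.env.API_KEY });

  const { textStream } = await streamText({
    model: client('models/gemini-1.5-pro-latest'),
    system: 'You are a writing expert',
    prompt: 'Write a 100 words lorem ipsum',
  });

  let text = '';
  for await (const delta of textStream) {
    text += delta;
  }
  return text;
}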
Code example
import { streamObject } from 'ai';
import { createStreamableValue } from 'ai/rsc';
import { createGoogleGenerativeAI } from '@ai-sdk/google';
import { z } from 'zod';
import { env } from '@/env'; // adjust: wherever your validated env (API_KEY) lives

export async function generate() {
  'use server';

  const system = 'You are a writing expert';
  const prompt = 'Write a 100 words lorem ipsum';
  const schema = z.object({
    summary: z.string().describe('Short summary of the prompt'),
    output: z.array(
      z.object({
        title: z.string().describe('The title of the writing'),
        content: z.string().describe('The content of the writing'),
      })
    ),
  });

  const stream = createStreamableValue();

  // Kick off the generation in the background and stream partial objects
  // back through the streamable value.
  (async () => {
    try {
      const client = createGoogleGenerativeAI({
        apiKey: env.API_KEY,
      });
      const model = client('models/gemini-1.5-pro-latest');

      const { partialObjectStream } = await streamObject({
        model,
        system,
        prompt,
        schema,
      });

      for await (const partialObject of partialObjectStream) {
        stream.update(partialObject);
      }
      stream.done();
    } catch (e) {
      console.error(e);
      stream.done();
    }
  })();

  return { object: stream.value };
}
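A minimal sketch of how the returned streamable value can be consumed on the client (the component name and import path are illustrative):

'use client';

import { readStreamableValue } from 'ai/rsc';
import { useState } from 'react';
import { generate } from './actions';

export function GenerateButton() {
  const [partialObject, setPartialObject] = useState<unknown>();

  return (
    <div>
      <button
        onClick={async () => {
          const { object } = await generate();
          // Stream partial objects from the server action as they arrive.
          for await (const partial of readStreamableValue(object)) {
            setPartialObject(partial);
          }
        }}
      >
        Generate
      </button>
      <pre>{JSON.stringify(partialObject, null, 2)}</pre>
    </div>
  );
}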
Additional context
My deps
"dependencies": {
"@ai-sdk/google": "^0.0.30",
"@ai-sdk/openai": "^0.0.40",
"ai": "^3.2.37",
"next": "14.2.5",
"openai": "^4.53.2"
}
Same here, trying to use streamObject with Gemini 1.5 Flash
Same here bruh, where are the fixes?
Hi @lgrammel. Do you have any updates on this?
If this is a 500 from Google that appears only part of the time, it's an issue with Google, not the AI SDK.
Do you have a concrete scenario that reproduces the error 100% of the time?
Hi everyone, I get the error 100% of the time when I use the 'models/gemini-1.5-flash-latest' model with streamObject or generateObject, but it works well with the other models. Here is an example of an action I use:
'use server';

import { google } from '@ai-sdk/google';
import { generateObject } from 'ai';
import { z } from 'zod';

export const streamCourseObject = async ({
  promptContent,
}: {
  promptContent: string;
}) => {
  'use server';
  try {
    const { object } = await generateObject({
      model: google('models/gemini-1.5-flash-latest'),
      prompt: `${process.env.COURSE_PROMPT}${promptContent}`,
      schema: z.object({
        course: z.object({
          resume: z.array(
            z.object({ htmlContent: z.string(), timestamp: z.number() }),
          ),
          keyConcepts: z.array(z.string()),
        }),
      }),
    });
    return { course: object.course };
  } catch (error) {
    console.error('Error in streamCourseObject:', error);
    return {
      error:
        "Sorry, we couldn't generate the course object. Please retry later",
    };
  }
};
Thanks
@Ahmed-OC if it works with the other models but not with Flash, then it's an issue on the Google side. If it were an issue with the AI SDK, it would not work with the other models either.
Hey guys,
The solution for me was to add mode: 'json' to the parameters.
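A minimal sketch of what that looks like against the example above (only the mode option is new; the function name and API key handling are illustrative):

import { createGoogleGenerativeAI } from '@ai-sdk/google';
import { streamObject } from 'ai';
import { z } from 'zod';

export async function generateWithJsonMode() {
  'use server';
  const client = createGoogleGenerativeAI({ apiKey: process.env.API_KEY });

  const { partialObjectStream } = await streamObject({
    model: client('models/gemini-1.5-pro-latest'),
    mode: 'json', // <- the workaround: explicitly select JSON mode
    system: 'You are a writing expert',
    prompt: 'Write a 100 words lorem ipsum',
    schema: z.object({
      summary: z.string().describe('Short summary of the prompt'),
      output: z.array(
        z.object({
          title: z.string().describe('The title of the writing'),
          content: z.string().describe('The content of the writing'),
        })
      ),
    }),
  });

  for await (const partialObject of partialObjectStream) {
    console.log(partialObject);
  }
}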
@Ahmed-OC great, I'll switch the default mode to json https://github.com/vercel/ai/pull/2691
Updated default mode to json