
Router cache issue with Next.js and Vercel AI SDK

Open aneequrrehman opened this issue 1 year ago • 5 comments

Description

First of all, thank you for this library - it's great.

Secondly, this might not be a bug and could just be something I'm missing. I'm using the Vercel AI SDK with Next.js and running into an issue with client-side caching. When a user sends a message, the AI response is streamed to the browser, but if the user navigates away and comes back to the same page, the AI response (the last message) is not shown because of the client-side Router cache.

Implementation

I'm loading the messages via a server action, using the conversationId from the URL. This data sets the initial AI and UI state via the initialAIState and initialUIState props on the provider returned by the createAI function.


export default async function ChatPage({ params: { id } }) {
    const aiConversation = await getAiConversation(id)

    if (aiConversation === null) {
        return <div>Chat not found...</div>
    }

    return (
        <ChatAiProvider
            initialAIState={{
                aiConversationId: aiConversation.id,
                messages: aiConversation.messages.map((m) => ({
                    role: m.from === 'ai' ? 'assistant' : 'user',
                    content: m.content,
                })),
            }}
            initialUIState={aiConversation.messages.map((m) => ({
                id: m.id,
                display: m.from === 'ai' ? (
                    <AiMessage>
                        <Markdown>{m.content}</Markdown>
                    </AiMessage>
                ) : (
                    <UserMessage>{m.content}</UserMessage>
                ),
            }))}
        >
            <Chat />
        </ChatAiProvider>
    )
}

I do call revalidatePath to invalidate the client cache when the user sends a message, but the issue arises with the AI's response to that message, since it is streamed to the browser and never goes through a server-side navigation that could be revalidated.
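
For clarity, here's roughly what that submit-side invalidation looks like - a minimal sketch, where sendMessage, saveMessage, and the /chat/[id] path are illustrative names rather than my exact code:

'use server'

import { revalidatePath } from 'next/cache'

// Illustrative server action: persist the user's message, then invalidate
// the client-side Router cache for this conversation's page.
export async function sendMessage(conversationId: string, content: string) {
    await saveMessage(conversationId, content) // hypothetical persistence helper

    // This works for the user's own message because we're on the server,
    // but the AI reply is streamed to the client afterwards, so it never
    // passes through this code path.
    revalidatePath(`/chat/${conversationId}`)
}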

I tried moving it to the client side, but createAI only seems to work on the server.

Update:

I could return an InvalidateCache component, but that feels like a hack - and if the user navigates away before the stream finishes, the client-caching problem persists anyway.

...
const ui = render({
    model: 'gpt-3.5-turbo',
    provider: openai,
    messages: aiState.get().messages,
    text: ({ content, done }) => {
        if (done) {
            aiState.done({
                ...aiState.get(),
                messages: [
                    ...aiState.get().messages,
                    {
                        role: 'assistant',
                        content,
                    },
                ],
            })
        }

        return (
            <>
                <AiMessage>
                    <Markdown>{content}</Markdown>
                </AiMessage>
                {/* this component can invalidate the client cache using router.refresh() */}
                <InvalidateCache />
            </>
        )
    },
})

return {
    id: Date.now().toString(),
    display: ui,
}
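
For reference, InvalidateCache is essentially a client component that calls router.refresh() once it reaches the browser - a rough sketch, where the refresh-on-mount detail is an assumption about the implementation:

'use client'

import { useEffect } from 'react'
import { useRouter } from 'next/navigation'

// Rough sketch: refresh the client-side Router cache as soon as this
// component is streamed to the browser and mounts.
export function InvalidateCache() {
    const router = useRouter()

    useEffect(() => {
        router.refresh()
    }, [router])

    return null
}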

So, I was wondering: is there a plan to handle these scenarios, or am I doing something wrong here?

aneequrrehman avatar Apr 29 '24 04:04 aneequrrehman

I'm having the same issue, and in fact the Vercel AI Chatbot also has the same bug (to repro: create two chats, type in one, click the other and come back).

So far the only thing that has worked for me is manually refreshing the client router, as the OP suggested. The problem exists in [email protected], and also with the streamUI method.

jennnx avatar Jun 08 '24 04:06 jennnx

I am seeing the same issue and created a bug report for ai-chatbot. I am using ai version 3.0.12. While debugging, I noticed an issue with the useUIState hook: the provider created by createAI receives the updated values, but useUIState does not return the updated state.
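
For reference, the consuming component looks roughly like this (component and state shape are illustrative); after navigating back, it renders the cached UI state without the last AI message:

'use client'

import { useUIState } from 'ai/rsc'

// Illustrative consumer of the UI state managed by the provider from createAI.
export function ChatMessages() {
    const [messages] = useUIState()

    return (
        <div>
            {messages.map((message) => (
                <div key={message.id}>{message.display}</div>
            ))}
        </div>
    )
}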

pravinjakfs avatar Jun 14 '24 18:06 pravinjakfs

Having the same issue here. Using ai version 3.1.1. Would be great to get some response from Vercel on this!

parikkap avatar Aug 05 '24 06:08 parikkap

Same issue here as well, using ai version 3.1.1 and Next.js 14.2.3.

PetteriSuominen avatar Aug 05 '24 08:08 PetteriSuominen

Sending additional messages will cause the previous messages to render; the problem is that the last message received from the AI will not render when returning to the page.

davidjonesdialexa avatar Sep 25 '24 20:09 davidjonesdialexa