
Only plain objects can be passed to Client Components from Server Components

Iven2132 opened this issue 1 year ago · 11 comments

Description

The server-action example in the AI SDK docs gives me the error Warning: Only plain objects can be passed to Client Components from Server Components.

Code example

"use server";

import { createStreamableValue } from "ai/rsc";
import { CoreMessage, streamText } from "ai";
import { createOpenAI } from "@ai-sdk/openai";

const together = createOpenAI({
  apiKey: "myapi",
  baseURL: "https://api.together.xyz/v1",
});

export async function continueConversation(messages: CoreMessage[]) {

  const result = await streamText({
    model: together.completion("mistralai/Mixtral-8x7B-Instruct-v0.1"),
    messages,
  });

  const stream = createStreamableValue(result.textStream);
  return stream.value;
}

'use client'

import { type CoreMessage } from 'ai'
import { useState } from 'react'
import { continueConversation } from '@/lib/chat';
import { readStreamableValue } from 'ai/rsc'

export default function Chat() {
  const [messages, setMessages] = useState<CoreMessage[]>([])
  const [input, setInput] = useState('')
  return (
    <div className="flex flex-col w-full max-w-md py-24 mx-auto stretch">
      {messages.map((m, i) => (
        <div key={i} className="whitespace-pre-wrap">
          {m.role === 'user' ? 'User: ' : 'AI: '}
          {m.content as string}
        </div>
      ))}

      <form
        action={async () => {
          const newMessages: CoreMessage[] = [
            ...messages,
            { content: input, role: 'user' }
          ]

          setMessages(newMessages)
          setInput('')

          const result = await continueConversation(newMessages)

          for await (const content of readStreamableValue(result)) {
            setMessages([
              ...newMessages,
              {
                role: 'assistant',
                content: content as string
              }
            ])
          }
        }}
      >
        <input
          className="fixed bottom-0 w-full max-w-md p-2 mb-8 border border-gray-300 rounded shadow-xl"
          value={input}
          placeholder="Say something..."
          onChange={e => setInput(e.target.value)}
        />
      </form>
    </div>
  )
}

Additional context

Using the latest version of the AI SDK (3.1.1).

Iven2132 · May 06 '24 14:05

I encountered the same issue after migrating to 3.1.x. I'm not sure if it has just gotten more strict and is no longer facilitating existing bad practices in my code, if I migrated wrong, or if this is just a bug.

My migration was the very straightforward swap from render => streamUI and unstable_onGet/SetState in the AI provider, and I tried both the direct { openai } import and createProvider from the ai-sdk for the new provider.

For me, the issue appears even when just returning a text response with nothing else provided as an option (no tool calls, etc.).

Loading chats seems to work for text responses, but the initial send throws the error after textStream finishes, according to the server console.

To be clear, it's the same error as @Iven2132's: 'Only plain objects can be passed @ stringify' or something like that, to paraphrase (I ended up rolling back to 3.0.x and didn't record the exact error).

Like I'm sure Iven is, I'm happy to test out any suggestions. If I get time I will put together a minimal reproduction and hopefully shed additional light.

shaded-blue · May 06 '24 16:05

@Iven2132 @shaded-blue which next.js versions are you using? I just tested the example on the latest next js and it worked for me.

lgrammel · May 07 '24 08:05

Using Next.js 14.2.3, AI SDK 3.1.1, and @ai-sdk/openai 0.0.9.

Iven2132 · May 07 '24 08:05

@Iven2132 hm I'm using the same. Can you share your layout.tsx content as well?

lgrammel · May 07 '24 09:05

Yes:


import type { Metadata } from "next";
import { Inter } from "next/font/google";
import "./globals.css";

const inter = Inter({ subsets: ["latin"] });

export const metadata: Metadata = {
  title: "Create Next App",
  description: "Generated by create next app",
};

export default function RootLayout({
  children,
}: Readonly<{
  children: React.ReactNode;
}>) {
  return (
    <html lang="en">
      <body className={inter.className}>{children}</body>
    </html>
  );
}

Iven2132 · May 07 '24 09:05

Strange. Here is my working setup:

actions.tsx

'use server';

import { createStreamableValue } from 'ai/rsc';
import { CoreMessage, streamText } from 'ai';
import { openai } from '@ai-sdk/openai';

export async function continueConversation(messages: CoreMessage[]) {
  const result = await streamText({
    model: openai('gpt-4-turbo'),
    messages,
  });

  const stream = createStreamableValue(result.textStream);
  return stream.value;
}

layout.tsx

import type { Metadata } from 'next';
import { Inter } from 'next/font/google';
import './globals.css';

const inter = Inter({ subsets: ['latin'] });

export const metadata: Metadata = {
  title: 'Create Next App',
  description: 'Generated by create next app',
};

export default function RootLayout({
  children,
}: Readonly<{
  children: React.ReactNode;
}>) {
  return (
    <html lang="en">
      <body className={inter.className}>{children}</body>
    </html>
  );
}

page.tsx

'use client';

import { type CoreMessage } from 'ai';
import { useState } from 'react';
import { continueConversation } from './actions';
import { readStreamableValue } from 'ai/rsc';

export default function Chat() {
  const [messages, setMessages] = useState<CoreMessage[]>([]);
  const [input, setInput] = useState('');
  return (
    <div className="flex flex-col w-full max-w-md py-24 mx-auto stretch">
      {messages.map((m, i) => (
        <div key={i} className="whitespace-pre-wrap">
          {m.role === 'user' ? 'User: ' : 'AI: '}
          {m.content as string}
        </div>
      ))}

      <form
        action={async () => {
          const newMessages: CoreMessage[] = [
            ...messages,
            { content: input, role: 'user' },
          ];

          setMessages(newMessages);
          setInput('');

          const result = await continueConversation(newMessages);

          for await (const content of readStreamableValue(result)) {
            setMessages([
              ...newMessages,
              {
                role: 'assistant',
                content: content as string,
              },
            ]);
          }
        }}
      >
        <input
          className="fixed bottom-0 w-full max-w-md p-2 mb-8 border border-gray-300 rounded shadow-xl"
          value={input}
          placeholder="Say something..."
          onChange={e => setInput(e.target.value)}
        />
      </form>
    </div>
  );
}

next: 14.2.3, react: 18.2.0, ai-sdk: 3.1.1

Can you try the above? It should be very similar to what you have.

lgrammel · May 07 '24 09:05


Can you share your project, like a GitHub repo?

Iven2132 · May 07 '24 09:05

I've modified examples/next-ai-rsc locally with the changes above.

lgrammel · May 07 '24 09:05

Here's the branch: https://github.com/vercel/ai/tree/lg/issue-1501/examples/next-ai-rsc

lgrammel · May 07 '24 09:05

The error only comes when I use mistralai/Mixtral-8x7B-Instruct-v0.1 from together.ai; the code works fine with gpt-3.5.

Can you please try using models from Together AI?

Iven2132 · May 07 '24 09:05

@lgrammel let me know if you can reproduce it

Iven2132 · May 09 '24 14:05

Could be unrelated, but I was running into this while creating an Azure OpenAI provider. The first chunk of the stream was failing Zod validation, which triggered this code:

if (!chunk.success) {
  controller.enqueue({ type: "error", error: chunk.error });
  return;
}

Sending over the full error resulted in the Only plain objects can be passed to Client Components from Server Components warning. I changed chunk.error to chunk.error.message and the warning was resolved.
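
A minimal sketch of that change, assuming the same transform callback shape as above:

if (!chunk.success) {
  // Enqueue only the serializable message string, not the full ZodError object.
  controller.enqueue({ type: "error", error: chunk.error.message });
  return;
}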

patrick-moore · May 25 '24 18:05

@patrick-moore interesting, this could be related.

lgrammel · May 27 '24 17:05

@Iven2132 I don't have together.ai access. Can you reproduce with another provider such as Fireworks or Groq?

lgrammel · May 27 '24 17:05

So I was able to reproduce this with the Together API, and there are a couple of things that cause the failure.

The Together API uses 'eos' as the finish_reason to mark the end of a text generation; however, the OpenAI provider you're proxying the endpoint through only supports 'stop' to mark the end of the stream, so there is a schema mismatch when it encounters 'eos' and an error is thrown.

This error is then returned as an Error object as part of the stream ({error: Error}), which is not a supported type that can be passed from a server action to a client component, which is why you're seeing the Only plain objects can be passed to Client Components from Server Components message.
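
In general, anything crossing the server/client boundary has to survive React's serialization, so class instances like Error need to be flattened into plain objects first. A hypothetical helper illustrating the idea:

// Hypothetical helper: flatten an Error into a plain, serializable object
// before passing it from a server action to a client component.
function toPlainError(error: unknown): { name: string; message: string } {
  return error instanceof Error
    ? { name: error.name, message: error.message }
    : { name: 'Error', message: String(error) };
}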

There are two ways to fix this:

Add Custom Provider

Create a custom provider for the Together API that supports its implementation details, so that 'eos' is treated as a valid finish reason.
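
A lightweight sketch of the idea, assuming your @ai-sdk/openai version accepts a custom fetch setting (and naively assuming the finish_reason field never splits across stream chunks):

import { createOpenAI } from '@ai-sdk/openai';

// Sketch: rewrite Together's non-standard 'eos' finish_reason to 'stop'
// before the OpenAI-compatible provider parses the SSE stream.
const together = createOpenAI({
  apiKey: process.env.TOGETHER_API_KEY!,
  baseURL: 'https://api.together.xyz/v1',
  fetch: async (url, init) => {
    const response = await fetch(url, init);
    if (!response.body) return response;

    const decoder = new TextDecoder();
    const encoder = new TextEncoder();
    const body = response.body.pipeThrough(
      new TransformStream<Uint8Array, Uint8Array>({
        transform(chunk, controller) {
          const text = decoder.decode(chunk, { stream: true });
          controller.enqueue(
            encoder.encode(
              text.replaceAll('"finish_reason":"eos"', '"finish_reason":"stop"'),
            ),
          );
        },
      }),
    );
    return new Response(body, {
      status: response.status,
      statusText: response.statusText,
      headers: response.headers,
    });
  },
});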

Try Catch Block

Add a try...catch block where you read from the stream and handle the error gracefully.

try {
  const result = await continueConversation(newMessages);

  for await (const content of readStreamableValue(result)) {
    setMessages([
      ...newMessages,
      {
        role: "assistant",
        content: content as string,
      },
    ]);
  }
} catch (error) {
  console.error(error);
}

jeremyphilemon · Jun 04 '24 16:06

I'm relaxing the Zod validation here: https://github.com/vercel/ai/pull/1835/files#diff-acdaca42c343ab5155c2f7f101611d05fff503a09e2c4ddf3992beffafa09f39 so it might work with upcoming OpenAI provider versions.
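
Roughly, the idea (a sketch of the direction, not the exact diff from the PR) is to stop validating finish_reason against a strict enum and tolerate unknown values:

import { z } from 'zod';

// Strict validation: fails on non-standard values such as Together's 'eos'.
const strictFinishReason = z.enum([
  'stop', 'length', 'tool_calls', 'content_filter', 'function_call',
]);

// Relaxed validation (sketch): accept any string (or null/undefined) and
// map unrecognized values to a fallback when converting the stream chunk.
const relaxedFinishReason = z.string().nullish();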

lgrammel · Jun 05 '24 07:06