
redis.set randomly stops working

Open jacobtt21 opened this issue 11 months ago • 10 comments

It's seemingly random, but sometimes when I set some data in the Redis DB, it just doesn't save. I see the key in the database but not the value.

await redis.set(
  email,
  { name: name },
  {
    ex: 60 * 60 * 24, // expire after 24 hours
    nx: true,         // only set if the key does not already exist
  }
);

When I check the data browser in my dashboard, I can find the key, but the value looks like this:

{}

jacobtt21 commented Mar 22 '24

It happens randomly, and I can't predict when or why.

jacobtt21 commented Mar 22 '24

In Next.js, if you don't await promises correctly in your serverless functions, lambdas sometimes don't behave properly. Maybe that's the case? If it persists, contact us via [email protected].

ogzhanolguncu commented Mar 25 '24

How do you correctly use await?

jacobtt21 commented Mar 25 '24

What I meant was that floating promises (promises that you don't await) sometimes behave differently in serverless functions. Perhaps that was the issue?
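
For example, here's a minimal sketch of the difference in a Next.js route handler (not from this thread; the request shape and key are made up for illustration):

import { Redis } from "@upstash/redis";

export async function POST(req: Request) {
  const redis = Redis.fromEnv();
  const { email, name } = await req.json();

  // Floating promise: the serverless runtime may freeze or tear the function
  // down as soon as the response is returned, so this write can be lost.
  redis.set(email, { name });

  // Awaited: the write completes before the handler returns.
  await redis.set(email, { name }, { ex: 60 * 60 * 24, nx: true });

  return Response.json({ ok: true });
}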

ogzhanolguncu commented Mar 26 '24

I'm encountering a similar issue to the others.

When I use the Next.js runtime = "nodejs", my chatbot works properly. When I use runtime = "edge", it works fine in the dev environment, but once pushed to Vercel it leads to very unpredictable outcomes: sometimes the new chat is written into Upstash Redis and sometimes it isn't. Does anyone know how I can resolve this issue? Below is my api/chat/route.ts code...

// Imports assume the split @langchain/* packages and the Vercel AI SDK ("ai").
import { NextRequest, NextResponse } from "next/server";
import {
  LangChainStream,
  StreamingTextResponse,
  Message as VercelChatMessage,
} from "ai";
import {
  ChatPromptTemplate,
  HumanMessagePromptTemplate,
  MessagesPlaceholder,
  SystemMessagePromptTemplate,
} from "@langchain/core/prompts";
import { BufferMemory } from "langchain/memory";
import { ConversationChain } from "langchain/chains";
import { UpstashRedisChatMessageHistory } from "@langchain/community/stores/message/upstash_redis";
import { ChatOpenAI } from "@langchain/openai";
import { Redis } from "@upstash/redis";

export const runtime = "nodejs";

interface NextExtendedRequest extends NextRequest {
  json: () => Promise<{
    messages: VercelChatMessage[];
    sessionId: string;
    loadMessages: boolean;
  }>;
}

export async function POST(req: NextExtendedRequest) {
  try {
    const { messages, sessionId } = await req.json();

    // Bridges LangChain callbacks to a ReadableStream for the response.
    const { stream, handlers } = LangChainStream();

    const prompt = ChatPromptTemplate.fromMessages([
      SystemMessagePromptTemplate.fromTemplate(
        "You are a professor and your reply will be short and concise."
      ),
      new MessagesPlaceholder("history"),
      HumanMessagePromptTemplate.fromTemplate("{input}"),
    ]);

    // Conversation history is persisted per session in Upstash Redis.
    const memory = new BufferMemory({
      chatHistory: new UpstashRedisChatMessageHistory({
        sessionId: sessionId,
        client: Redis.fromEnv(),
      }),
      aiPrefix: "assistant",
      humanPrefix: "user",
      memoryKey: "history",
      returnMessages: true,
    });

    const model = new ChatOpenAI({
      model: "gpt-3.5-turbo",
      temperature: 0,
      streaming: true,
      apiKey: process.env.OPENAI_API_KEY,
    });

    const chain = new ConversationChain({ llm: model, memory, prompt });

    const latestMessage = messages[messages.length - 1].content;

    // Note: this promise is not awaited; tokens are pushed through
    // `handlers` into `stream` while the response below starts streaming.
    chain.call({
      input: latestMessage,
      callbacks: [handlers],
    });

    return new StreamingTextResponse(stream);
  } catch (e: any) {
    console.log("error", e);
    return NextResponse.json({ error: e.message }, { status: e.status ?? 500 });
  }
}

myhendry commented Apr 21 '24

@myhendry can you try converting chain.call to chain.invoke and then awaiting it? Sometimes floating promises in edge environments cause that issue.
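
Something like this (a sketch using the chain and latestMessage from your code above):

const result = await chain.invoke({
  input: latestMessage,
  callbacks: [handlers],
});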

ogzhanolguncu commented Apr 25 '24

> @myhendry can you try converting chain.call to chain.invoke and then awaiting it? Sometimes floating promises in edge environments cause that issue.

@ogzhanolguncu Hi, I'm using the runtime = "nodejs" environment instead of runtime = "edge". Let me try using invoke, so the code will be as below. I don't need the "await" keyword before chain.invoke, right? Thanks

chain.invoke({
  input: latestMessage,
  callbacks: [handlers],
});

myhendry commented Apr 26 '24

No, if you are streaming you don't have to. If you are not streaming, you have to await until you get the result back.
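
Roughly (a sketch reusing the names from the code above; the shape of the non-streaming result is an assumption based on ConversationChain's default output key):

// Streaming: the handlers feed tokens into `stream`, so the call can keep
// running while the response streams back to the client.
chain.invoke({ input: latestMessage, callbacks: [handlers] });
return new StreamingTextResponse(stream);

// Not streaming: await the full result before building the response.
const result = await chain.invoke({ input: latestMessage });
return NextResponse.json({ output: result.response });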

ogzhanolguncu commented Apr 26 '24