redis-js
redis.set randomly stops working
It seems random, but sometimes when I set data in the Redis DB, it just doesn't save. I can see the key in the database, but not the value.
await redis.set(
  email,
  { name: name },
  {
    ex: 60 * 60 * 24,
    nx: true,
  }
);
when I check the data browser in my dashboard, I can find the key but the value looks like this:
{}
It happens randomly, and I cannot predict when or why.
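One thing worth ruling out (my assumption, not something confirmed in this thread): `@upstash/redis` serializes object values with `JSON.stringify`, and `JSON.stringify` silently drops fields whose value is `undefined`. So if `name` happens to be `undefined` at call time, the stored value collapses to exactly the empty `{}` shown above:

```typescript
// Assumption: the client JSON-serializes object values before storing them.
// JSON.stringify drops undefined fields, so { name: undefined } becomes "{}".
const userName: string | undefined = undefined;

const serialized = JSON.stringify({ name: userName });
console.log(serialized); // "{}" -- the field is silently dropped

// Guarding before the write avoids persisting an empty value:
if (userName !== undefined) {
  // await redis.set(email, { name: userName }, { ex: 60 * 60 * 24, nx: true });
}
```

If that's the cause, the fix is validating the payload before calling `redis.set`, not anything on the Redis side.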
In Next.js, if you don't await correctly in your serverless functions, lambdas sometimes don't work properly; maybe that's the case here? If it persists, contact us via [email protected].
How do you correctly use await?
What I meant was that floating promises (promises that you don't await) sometimes behave differently in serverless functions. Perhaps that was the issue?
I'm encountering a similar issue to the others.
When I use the Next.js runtime = "nodejs", my chatbot works properly. But when I use runtime = "edge", it works fine in the dev environment; however, once pushed to Vercel, the outcome is very unpredictable: sometimes the new chat is written to Upstash Redis and sometimes it isn't. Does anyone know how I can resolve this? Below is my api/chat/route.ts code...
// Imports added for completeness; exact package paths depend on the
// LangChain / Vercel AI SDK versions in use.
import { NextRequest, NextResponse } from "next/server";
import { LangChainStream, StreamingTextResponse, Message as VercelChatMessage } from "ai";
import { ChatOpenAI } from "@langchain/openai";
import {
  ChatPromptTemplate,
  SystemMessagePromptTemplate,
  HumanMessagePromptTemplate,
  MessagesPlaceholder,
} from "@langchain/core/prompts";
import { BufferMemory } from "langchain/memory";
import { ConversationChain } from "langchain/chains";
import { UpstashRedisChatMessageHistory } from "@langchain/community/stores/message/upstash_redis";
import { Redis } from "@upstash/redis";

export const runtime = "nodejs";
interface NextExtendedRequest extends NextRequest {
json: () => Promise<{
messages: VercelChatMessage[];
sessionId: string;
loadMessages: boolean;
}>;
}
export async function POST(req: NextExtendedRequest) {
try {
const { messages, sessionId } = await req.json();
const { stream, handlers } = LangChainStream();
const prompt = ChatPromptTemplate.fromMessages([
SystemMessagePromptTemplate.fromTemplate(
"You are a professor and your reply will be short and concise."
),
new MessagesPlaceholder("history"),
HumanMessagePromptTemplate.fromTemplate("{input}"),
]);
const memory = new BufferMemory({
chatHistory: new UpstashRedisChatMessageHistory({
sessionId: sessionId,
client: Redis.fromEnv(),
}),
aiPrefix: "assistant",
humanPrefix: "user",
memoryKey: "history",
returnMessages: true,
});
const model = new ChatOpenAI({
model: "gpt-3.5-turbo",
temperature: 0,
streaming: true,
apiKey: process.env.OPENAI_API_KEY,
});
const chain = new ConversationChain({ llm: model, memory, prompt });
const latestMessage = messages[messages.length - 1].content;
chain.call({
input: latestMessage,
callbacks: [handlers],
});
return new StreamingTextResponse(stream);
} catch (e: any) {
console.log("error", e);
return NextResponse.json({ error: e.message }, { status: e.status ?? 500 });
}
}
@myhendry can you try converting chain.call to chain.invoke, then await it? Sometimes floating promises in edge environments cause that issue.
@ogzhanolguncu hi, I'm using the runtime="nodejs" environment instead of runtime="edge". Let me try using invoke, so the code will be... I don't need the "await" keyword before chain.invoke, right? Thanks
chain.invoke({
input: latestMessage,
callbacks: [handlers],
});
No, if you are streaming you don't have to. If you are not streaming, you have to await until you get a result back.
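For anyone landing here later, a minimal sketch of the two patterns described above (`invoke` is a hypothetical stand-in for `chain.invoke`; the real chain setup is as in the snippet earlier in the thread):

```typescript
// Hypothetical stand-in for chain.invoke, to illustrate the await pattern.
async function invoke(args: { input: string }): Promise<string> {
  return `reply to: ${args.input}`;
}

// Non-streaming: await the call so the result exists before responding.
async function nonStreamingHandler(latestMessage: string): Promise<string> {
  const result = await invoke({ input: latestMessage });
  return result;
}

// Streaming: the call can be left un-awaited, because the handler returns
// the stream itself and the callbacks keep writing into it until done.
```

The design point is that in the streaming case the returned stream is what the platform waits on, whereas in the non-streaming case nothing ties the response to the pending promise, so you must await it yourself.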