
NotFoundError: 404 Cannot POST /llm/v1/chat/completions

jin-kazama-codes opened this issue 10 months ago • 2 comments

My code was working fine for around two months, but now I have suddenly started encountering an error when asking a question in chat. The error comes back in the response from http://localhost:3000/api/chat-stream:

```ts
import { RAGChat, upstash } from "@upstash/rag-chat";
import { redis } from "./redis";

export const ragChat = new RAGChat({
  model: upstash("mistralai/Mistral-7B-Instruct-v0.2", {
    maxTokens: 512,
    temperature: 0.7,
    topP: 0.9,
  }),
  redis: redis,
});
```

```
NotFoundError: 404 Cannot POST /llm/v1/chat/completions
    at APIError.generate (webpack-internal:///(rsc)/./node_modules/openai/error.mjs:69:20)
    at OpenAI.makeStatusError (webpack-internal:///(rsc)/./node_modules/openai/core.mjs:333:65)
    at OpenAI.makeRequest (webpack-internal:///(rsc)/./node_modules/openai/core.mjs:377:30)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async eval (webpack-internal:///(rsc)/./node_modules/@langchain/openai/dist/chat_models.js:1378:29)
    at async RetryOperation.eval [as _fn] (webpack-internal:///(rsc)/./node_modules/p-retry/index.js:50:12) {
  status: 404,
```


Here is the route handler:

```ts
import { ragChat } from "@/lib/rag-chat";
import { aiUseChatAdapter } from "@upstash/rag-chat/nextjs";
import { NextRequest } from "next/server";

export const POST = async (req: NextRequest) => {
  const { messages, sessionId } = await req.json();

  const lastMessage = messages[messages.length - 1].content;

  const response = await ragChat.chat(lastMessage, { streaming: true, sessionId });
  console.log("response", response);
  return aiUseChatAdapter(response);
};
```

jin-kazama-codes • Apr 17 '25 11:04

Did you find a solution?

Dru-429 • Jun 20 '25 14:06

Hello. We stopped hosting LLM models, so the Upstash-hosted endpoint now returns 404. Please switch to OpenRouter or any other LLM provider.

enesakar • Oct 01 '25 16:10
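For anyone landing here later: the migration suggested above can be sketched with rag-chat's `custom` model helper, which points the client at an OpenAI-compatible endpoint. This is a sketch under assumptions, not verified config: the `custom` helper signature, the model slug, the `OPENROUTER_API_KEY` env var name, and the OpenRouter base URL should all be checked against the current rag-chat and OpenRouter docs.

```typescript
// lib/rag-chat.ts — sketch of swapping the retired Upstash-hosted model
// for an OpenAI-compatible provider (OpenRouter shown as one option).
// Assumes @upstash/rag-chat exports a `custom` helper taking a model name
// plus { apiKey, baseUrl }; verify against the library's documentation.
import { RAGChat, custom } from "@upstash/rag-chat";
import { redis } from "./redis";

export const ragChat = new RAGChat({
  // Hypothetical model slug and base URL — adapt to your provider.
  model: custom("mistralai/mistral-7b-instruct", {
    apiKey: process.env.OPENROUTER_API_KEY!, // hypothetical env var name
    baseUrl: "https://openrouter.ai/api/v1",
  }),
  redis: redis,
});
```

The rest of the route handler (`ragChat.chat(...)` with `aiUseChatAdapter`) should not need to change, since only the model backend is swapped.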