NotFoundError: 404 Cannot POST /llm/v1/chat/completions
My code was working perfectly fine for around two months, but now I have suddenly started encountering an error when asking a question in chat. I get the following error in the response from http://localhost:3000/api/chat-stream:
```ts
import { RAGChat, upstash } from "@upstash/rag-chat";
import { redis } from "./redis";

export const ragChat = new RAGChat({
  model: upstash("mistralai/Mistral-7B-Instruct-v0.2", {
    maxTokens: 512,
    temperature: 0.7,
    topP: 0.9,
  }),
  redis: redis,
});
```
```
NotFoundError: 404 Cannot POST /llm/v1/chat/completions
    at APIError.generate (webpack-internal:///(rsc)/./node_modules/openai/error.mjs:69:20)
    at OpenAI.makeStatusError (webpack-internal:///(rsc)/./node_modules/openai/core.mjs:333:65)
    at OpenAI.makeRequest (webpack-internal:///(rsc)/./node_modules/openai/core.mjs:377:30)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async eval (webpack-internal:///(rsc)/./node_modules/@langchain/openai/dist/chat_models.js:1378:29)
    at async RetryOperation.eval [as _fn] (webpack-internal:///(rsc)/./node_modules/p-retry/index.js:50:12) {
  status: 404,
```
```ts
import { ragChat } from "@/lib/rag-chat";
import { aiUseChatAdapter } from "@upstash/rag-chat/nextjs";
import { NextRequest } from "next/server";

export const POST = async (req: NextRequest) => {
  const { messages, sessionId } = await req.json();
  const lastMessage = messages[messages.length - 1].content;

  const response = await ragChat.chat(lastMessage, { streaming: true, sessionId });
  console.log("response", response);
  return aiUseChatAdapter(response);
};
```
Do you have any solution?
Hello. We stopped hosting LLM models. Please use OpenRouter or any other LLM provider.
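In practice, switching means pointing RAGChat at an OpenAI-compatible endpoint instead of the retired Upstash-hosted models. Below is a minimal sketch using OpenRouter, assuming your version of `@upstash/rag-chat` exports a `custom` model helper that accepts a `baseUrl` and `apiKey` (verify the exact helper name and options against the library's docs for your installed version); the model slug and `OPENROUTER_API_KEY` env var are illustrative:

```ts
// lib/rag-chat.ts -- sketch, not a definitive implementation.
// Assumes `custom` exists in your @upstash/rag-chat version and takes an
// OpenAI-compatible baseUrl/apiKey; check the docs before relying on this.
import { RAGChat, custom } from "@upstash/rag-chat";
import { redis } from "./redis";

export const ragChat = new RAGChat({
  model: custom("mistralai/mistral-7b-instruct", {
    // OpenRouter exposes an OpenAI-compatible API at this base URL.
    baseUrl: "https://openrouter.ai/api/v1",
    apiKey: process.env.OPENROUTER_API_KEY!,
  }),
  redis: redis,
});
```

The route handler should not need changes, since the provider is configured in one place and `ragChat.chat(...)` keeps the same call shape.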