Don't you support langchainjs?
We do support LangchainJS. You can use the following snippet to use the Portkey-managed version:
import { ChatOpenAI } from "@langchain/openai";

const model = new ChatOpenAI({
  modelName: "gpt-3.5-turbo",
  openAIApiKey: "xxx",
  configuration: {
    // Route requests through Portkey's hosted gateway instead of api.openai.com
    baseURL: "https://api.portkey.ai/v1",
    defaultHeaders: {
      "x-portkey-api-key": "xxx",
      "x-portkey-virtual-key": "openai-virtual-key",
    },
  },
});
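As a quick sanity check, you can call the model like any other LangChain chat model. This is just a sketch with placeholder prompts, assuming the `model` from the snippet above:

// Single-shot call: the request should also show up in your Portkey logs.
const reply = await model.invoke("Say hello in one word.");
console.log(reply.content);

// Streaming call: chunks should arrive incrementally if streaming works end to end.
const stream = await model.stream("Count from 1 to 5.");
for await (const chunk of stream) {
  process.stdout.write(String(chunk.content));
}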
Please let me know if you face any issues with this.
Hey @zandko - Did this snippet work for you?
There are some problems with streaming on Tencent Cloud serverless cloud functions. The deployment works and the function can be accessed, but the content comes back all at once instead of being streamed.
@zandko do you mean that you are not able to use Portkey with Tencent Cloud's Langchain integration? Would it be possible to share a code snippet?
export const createOpenai = () => {
  return new OpenAI({
    baseURL: 'xxxxx',
    apiKey: 'xxx',
    defaultHeaders: {
      "x-portkey-provider": "openai",
    },
  });
};

const response = await openai.chat.completions.create({
  messages,
  ...params,
  stream: true,
} as unknown as OpenAI.ChatCompletionCreateParamsStreaming);

const stream = OpenAIStream(response);
return new StreamingTextResponse(stream, { headers: { ...ResponseHeaders } });
However, please note that the Langchain <> Portkey integration works and is tested for OpenAI only.
Are you looking to use the open-source gateway or the hosted API (which is at https://api.portkey.ai/v1)?
My app is deployed as a Tencent Cloud serverless cloud function.
Can you share the exact error?
No errors, the streaming transmission is just missing.
The left side goes through your gateway, and the right side uses the OpenAI API directly. You can see that the content on the left comes back all at once, while the direct OpenAI API call has a streaming effect.
// Tail LangChain's streamLog output and print each ChatOpenAI token chunk with a timestamp
const logStream = await this.sales_agent_executor.streamLog(inputs);
for await (const chunk of logStream) {
  if (chunk.ops?.length > 0 && chunk.ops[0].op === 'add') {
    const addOp = chunk.ops[0];
    // Only log string patches added under the ChatOpenAI run's log path
    if (addOp.path.startsWith('/logs/ChatOpenAI') && typeof addOp.value === 'string' && addOp.value.length) {
      console.log(Date.now(), addOp.value, '===================');
    }
  }
}
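In case it helps narrow this down, here is a minimal probe (an untested sketch; the endpoint, keys, and model below are placeholders/assumptions) that calls the chat-completions endpoint directly with fetch and logs raw chunks as they arrive:

// Minimal streaming probe. Endpoint, headers, and model are assumptions; adjust to your setup.
const res = await fetch("https://api.portkey.ai/v1/chat/completions", {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    "x-portkey-api-key": "xxx",
    "x-portkey-virtual-key": "openai-virtual-key",
  },
  body: JSON.stringify({
    model: "gpt-3.5-turbo",
    stream: true,
    messages: [{ role: "user", content: "Count from 1 to 5." }],
  }),
});

// If streaming works end to end, chunks log one by one with increasing timestamps;
// if something buffers the response, a single large chunk shows up at the end.
const reader = res.body!.getReader();
const decoder = new TextDecoder();
while (true) {
  const { done, value } = await reader.read();
  if (done) break;
  console.log(Date.now(), decoder.decode(value, { stream: true }));
}

If this probe streams when run locally but not from the cloud function, the buffering is likely happening in the serverless runtime or its fronting API gateway layer rather than in Portkey.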