openai-node
Importing & using AssistantStream breaks Angular SSR
Confirm this is a Node library issue and not an underlying OpenAI API issue
- [X] This is an issue with the Node library
Describe the bug
Just by adding the following lines to my new Angular project, I get an error:
import { AssistantStream } from 'openai/lib/AssistantStream';
...
const stream = AssistantStream.fromReadableStream(response.body);
On compiling the code I get the following error:
7:34:19 PM [vite] Error when evaluating SSR module /main.server.mjs:
2024-06-12T19:34:19.534881356Z |- TypeError: Cannot convert undefined or null to object
2024-06-12T19:34:19.534883506Z at Function.getPrototypeOf (<anonymous>)
2024-06-12T19:34:19.534885641Z at eval (/usr/src/app/node_modules/web-streams-polyfill/dist/ponyfill.es2018.js:538:43)
2024-06-12T19:34:19.534887281Z at eval (/usr/src/app/node_modules/web-streams-polyfill/dist/ponyfill.es2018.js:9:68)
2024-06-12T19:34:19.534888777Z at node_modules/web-streams-polyfill/dist/ponyfill.es2018.js (/usr/src/app/node_modules/web-streams-polyfill/dist/ponyfill.es2018.js:12:1)
2024-06-12T19:34:19.534890467Z at __require2 (/usr/src/app/.angular/vite-root/chatsite/chunk-QU2RNZFD.mjs:51:50)
2024-06-12T19:34:19.534891964Z at eval (/usr/src/app/node_modules/openai/_shims/node-runtime.mjs:9:32)
2024-06-12T19:34:19.534893442Z at async instantiateModule (file:///usr/src/app/node_modules/vite/dist/node/chunks/dep-cNe07EU9.js:55058:9)
2024-06-12T19:34:19.534894944Z
I'm not sure whether this is something you can fix or whether it should be reported to Angular or to the polyfill, but I thought I'd start here because I really have no clue what's going on. The internals of AssistantStream are somewhat beyond me, and I have no experience handling streams in TypeScript.
Feel free to close this issue and report it to the right place - or I can do it with the right info.
To Reproduce
- Install Angular normally with
npm install -g @angular/cli
ng new my-app
As part of the installation prompts, choose to use SSR
- Add this lib
npm install openai
- Change app.component.ts to include the openai AssistantStream
import { Component } from '@angular/core';
import { RouterOutlet } from '@angular/router';
import { AssistantStream } from 'openai/lib/AssistantStream';
import { HttpClient } from '@angular/common/http';

@Component({
  selector: 'app-root',
  standalone: true,
  imports: [RouterOutlet],
  templateUrl: './app.component.html',
  styleUrl: './app.component.scss'
})
export class AppComponent {
  title = 'test';

  constructor(private http: HttpClient) {}

  private test() {
    this.http.post<any>('http://test', {}).subscribe((response) => {
      const stream = AssistantStream.fromReadableStream(response.body);
    });
  }
}
- Run Angular and watch it crash
npm start
Code snippets
x
OS
Linux Mint
Node version
20.9.0 and 22.1.0
Library version
4.51.0
I have confirmed that if you don't use SSR in the Angular installation, there's no issue.
Hi @rbaarsma, taking a look at this. Will let you know if I have trouble reproducing the issue.
Hi, sorry we haven't been able to get to the bottom of this yet.
The repro steps you've provided look helpful, so I hope we'll be able to fix this at some point, but candidly I'm not confident we'll be able to get to it soon.
Can you let me know if the same problem happens with ChatCompletionStream?
(A PR from the community would be welcome here FWIW, but I do expect it to be quite tricky)
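For reference, a minimal sketch of that check, assuming the same repro component and endpoint as above (ChatCompletionStream also exposes a fromReadableStream helper):

import { ChatCompletionStream } from 'openai/lib/ChatCompletionStream';

// In the same subscribe callback from the repro component:
const stream = ChatCompletionStream.fromReadableStream(response.body);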
Not sure about AssistantStream, but I had the same error with just initializing openai in Angular SSR. My workaround was to dynamically import it, i.e.:
import type OpenAI from 'openai';

// Dynamic import to avoid the TypeError; only called in the browser
export async function getOpenAI() {
  const { default: OpenAI } = await import('openai');
  return new OpenAI({
    ...
  });
}
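A rough sketch of how that helper can then be kept browser-only, so the SSR pass never evaluates the openai module (the component, method, and file names here are placeholders; isPlatformBrowser is the standard Angular check):

import { Component, Inject, PLATFORM_ID } from '@angular/core';
import { isPlatformBrowser } from '@angular/common';
import { getOpenAI } from './openai-client'; // placeholder path for the helper above

@Component({ selector: 'app-chat', standalone: true, template: '' })
export class ChatComponent {
  constructor(@Inject(PLATFORM_ID) private platformId: Object) {}

  async ask() {
    // Only load openai in the browser; on the server this branch is skipped.
    if (isPlatformBrowser(this.platformId)) {
      const openai = await getOpenAI();
      // ... use openai here
    }
  }
}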
@Jsuppers could you try the v5 beta? I suspect it may fix this issue implicitly https://community.openai.com/t/your-feedback-requested-node-js-sdk-5-0-0-alpha/1063774
@RobertCraigie yup looks like it solved my issue, thanks
awesome, thanks for confirming! I'm going to go ahead and close this issue then.