How to debug OpenAIStream?
I forked this repo and created some small projects that use Next.js 13 and the Vercel Edge Runtime to communicate with the OpenAI API.
In particular, I'm using the OpenAIStream from this repo.
While everything works fine on localhost, I'm constantly getting timeout errors in production.
I also tried deploying this project as-is, though.
One possible cause is that I've tweaked the max_tokens parameter to allow longer prompts and responses.
Overall, the problem is that I'm unable to debug what's going wrong in OpenAIStream at runtime.
I naively tried adding a log in the catch block:
} catch (e) {
  console.log(e);
  // maybe parse error
  controller.error(e);
}
But I'm not getting any output in the Vercel logs.
Any suggestions?
@francescogior you should be able to add error-handling code within your ReadableStream, like in this example: https://github.com/Nutlope/twitterbio/pull/36/files#diff-c89794bfdf3704d491d50ca94167d6ca97f29bf175e9c07a9a8c5369cf1d60aaR49-R59
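For reference, here is a minimal sketch of what that error handling can look like inside the stream. This is not the repo's or the linked PR's code verbatim: the payload type, env var name, status check, and log messages are assumptions, and it presumes the eventsource-parser v1 createParser(onParse) setup used by the usual OpenAIStream pattern, so adapt it to your fork.

import {
  createParser,
  ParsedEvent,
  ReconnectInterval,
} from "eventsource-parser";

// Sketch only: payload shape and env var name are assumptions, not the repo's exact code.
export async function OpenAIStream(payload: Record<string, unknown>) {
  const encoder = new TextEncoder();
  const decoder = new TextDecoder();

  const res = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.OPENAI_API_KEY ?? ""}`,
    },
    body: JSON.stringify(payload),
  });

  // Surface API-level failures (bad request, rate limit, invalid max_tokens)
  // before building the stream; otherwise they tend to show up only as a
  // silent or stalled response.
  if (!res.ok) {
    const errorBody = await res.text();
    console.error("OpenAI API error:", res.status, errorBody);
    throw new Error(`OpenAI API returned ${res.status}`);
  }

  return new ReadableStream({
    async start(controller) {
      function onParse(event: ParsedEvent | ReconnectInterval) {
        if (event.type !== "event") return;
        if (event.data === "[DONE]") {
          controller.close();
          return;
        }
        try {
          const json = JSON.parse(event.data);
          const text = json.choices?.[0]?.delta?.content ?? "";
          controller.enqueue(encoder.encode(text));
        } catch (e) {
          // Log before erroring the stream so the failure appears in the
          // Vercel function logs instead of just dropping the response.
          console.error("OpenAIStream parse error:", e, "data:", event.data);
          controller.error(e);
        }
      }

      const parser = createParser(onParse);
      for await (const chunk of res.body as any) {
        parser.feed(decoder.decode(chunk));
      }
    },
  });
}

The two points that matter for debugging are logging before calling controller.error (so the failure lands in the function logs rather than only terminating the response) and checking res.ok before constructing the stream, since a non-200 from the OpenAI API would otherwise never produce a parseable SSE event.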