Naren Gogineni
@amanintech thanks for reporting this. Yes, the fix looks good; you can raise it. I'm assigning the issue to you.
closing as stale
we can capture and log unhandled promise rejections and uncaught exceptions with process-level handlers. I think we should add this to the gateway; it's especially useful for self-deployed instances:

```js
// Log otherwise-fatal errors instead of letting the process crash
process.on('uncaughtException', (err) => {
  console.error('Uncaught exception:', err);
});
process.on('unhandledRejection', (reason) => {
  console.error('Unhandled rejection:', reason);
});
```
thanks for the report @flipace. Though the OpenAI API spec does not guarantee `role` in streaming chunks, OpenAI does send it in the first chunk of its streaming responses; ideally...
I'm closing this PR because the maintainers have not reviewed or commented in a while and the changes are stale
can't change the id, @horochx; it wouldn't be backwards compatible for people already using this guardrail
@horochx OpenAI streaming has no concept of an error chunk. Maybe we can wait for the first chunk before establishing a stream connection, and return a failed HTTP status code if that chunk indicates an error.
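A rough sketch of the "peek at the first chunk" idea (all names here are hypothetical, and this assumes upstream chunks arrive as an async iterable with errors surfaced as an `error` field on the first payload):

```javascript
// Sketch: inspect the first upstream chunk before committing to a
// streaming response, so an upstream error can become a non-2xx HTTP
// status instead of a broken stream. Illustrative only.
async function peekFirstChunk(chunks) {
  const iterator = chunks[Symbol.asyncIterator]();
  const first = await iterator.next();

  // The upstream signalled an error in its first payload: no stream has
  // been started yet, so we can still return a failed HTTP status.
  if (!first.done && first.value && first.value.error) {
    return { ok: false, status: first.value.error.status || 502 };
  }

  // Otherwise replay the first chunk, then delegate to the rest.
  async function* replay() {
    if (!first.done) yield first.value;
    yield* { [Symbol.asyncIterator]: () => iterator };
  }
  return { ok: true, stream: replay() };
}
```

The caller would only open the SSE connection when `ok` is true, forwarding `status` otherwise.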
I spent some time on this. I was initially misreading what the PR is doing; this fix makes sense for the problem. One change we could make is to convert...
Hey @horochx, in the current configuration, guardrails for streaming requests are executed after the stream completes. Are you trying to execute the plugin per chunk?
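To make the distinction concrete, here is a minimal sketch (function and hook names are hypothetical, not the gateway's actual API) contrasting the current after-stream behaviour with a per-chunk hook:

```javascript
// Current behaviour: the guardrail sees the full accumulated text once,
// after the stream has finished.
async function runGuardrailAfterStream(chunks, guardrail) {
  let full = '';
  for await (const c of chunks) full += c;
  return guardrail(full);
}

// Per-chunk alternative: the guardrail is invoked on every delta as it
// arrives, producing one verdict per chunk.
async function runGuardrailPerChunk(chunks, guardrail) {
  const verdicts = [];
  for await (const c of chunks) {
    verdicts.push(guardrail(c));
  }
  return verdicts;
}
```

The trade-off: per-chunk checks can cut a stream off early, but many guardrails (e.g. ones that need the complete sentence or JSON body) only make sense on the assembled response.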