satyadevai
@injeniero I am looking into this issue. If possible, please provide a reproducible example.
Hi @injeniero, I tried multiple approaches but was unable to reproduce this issue. Can you provide the steps to reproduce, or a simple Python script that reproduces it?
Hi @injeniero, any update?
Hi @injeniero, we haven’t heard from you, so I am closing this ticket. Please feel free to reopen it if the issue persists.
Hi @caroger, I’ve reviewed this request and noticed that we don’t currently support invoke_inline_agent. At the moment, we only support invoke_agent, which is a different implementation. Do we need to...
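For reference, here is a minimal sketch of how the two call paths differ, assuming boto3's bedrock-agent-runtime client; the agent ID, alias, model ID, and session ID below are placeholders, not values from this ticket:

```python
# Sketch (assumption): both operations live on boto3's "bedrock-agent-runtime"
# client, but invoke_inline_agent takes the agent definition inline per request
# instead of referencing a pre-created agent. All identifiers are placeholders.
import boto3

client = boto3.client("bedrock-agent-runtime", region_name="us-east-1")

# Currently supported path: invoke a pre-created agent by ID and alias.
agent_response = client.invoke_agent(
    agentId="AGENT_ID",             # placeholder
    agentAliasId="AGENT_ALIAS_ID",  # placeholder
    sessionId="session-1",
    inputText="Hello",
)

# Not yet supported path: the agent configuration is passed inline.
inline_response = client.invoke_inline_agent(
    foundationModel="anthropic.claude-3-sonnet-20240229-v1:0",  # placeholder
    instruction="You are a helpful assistant.",
    sessionId="session-1",
    inputText="Hello",
)
```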
I haven't heard from @arizedatngo or @sean_lee on Slack, so I am closing this ticket for now. Feel free to reopen it if a workaround is still needed.
@mikeldking @caroger The changes have been implemented in the following PR and merged into the develop branch. This ticket can be closed. https://github.com/Arize-ai/openinference/pull/1486
Hi @nate-mar, I was able to reproduce this issue. The request in question is an audio streaming request, which uses an async binary streaming API and expects a response of type AsyncStreamedBinaryAPIResponse....
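For context, a minimal sketch of the kind of request that hits this code path, assuming the OpenAI Python SDK's async streaming-response API; the model name, voice, and output path are placeholders:

```python
# Sketch (assumption): an async text-to-speech request made through the
# OpenAI Python SDK's streaming-response API; the object returned by the
# context manager is an AsyncStreamedBinaryAPIResponse, not a plain body.
import asyncio
from openai import AsyncOpenAI

async def main() -> None:
    client = AsyncOpenAI()
    # with_streaming_response exposes the raw binary audio stream.
    async with client.audio.speech.with_streaming_response.create(
        model="tts-1",    # placeholder model name
        voice="alloy",    # placeholder voice
        input="Hello from the tracing test.",
    ) as response:
        await response.stream_to_file("speech.mp3")  # placeholder path

asyncio.run(main())
```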
The response is recorded as a function in string format.
@caroger @mikeldking DSPy has changed the default values of max_tokens and temperature across versions. In older versions, max_tokens was set to 1000, and in v2.6.24 it was increased to...
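One way to keep traces comparable regardless of which DSPy version is installed is to pin these values explicitly when configuring the LM. A minimal sketch, assuming dspy.LM accepts max_tokens and temperature keyword arguments; the model identifier is a placeholder:

```python
# Sketch (assumption): pin max_tokens and temperature explicitly so the
# recorded values do not depend on DSPy's version-specific defaults.
import dspy

lm = dspy.LM(
    "openai/gpt-4o-mini",  # placeholder model identifier
    max_tokens=1000,       # match the older default explicitly
    temperature=0.0,
)
dspy.configure(lm=lm)
```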