Jan Werder

Results 40 comments of Jan Werder

We encountered the problem as well. Adding the following lines to the promptflow YAML worked:

```yaml
environment_variables:
  PF_DISABLE_TRACING: true
```
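
If editing the YAML is not convenient, the same variable can presumably be set from Python before promptflow starts. This is only an untested sketch; whether promptflow picks the value up this way is an assumption on my part:

```python
import os

# Assumption: promptflow reads PF_DISABLE_TRACING from the process
# environment at startup, so set it before any promptflow import/run.
os.environ["PF_DISABLE_TRACING"] = "true"
```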

It seems like passing metadata doesn't work yet. I'm using litellm over smolagents:

```python
modified_query_response = self.query_model(
    query_messages,
    metadata={"eval": {"session_id": session_id, "step": "modify_query"}} if session_id else None,
)
```
...
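
For comparison, here is a minimal sketch of passing `metadata` straight to `litellm.completion`; the model name and values are placeholders, not taken from the original comment:

```python
import litellm

# metadata is picked up by litellm's logging/callback integrations;
# the open question in this thread is whether it also reaches the provider.
response = litellm.completion(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello"}],
    metadata={"eval": {"session_id": "abc123", "step": "modify_query"}},
)
print(response.choices[0].message.content)
```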

Using the original `metadata` would be the most correct option, and a breaking change isn't that bad if it is communicated in the changelogs. An alternative would be to only pass the sub-array `_openai`...
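
To illustrate the two options being weighed (the key names and values below are hypothetical):

```python
metadata = {
    "eval": {"session_id": "abc123"},    # internal/logging information
    "_openai": {"run": "nightly-eval"},  # values intended for the provider
}

# Option 1: forward the caller's metadata unchanged (breaking change,
# but anything the caller put there is sent to the third party)
provider_metadata = metadata

# Option 2: forward only the provider-specific sub-dict
provider_metadata = metadata.get("_openai", {})
```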

I would tend towards minor, but since sending data to third parties is involved, others might disagree.

@krrishdholakia How would you judge it? Is this something that can be addressed quickly, or is a major discussion needed? I'm trying to judge whether litellm can fit my requirement...

Sounds great, sure thing, I can help with that.

Nice, looks good to me. 👍

I've had a closer look and my initial commit didn't do the job properly. The problem seems to be that verifying the TOTP with a test login takes too...
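
If the issue is that the code expires before the test login completes, one common workaround is to verify with a tolerance window. This is only a sketch using pyotp, which the original comment does not mention; the secret and window size are placeholders:

```python
import pyotp

totp = pyotp.TOTP("JBSWY3DPEHPK3PXP")  # placeholder base32 secret
code = totp.now()

# valid_window=1 also accepts codes from the adjacent 30-second steps,
# which tolerates a slow test login between generating and checking the code.
assert totp.verify(code, valid_window=1)
```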

> Do we have an ETA on fixing the metadata field for OpenAI providers here?
>
> [DSPy](https://github.com/stanfordnlp/dspy/blob/main/requirements.txt) relies on LM and I will look to bump to latest litellm...