Ian Webster

216 comments by Ian Webster

Haven't lost track of this. Just requires a bit of extra plumbing with the way it's currently implemented :thinking:

Yes, have a look at the documentation on Chat Conversations! https://promptfoo.dev/docs/providers/openai#chat-conversations This is in the OpenAI section as chat formatting is specific to the API (and as you know there...

Everything under `vars.messages` is part of the prompt. Any `assert` is run on the LLM output, i.e. the assistant response.

```yaml
prompts: [prompt.json]
providers: [openai:gpt-3.5-turbo]
tests:
  - vars:
      messages:
        - ...
```
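A fuller sketch of that setup (illustrative values; the message content and the asserted phrase are assumptions, not from the original comment) might look like:

```yaml
prompts: [prompt.json]
providers: [openai:gpt-3.5-turbo]
tests:
  - vars:
      messages:
        - role: user
          content: 'What is the capital of France?'
    # assertions run against the assistant's reply, not the messages above
    assert:
      - type: contains
        value: 'Paris'
```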

Hi @JohnPeng47, Thanks for the suggestion. Definitely planning to move in this direction, including a self-hosted server - I also work with a team that would benefit greatly. Have you...

Closing this out as we implemented a server long ago

Hi @zeldrinn,

Would the output depend on the value of `legal_case_text`? If so, can you structure it like this:

```yaml
# ...
tests:
  - vars:
      legal_case_text: 'first case text....'
    assert: ...
```
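A completed per-case sketch (the second case and both assertion values are illustrative assumptions) could be:

```yaml
tests:
  - vars:
      legal_case_text: 'first case text....'
    assert:
      # each case carries its own expected output
      - type: contains
        value: 'expected phrase for the first case'
  - vars:
      legal_case_text: 'second case text....'
    assert:
      - type: contains
        value: 'expected phrase for the second case'
```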

Thanks for explaining - let me give this some thought and get back to you with a suggestion!

I could implement something like this, essentially what you asked for: an `assert` list associated with each prompt that gets merged into each individual test case.

```yaml
prompts:
  - ...
```
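A sketch of what that proposed syntax could look like (hypothetical: the per-prompt `assert` key is the suggestion being discussed, not an existing feature, and all values are made up):

```yaml
prompts:
  - id: prompt.json
    # hypothetical: asserts attached to a prompt would be merged into
    # every test case evaluated against it
    assert:
      - type: contains
        value: 'some required phrase'
tests:
  - vars:
      topic: 'example topic'
```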

Going to treat this one as low priority, since we have instructions for Jest, and Vitest has a Jest-compatible API.

https://www.spacereference.org/comet/c-2014-w3-panstarrs

```
Exception Value: could not convert string to float:
/app/spaceobjects/models.py in period_in_days, line 211
```

The comet has no period due to its hyperbolic orbit! https://ssd.jpl.nasa.gov/sbdb.cgi?ID=dK14W030
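A minimal guard for this case (a hypothetical helper, not the actual `period_in_days` implementation from `models.py`): hyperbolic comets have no orbital period, so the source field can be empty, and calling `float()` on it raises exactly the error above.

```python
def period_in_days(raw_period):
    """Parse an orbital period field from small-body data.

    Hyperbolic comets (eccentricity >= 1) have no orbital period, so the
    field may be empty or missing; return None instead of raising.
    """
    try:
        return float(raw_period)
    except (TypeError, ValueError):
        return None

print(period_in_days("365.25"))  # a bound orbit parses normally -> 365.25
print(period_in_days(""))        # empty field (hyperbolic comet) -> None
```

Returning `None` lets the caller distinguish "no period" from a parse bug, rather than crashing the page render.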