David Humphrey
My guess is that you have to tell wrangler about this, see https://developers.cloudflare.com/workers/wrangler/configuration/#bundling
I guess wrangler doesn't support JSON modules? https://developers.cloudflare.com/workers/wrangler/bundling/
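If wrangler really doesn't handle JSON imports out of the box, a custom module rule might be a workaround. A sketch (the `Text` rule type comes from the wrangler configuration docs linked above; the glob is an assumption about where the JSON lives):

```toml
# Hypothetical wrangler.toml rule: treat .json files as Text modules,
# so `import raw from "./data.json"` yields a string you can JSON.parse.
[[rules]]
type = "Text"
globs = ["**/*.json"]
fallthrough = true
```

This avoids relying on esbuild's native JSON handling, at the cost of an explicit `JSON.parse` at each import site.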
NOTE: I notice that you can also use `plugins: [{ "id": "web" }]` in the chat completion request, which might be easier than adding `:online`. See https://openrouter.ai/announcements/introducing-web-search-via-the-api
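Roughly, the request body would look like this (a sketch based on that announcement; the model id and message here are placeholders, not something from our code):

```typescript
// Sketch: enabling OpenRouter web search via the `plugins` field
// instead of appending ":online" to the model id.
const body = {
  model: "openai/gpt-4o", // placeholder model id
  plugins: [{ id: "web" }], // enables web search per the announcement above
  messages: [{ role: "user", content: "What changed in Workers this week?" }],
};

// This body would be POSTed to the OpenRouter chat completions endpoint
// with an Authorization header; here we only show its shape.
console.log(JSON.stringify(body.plugins));
```

The nice part is that this composes with any model id, rather than requiring us to rewrite the model string.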
I would put any UI for selecting/de-selecting this into the Preferences Modal vs. adding to the prompt area, which is already too busy.
This has been requested again on Discord recently:

> Basically, there's no evidence of reasoning happening (or to set reasoning budget/amount/thresholds) so I'm not sure that reasoning is actually occurring....
Some more background from Claude:

# Streaming Reasoning Messages in Chat Completions

When streaming chat completions with reasoning models (like o1), the reasoning content is returned through **delta chunks** in...
So perhaps when we render the streaming message, we can have a separate area for the reasoning in the UI. Maybe we do this as @tarasglek suggests with summary and...
See also https://platform.openai.com/docs/guides/reasoning
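To make the "separate area" idea concrete, here's a sketch of splitting reasoning deltas from content deltas while streaming. The `reasoning` field on the delta is an assumption based on OpenRouter's reasoning-tokens docs; OpenAI's own streaming shape may differ:

```typescript
// Minimal shape of a streamed chunk, with an assumed `reasoning` delta field.
interface DeltaChunk {
  choices: { delta: { content?: string; reasoning?: string } }[];
}

// Accumulate reasoning and content separately so the UI can render
// reasoning in its own (e.g. collapsible) area.
function accumulate(chunks: DeltaChunk[]) {
  let reasoning = "";
  let content = "";
  for (const chunk of chunks) {
    const delta = chunk.choices[0]?.delta ?? {};
    if (delta.reasoning) reasoning += delta.reasoning; // goes to the reasoning area
    if (delta.content) content += delta.content; // goes to the normal message body
  }
  return { reasoning, content };
}

// Example with mock chunks:
const { reasoning, content } = accumulate([
  { choices: [{ delta: { reasoning: "Consider the question..." } }] },
  { choices: [{ delta: { content: "Here is the answer." } }] },
]);
console.log(reasoning, content);
```

In a real stream these would arrive interleaved from an SSE reader, but the split logic is the same.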
@Amnish04 you're asking good questions, and running into the limits of my knowledge of "reasoning," which I haven't used. I'm ambivalent about the Responses API, but do note that [OpenRouter...
I think supporting https://openrouter.ai/docs/use-cases/reasoning-tokens makes the most sense. I'd love to switch to only supporting OpenRouter tbh, but that's beyond this.
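For reference, OpenRouter's unified reasoning parameter would look something like this in a request body (a sketch from the doc linked above; the model id and values are placeholders, and which of `effort` vs `max_tokens` a given model honors varies):

```typescript
// Sketch: OpenRouter's unified `reasoning` request parameter.
const request = {
  model: "openai/o3-mini", // placeholder model id
  reasoning: { effort: "high" }, // or { max_tokens: 2000 } for budget-style models
  messages: [{ role: "user", content: "Prove it step by step." }],
};
console.log(request.reasoning.effort);
```
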