companion
Make streaming optional on LLM-related routes
The code chat endpoint currently supports only streaming responses. Streaming should be optional, so clients can also request a complete, non-streamed response.
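A minimal sketch of what this could look like, assuming a FastAPI-style route; the path `/chat`, the `stream` field, and the `generate_tokens` helper are hypothetical and only illustrate the idea of a per-request flag:

```python
# Hypothetical sketch: a chat route that honors a per-request `stream` flag.
# Endpoint path, request fields, and the token generator are illustrative,
# not the project's actual API.
from typing import AsyncIterator

from fastapi import FastAPI
from fastapi.responses import StreamingResponse
from pydantic import BaseModel

app = FastAPI()


class ChatRequest(BaseModel):
    prompt: str
    stream: bool = True  # default keeps the current streaming behavior


async def generate_tokens(prompt: str) -> AsyncIterator[str]:
    # Placeholder for the real LLM call; yields chunks as they arrive.
    for chunk in ("Hello", ", ", "world"):
        yield chunk


@app.post("/chat")
async def chat(request: ChatRequest):
    if request.stream:
        # Stream chunks back to the client as they are produced.
        return StreamingResponse(
            generate_tokens(request.prompt), media_type="text/plain"
        )
    # Non-streaming: collect the full completion and return it in one response.
    chunks = [chunk async for chunk in generate_tokens(request.prompt)]
    return {"response": "".join(chunks)}
```

With something like this, existing streaming clients keep working by default, while others can pass `"stream": false` to get the whole completion in a single JSON payload.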