Support commit message generation (AI)
Similar to what the frontend supports
Something I'd find interesting is being able to use the user-configured, locally hosted models first, which seems like the most straightforward path since no authentication is required. It would also be interesting to see how result streaming is implemented in the backend.
Certainly - we already have this supported in the frontend via Ollama, but all of that implementation is in TypeScript, so we would have to build something new for this.
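For the backend side, talking to Ollama's local HTTP API directly looks fairly simple, since no authentication is involved. Below is a minimal sketch, assuming a Rust backend with `reqwest` (with the `stream` feature), `tokio`, `futures-util`, and `serde_json`; the model name and prompt are placeholders, and it assumes each chunk arrives as whole newline-delimited JSON lines:

```rust
// Assumed dependencies: reqwest = { version = "0.11", features = ["json", "stream"] },
// tokio = { version = "1", features = ["full"] }, futures-util = "0.3", serde_json = "1".
use futures_util::StreamExt;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let client = reqwest::Client::new();
    // Ollama's local API needs no authentication; model and prompt are placeholders.
    let mut stream = client
        .post("http://localhost:11434/api/generate")
        .json(&serde_json::json!({
            "model": "llama3",
            "prompt": "Write a concise commit message for this diff:\n<diff goes here>",
            "stream": true
        }))
        .send()
        .await?
        .bytes_stream();

    // Ollama streams newline-delimited JSON objects; each carries a `response`
    // fragment, and the last one has `done: true`. (For brevity this assumes a
    // chunk never splits a line; a real implementation would buffer partial lines.)
    while let Some(chunk) = stream.next().await {
        for line in String::from_utf8_lossy(&chunk?).lines() {
            if line.trim().is_empty() {
                continue;
            }
            let value: serde_json::Value = serde_json::from_str(line)?;
            if let Some(fragment) = value["response"].as_str() {
                print!("{fragment}"); // here: forward the fragment to the frontend instead
            }
        }
    }
    Ok(())
}
```

Since the fragments arrive incrementally, the backend could forward them to the UI as they come in rather than waiting for the full message.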
The future I imagine has the AI access implemented entirely in the backend, with the frontend able to use it with streaming support. How to do the latter with Tauri is unclear to me; all I know it has is background task support, which has no channel for progress.
On the bright side, the backend wouldn't need cross-site scripting allowances.
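On the streaming question, one possible approach (a rough sketch, not a confirmed design) is for a Tauri command to emit an event per streamed fragment, which the frontend subscribes to with `listen` from `@tauri-apps/api/event`. The command name, event name, and canned tokens below are hypothetical, and this assumes Tauri 1.x, where `emit_all` comes from the `Manager` trait:

```rust
// A rough sketch assuming Tauri 1.x; command name, event name, and the canned
// tokens are hypothetical stand-ins for the real streamed model output.
#[tauri::command]
async fn generate_commit_message(app: tauri::AppHandle, diff: String) -> Result<(), String> {
    use tauri::Manager; // `emit_all` is provided by the Manager trait

    // In practice the diff would be embedded in the prompt sent to the local
    // model, and each emitted token would be a fragment from its stream.
    let _prompt = format!("Write a concise commit message for this diff:\n{diff}");
    for token in ["feat: ", "generate ", "commit ", "messages ", "in ", "the ", "backend"] {
        app.emit_all("commit-message-chunk", token)
            .map_err(|e| e.to_string())?;
    }
    Ok(())
}

fn main() {
    // Standard Tauri boilerplate to register the command.
    tauri::Builder::default()
        .invoke_handler(tauri::generate_handler![generate_commit_message])
        .run(tauri::generate_context!())
        .expect("error while running tauri application");
}
```

On the frontend, `listen("commit-message-chunk", handler)` would receive each fragment; whether events are a good enough substitute for a proper progress channel is exactly the open question above.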