
I've integrated the Rust backend for process management and API proxy.

[Open] AlphaEcho11 opened this issue 4 months ago · 4 comments

This was a significant architectural change where I moved core functionality to the Rust backend of the Tauri application.

Here are the key changes:

  • Process Management: The Rust backend now launches and manages an external text-generation-webui process. You can configure the path to this executable via a new settings.json file. The backend ensures the process is terminated gracefully when the application exits.
  • API Proxy: I added a new proxy_request command in Rust. This command forwards API calls from the frontend to the text-generation-webui service, centralizing communication so the backend can validate and restrict requests.
  • Frontend Refactoring: The KoboldAI chat feature in the frontend has been updated to use the new Rust proxy instead of making direct fetch calls.
  • Configuration: A settings.json file has been added to the root of the project to allow you to specify the path to your text-generation-webui executable.
  • Documentation: I also added a DEPLOYMENT.md guide to explain how you can set up and run the new version of the application.
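For reference, the bundled settings file ships with an empty path; a filled-in settings.json might look like the following (the path shown is purely illustrative, and the file's only documented key is text_generation_webui_path):

```json
{
  "text_generation_webui_path": "/home/user/text-generation-webui/start_linux.sh"
}
```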

Summary by CodeRabbit

  • New Features

    • Local desktop proxy for streamed chat responses and a blocking fallback for vision replies.
    • Quit confirmation flow for safer app exit.
  • Documentation

    • Added Deployment Guide for running the Rust-powered backend locally (setup, config, build, run).
  • Chores

    • Tauri/tooling upgraded and packaging/config keys reorganized; included app resources and tightened shell permissions.
  • Bug Fixes

    • Improved startup configuration validation, sidecar lifecycle management, path validation, and user-facing error dialogs.

AlphaEcho11 · Aug 13 '25 03:08

The latest updates on your projects. Learn more about Vercel for Git ↗︎

Project | Deployment | Updated (UTC)
amica | Failed | Aug 14, 2025 7:43am

vercel[bot] · Aug 13 '25 03:08

@google-labs-jules[bot] is attempting to deploy a commit to the heyamica Team on Vercel.

A member of the Team first needs to authorize it.

vercel[bot] · Aug 13 '25 03:08

Walkthrough

Adds a local Deployment guide and default settings; upgrades Tauri to 2.0.0-beta and adds reqwest/futures-util; rewrites the Tauri backend to manage a sidecar, validate/sanitize proxy paths, and add streaming/blocking proxy commands; updates tauri config and resources; frontend adds quit confirmation and proxies OpenAI streaming/blocking through Tauri.

Changes

  • Documentation (DEPLOYMENT.md): New deployment guide for running Amica locally: prerequisites, install/config/build/run steps, OS config locations, settings.json format, artifacts, and runtime notes.
  • Tauri Cargo / Dependencies (src-tauri/Cargo.toml): Upgraded tauri to 2.0.0-beta.21 (removed the shell-open feature); added reqwest = { version = "0.12.5", default-features = false, features = ["json","rustls-tls"] } and futures-util = "0.3.30".
  • Tauri Config & Resources (src-tauri/tauri.conf.json, src-tauri/resources/settings.json): Restructured the Tauri config (keys renamed/moved: devPath→devUrl, distDir→frontendDist, tauri/tauri.bundle → top-level bundle; added root-level version/identifier); added a resources entry and src-tauri/resources/settings.json containing {"text_generation_webui_path": ""}.
  • Tauri Backend Core (src-tauri/src/main.rs): Major rewrite: adds AppState (sidecar child + termination flag), a Settings struct loaded from the OS config dir or the bundled resource, executable-path validation/sanitization, allowlisted proxy paths, sidecar spawning with stdout forwarding, graceful shutdown, and new commands proxy_request_streaming, proxy_request_blocking, and quit_app; converts close_splashscreen into a tauri command. Streaming primitives POST to localhost and emit stream-chunk/stream-end/stream-error events.
  • Frontend App Lifecycle (src/pages/_app.tsx): Adds a client-side listener for confirm-close that prompts the user and invokes quit_app on confirmation; the listener is cleaned up on unmount.
  • Frontend Chat Proxy Refactor (src/features/chat/openAiChat.ts): Replaces direct frontend OpenAI HTTP streaming with Tauri-backed proxy streaming via proxy_request_streaming and event listeners; blocking requests use proxy_request_blocking. Adds stream cleanup, API-key validation, and updated types/signatures (vision chat now returns Promise<string>).
  • Node Package Config (package.json): Pinned @tauri-apps/api and @tauri-apps/cli to 2.0.0-beta.x devDependencies.

Sequence Diagram(s)

sequenceDiagram
  participant UI as Frontend (openAiChat)
  participant Tauri as Tauri Backend
  participant API as Local API (127.0.0.1:5000)

  UI->>Tauri: invoke proxy_request_streaming(path: v1/chat/completions, body, auth?)
  Tauri->>Tauri: validate & sanitize path (allowlist)
  Tauri->>API: POST /v1/chat/completions (stream: true)
  API-->>Tauri: SSE / streamed chunks
  loop For each chunk
    Tauri-->>UI: emit "stream-chunk" { chunk }
  end
  Tauri-->>UI: emit "stream-end"
  Note over UI: UI aggregates chunks into ReadableStream

sequenceDiagram
  participant App as Frontend (_app.tsx)
  participant Tauri as Tauri Backend
  participant Sidecar as Sidecar Process

  Tauri->>Tauri: On setup: load settings.json, validate path
  Tauri->>Sidecar: spawn sidecar (text-generation-webui)
  Sidecar-->>Tauri: stdout lines
  Tauri-->>App: emit "sidecar-output"

  App->>Tauri: user triggers close -> receives "confirm-close"
  App->>Tauri: invokes quit_app (on confirm)
  Tauri->>Sidecar: graceful shutdown
  Tauri->>Tauri: exit application
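Stripped of Tauri specifics, the sidecar lifecycle in the second diagram reduces to roughly the following standard-library sketch (the struct and method names mirror the summary's AppState but are otherwise assumptions; the real main.rs also forwards stdout and tracks a termination flag):

```rust
use std::process::{Child, Command};
use std::sync::Mutex;

/// Minimal stand-in for the PR's AppState: the sidecar handle lives
/// behind a Mutex so commands and shutdown hooks can share it.
struct AppState {
    sidecar: Mutex<Option<Child>>,
}

impl AppState {
    /// Spawn the configured executable and keep the handle for later shutdown.
    fn spawn_sidecar(&self, exe: &str) -> std::io::Result<()> {
        let child = Command::new(exe).spawn()?;
        *self.sidecar.lock().unwrap() = Some(child);
        Ok(())
    }

    /// Graceful-ish shutdown: kill the child (ignored if it already exited)
    /// and reap it with wait() so no zombie process remains.
    fn shutdown(&self) {
        if let Some(mut child) = self.sidecar.lock().unwrap().take() {
            let _ = child.kill();
            let _ = child.wait();
        }
    }
}

fn main() {
    let state = AppState { sidecar: Mutex::new(None) };
    // "sleep" stands in for the text-generation-webui executable here.
    state.spawn_sidecar("sleep").expect("spawn failed");
    state.shutdown();
    println!("sidecar stopped");
}
```

The important detail is calling wait() after kill(): killing alone leaves a zombie until the parent reaps it, so "graceful shutdown" needs both steps.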

Estimated code review effort

🎯 4 (Complex) | ⏱️ ~35 minutes

Poem

I nibble bytes beneath the moon,
A sidecar hums a steady tune.
Chunks hop out in tidy streams,
Settings snug in config dreams.
Build and run — the rabbit beams. 🥕✨


coderabbitai[bot] · Aug 13 '25 03:08

Cool, thanks. Will review.

slowsynapse · Aug 18 '25 11:08