
refactor(core): use webview's URI schemes for IPC

Open · lucasfernog opened this pull request 2 years ago • 13 comments

closes #4875

What kind of change does this PR introduce?

  • [ ] Bugfix
  • [ ] Feature
  • [ ] Docs
  • [ ] New Binding issue #___
  • [ ] Code style update
  • [x] Refactor
  • [ ] Build-related changes
  • [ ] Other, please describe:

Does this PR introduce a breaking change?

  • [ ] Yes, and the changes were approved in issue #___
  • [ ] No

Checklist

  • [ ] When resolving issues, they are referenced in the PR's title (e.g. fix: remove a typo, closes #___, #___)
  • [ ] A change file is added if any packages will require a version bump due to this PR per the instructions in the readme.
  • [ ] I have added a convincing reason for adding this feature, if necessary

Other information

lucasfernog avatar Jun 08 '23 20:06 lucasfernog

A command returning a 150MB file now takes less than 60ms to resolve. Previously: almost 50 seconds.

lucasfernog avatar Jun 08 '23 20:06 lucasfernog

[image]

amrbashir avatar Jun 09 '23 00:06 amrbashir

Needs https://github.com/tauri-apps/wry/pull/970

lucasfernog avatar Jun 10 '23 17:06 lucasfernog

Also needs #4752 to fix the unit tests.

lucasfernog avatar Jun 10 '23 17:06 lucasfernog

We'll detect and use the custom protocol IPC on Linux if available when https://github.com/tauri-apps/wry/pull/969 lands. On older webkit2gtk, we'll need to stick with the current approach (JSON :( )

lucasfernog avatar Jun 11 '23 13:06 lucasfernog

With the latest changes, returning a 150MB file takes 1.3s on Linux with both postMessage and custom protocols (the latter will require a feature flag because it needs webkit2gtk 2.40).

The existing postMessage responder was changed to check whether the command's return value is an object, an array, or a raw tauri::ipc::Response. In those cases it uses a tauri::ipc::Channel to send the response: the Channel evaluates a JS script that runs the custom protocol IPC to fetch the response stored in Tauri state.

Example command:

use tauri::{command, ipc::Response};

#[command]
pub fn read_file() -> Response {
  Response::new(include_bytes!("../../../../test_video.mp4").to_vec())
}
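For intuition on why the raw Response path is so much faster than the JSON path: a byte array pushed through JSON-based IPC gets serialized as an array of decimal numbers. A minimal Node-only sketch of that size blowup (not from the PR, no Tauri involved):

```javascript
// A Uint8Array serialized via JSON becomes "[65,65,...,65]": each byte
// costs 1-3 digits plus a comma instead of a single raw byte.
const bytes = new Uint8Array(1024 * 1024).fill(65); // 1 MiB of 'A' bytes

const rawSize = bytes.byteLength;
const jsonSize = JSON.stringify(Array.from(bytes)).length;

console.log(rawSize, jsonSize, (jsonSize / rawSize).toFixed(1));
// The JSON form is roughly 3x larger here, before even counting the
// cost of parsing it back into numbers on the other side.
```

And that factor grows for larger byte values (three digits plus a comma per byte), which is part of why a 150MB payload took tens of seconds on the old path.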

lucasfernog avatar Jun 12 '23 18:06 lucasfernog

Updates on the 150MB command response benchmark:

  • macOS: 70-80ms
  • Linux: 1.3-1.5s (ipc.postMessage), 1.3-1.4s (custom protocol)
  • Windows: 1.8s

kinda scary how much worse webview2 is... or maybe WKWebView is just too fast?

lucasfernog avatar Jun 12 '23 21:06 lucasfernog

I wonder if webview2's custom URI scheme (which only works on Win10+ :/ ) is more optimized for this :thinking: Not that it would matter for us until we drop support for 7 & 8 :sweat_smile:

Edit: Never mind, it works differently than on macOS/Linux and will keep using the same WebResourceRequested event we already use.

Also, the measurement includes reading the file from disk right? Maybe windows is just really bad at that part too

FabianLars avatar Jun 13 '23 06:06 FabianLars

The file is included in the app binary with include_bytes!, so reading from disk isn't part of the measurement.

Android performance is in the 700-800ms range. iOS 70-100ms.

lucasfernog avatar Jun 13 '23 12:06 lucasfernog

This PR is now ready, though we need a new wry release to publish.

lucasfernog avatar Jun 13 '23 12:06 lucasfernog

wry 0.29 is published! You can update the dependency to it.

wusyong avatar Jun 13 '23 14:06 wusyong

So my question right out of the gate is: Why is this not implemented in wry directly?

JonasKruckenberg avatar Jun 14 '23 12:06 JonasKruckenberg

So my question right out of the gate is: Why is this not implemented in wry directly?

We actually need both the current IPC (postMessage) and the custom protocol IPC. wry already gives us the tools we need, so there's no need to change anything there. Since wry is low level, I don't think we should move this mess over there, e.g. handling the Linux/Android/webkit2gtk 2.40 differences.

lucasfernog avatar Jun 14 '23 13:06 lucasfernog

Having lurked in this thread since the PR was first made, I'm quite delighted to see this merged! I have been using the alpha releases for a while now and wonder how I would go about using this new feature together with the new v2 API plugins?

Naively switching over the tauri and tauri-build dependencies to use the dev branch of course creates version conflicts with the fs plugin that I use. From where should I pull in the plugins?

ghost avatar Aug 10 '23 14:08 ghost

Naively switching over the tauri and tauri-build dependencies to use the dev branch of course creates version conflicts with the fs plugin that I use. From where should I pull in the plugins?

The v2 branch of the plugins-workspace repo (https://github.com/tauri-apps/plugins-workspace/tree/v2) or the published crates on crates.io, but you should probably wait for the next alpha release (coming soon).
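To make the git option concrete, a hypothetical Cargo.toml fragment pulling the fs plugin from the v2 branch (a sketch only; branch/crate versions must line up with your tauri dependency, which is why waiting for the next alpha is easier):

```toml
# Assumption: tauri itself is pinned to the matching dev/v2 source,
# otherwise this git dependency will produce the same version conflicts.
[dependencies]
tauri-plugin-fs = { git = "https://github.com/tauri-apps/plugins-workspace", branch = "v2" }
```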

amrbashir avatar Aug 10 '23 14:08 amrbashir

On macOS this breaks IPC when using remote domains: Safari blocks the requests to localhost from other domains.

kris-ava avatar Aug 27 '23 15:08 kris-ava

I want to know how to use this new IPC. I updated to v2, but file transfer between JS and Rust is still slow on Windows. Here is my Cargo.toml:

[build-dependencies]
tauri-build = { version = "2.0.0-beta", features = [] }

[dependencies]
tauri = { version = "2.0.0-beta.13", features = ["custom-protocol"] }
serde = { version = "1", features = ["derive"] }
serde_json = "1"
chrono = "0.4.31"
tokio = { version = "1.20", features = ["macros", "rt-multi-thread"] }
tauri-plugin-dialog = "2.0.0-beta.3"
tauri-plugin-http = "2.0.0-beta.3"
tauri-plugin-fs = "2.0.0-beta.3"


[features]
# This feature is used for production builds or when a dev server is not specified, DO NOT REMOVE!!
custom-protocol = ["tauri/custom-protocol"]

This is my Rust code:

#[tauri::command]
async fn append_chunk_to_file(
    window: Window,
    path: String,
    chunk: Vec<u8>,
    end: bool,
) -> Result<(), String> {
    let current_time = Local::now().time();
    println!("enter rust time: {}", current_time);
    println!("start{:?}", Instant::now()); // time the command was received: Instant { t: 644913.1384745s }
    tokio::spawn(async move {
        let mut file = OpenOptions::new()
            .create(true)
            .append(true)
            .open(&path)
            .map_err(|e| e.to_string())
            .unwrap();
        file.write_all(&chunk).map_err(|e| e.to_string()).unwrap();
        if end {
            window.emit("insert", Payload { message: path }).unwrap();
        }
    });
    let instant = Instant::now();
    println!("end{:?}", instant); // time the function returned: Instant { t: 644913.1386845s }
    let current_time = Local::now().time();
    println!("return time: {}", current_time);
    Ok(())
}

I use it in JS:

import {mkdir, readFile, writeFile, BaseDirectory} from "@tauri-apps/plugin-fs";
import {convertFileSrc, invoke} from "@tauri-apps/api/core";
const content = await file.arrayBuffer()
let content1 = new Uint8Array(content);
await invoke("append_chunk_to_file", {path: url, chunk: content1, end: true})

It took 8 seconds to transfer a 23MB image using the above code (most of the time is spent in serialization/deserialization), but @lucasfernog said a command returning a 150MB file now takes less than 60ms to resolve (previously almost 50 seconds). Where did I go wrong? Is there correct example code?
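For scale, note that the fast path benchmarked earlier in this thread covers command *responses* (tauri::ipc::Response), while a `chunk: Vec<u8>` inside the args object still goes through JSON serialization on the JS side and deserialization on the Rust side (an assumption about the args path, not confirmed by the maintainers here). A Node-only sketch of what that round trip costs, with no Tauri involved:

```javascript
// Simulates what happens to a binary invoke argument on the JSON path:
// Uint8Array -> plain number array -> JSON string -> parsed back.
// Timings are indicative only.
const chunk = new Uint8Array(4 * 1024 * 1024).fill(128); // 4 MiB sample

const t0 = performance.now();
const roundTripped = new Uint8Array(JSON.parse(JSON.stringify(Array.from(chunk))));
const t1 = performance.now();

console.log(`JSON round trip of ${chunk.byteLength} bytes took ${(t1 - t0).toFixed(0)} ms`);
```

If the args path is indeed the bottleneck, the cost scales with chunk size, which would explain seconds-long transfers for a 23MB file sent as a `Vec<u8>` argument.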

Xiaobaishushu25 avatar Mar 30 '24 06:03 Xiaobaishushu25