[bug] Change target for tauri dev doesn't work as expected
Describe the bug
Running my main.rs file with cargo run --target x86_64-apple-darwin works, but when using yarn tauri dev --target x86_64-apple-darwin I get an error concerning libtorch:
dyld[64838]: Library not loaded: @rpath/libtorch_cpu.dylib
Referenced from: <44B79842-D273-3676-9B7C-6C41F485BB99> /Users/santosh/Desktop/code/projects/ImageSearch/PixelDetective/src-tauri/target/x86_64-apple-darwin/debug/PixelDetective
Reason: tried: '/System/Volumes/Preboot/Cryptexes/OS@rpath/libtorch_cpu.dylib' (no such file), '/usr/local/lib/libtorch_cpu.dylib' (no such file), '/usr/lib/libtorch_cpu.dylib' (no such file, not in dyld cache)
My question is not about libtorch itself but rather about how I can fix this through Tauri. Since cargo run works, I am wondering why yarn tauri dev with the same target doesn't work too. Is there a way to fix that through Tauri?
Please tell me if I should mention any additional details!
Reproduction
Here is my main.rs
// Prevents additional console window on Windows in release, DO NOT REMOVE!!
#![cfg_attr(not(debug_assertions), windows_subsystem = "windows")]
use rust_bert::pipelines::sentence_embeddings::{
SentenceEmbeddingsBuilder, SentenceEmbeddingsModelType,
};
extern crate redis;
use redis::Commands;
// Learn more about Tauri commands at https://tauri.app/v1/guides/features/command
#[tauri::command]
fn greet(name: &str) -> String {
format!("Hello, {}! You've been greeted from Rust!", name)
}
#[tauri::command]
fn index() -> String {
let client = redis::Client::open("redis://localhost/").expect("Failed to connect to Redis");
let mut con = client.get_connection().expect("Failed to establish connection with Redis");
let result: Option<String> = con.get("last_updated").expect("Failed to get value from Redis");
format!("Indexing... {}", result.unwrap_or_else(|| String::from("No value found for 'my_key'")))
}
#[tauri::command]
fn search(name: &str) -> Vec<String> {
// Set-up sentence embeddings model
let model = SentenceEmbeddingsBuilder::remote(SentenceEmbeddingsModelType::AllMiniLmL12V2)
.create_model()
.expect("Failed to create embeddings model");
let client = redis::Client::open("redis://localhost/").expect("Failed to connect to Redis");
let mut con = client.get_connection().expect("Failed to establish connection with Redis");
// Put query + all sentences in redis in the list to compute embeddings
let keys: Vec<String> = con.keys("*").expect("Failed to get all keys from Redis");
let mut sentences: Vec<&str> = vec![name];
sentences.extend(keys.iter().map(|s| s.as_str()));
// Generate embeddings
let embeddings = model
.encode(&sentences)
.expect("Failed to generate embeddings");
// Compute cosine distances
let query_embedding = &embeddings[0]; // Assuming the first embedding is the query
let mut distances: Vec<(f32, &str)> = Vec::new();
for (embedding, sentence) in embeddings.iter().skip(1).zip(sentences.iter().skip(1)) {
let distance = cosine_distance(query_embedding, embedding);
distances.push((distance, sentence));
}
// Sort by cosine similarity in descending order (most similar first)
distances.sort_by(|(distance1, _), (distance2, _)| distance2.partial_cmp(distance1).unwrap());
// Take the 10 most similar sentences
let top_sentences: Vec<String> = distances.iter()
.take(10)
.map(|(_, sentence)| con.get::<&str, String>(&sentence).expect("Failed to get keys from Redis").to_string())
.collect();
top_sentences
}
fn dot_product(vec1: &[f32], vec2: &[f32]) -> f32 {
let delta = vec1.len() - vec2.len();
let shortest_vec = match delta {
d if d < 0 => vec1,
d if d > 0 => vec2,
_ => vec1,
};
let mut dot_product = 0.0;
for i in 0..shortest_vec.len() {
dot_product += vec1[i] * vec2[i];
}
dot_product
}
fn root_sum_square(vec: &[f32]) -> f32 {
let mut sum_square = 0.0;
for i in 0..vec.len() {
sum_square += vec[i] * vec[i];
}
sum_square.sqrt()
}
fn cosine_distance(vec1: &[f32], vec2: &[f32]) -> f32 {
let dot_product = dot_product(vec1, vec2);
let root_sum_square1 = root_sum_square(vec1);
let root_sum_square2 = root_sum_square(vec2);
dot_product / (root_sum_square1 * root_sum_square2)
}
fn main() {
tauri::Builder::default()
.invoke_handler(tauri::generate_handler![greet, index, search])
.run(tauri::generate_context!())
.expect("error while running tauri application");
}
This is the code causing problems. It works fine when I run it with cargo run --target x86_64-apple-darwin after replacing main() with:
fn main() {
let phrase = "nice looking";
let result = search(phrase);
for item in result {
println!("{}", item);
}
}
Expected behavior
I expected my Tauri app to compile and run smoothly when using yarn tauri dev --target x86_64-apple-darwin, but I feel like I am missing something and I am not totally sure I am doing it correctly. I am also unsure of what yarn tauri dev --target x86_64-apple-darwin is actually doing...
Platform and versions
[✔] Environment
- OS: Mac OS 13.2.1 X64
✔ Xcode Command Line Tools: installed
✔ rustc: 1.67.1 (d5a82bbd2 2023-02-07)
✔ Cargo: 1.67.1 (8ecd4f20a 2023-01-10)
✔ rustup: 1.26.0 (5af9b9484 2023-04-05)
✔ Rust toolchain: stable-aarch64-apple-darwin (default)
- node: 20.2.0
- yarn: 1.22.19
- npm: 9.6.6
[-] Packages
- tauri [RUST]: 1.3.0
- tauri-build [RUST]: 1.3.0
- wry [RUST]: 0.24.3
- tao [RUST]: 0.16.2
- @tauri-apps/api [NPM]: 1.3.0
- @tauri-apps/cli [NPM]: 1.3.1
[-] App
- build-type: bundle
- CSP: unset
- distDir: ../dist
- devPath: http://localhost:1420/
- framework: Svelte
- bundler: Vite
Stack trace
No response
Additional context
MacOS M2
(thanks for the really extensive report!!)
I feel like this could be the same issue as https://github.com/tauri-apps/tauri/issues/6924 🤔
Hmm I see, he managed to solve it by adding the DLL to his resources, if I understand correctly.
I dug a bit more into my problem and found out that the program cannot find the libtorch_cpu.dylib for the architecture I am asking for (it does have it for aarch64). Which is weird given that cargo run --target x86_64-apple-darwin works fine...
simconnect-sdk's build.rs normally links the DLL, and that works fine in typical cases like cargo run. However, something in Tauri breaks that. I'm still trying to figure out what and why, but it's on my TODO list to investigate at some point.
I think that is where the real problem lies.
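In the meantime, here is a minimal sketch of a possible workaround: emitting the rpath explicitly from the app's own src-tauri/build.rs so the binary can resolve @rpath/libtorch_cpu.dylib however it is launched. The LIBTORCH environment variable (pointing at a libtorch built for the target architecture) is an assumption about the local setup, not something Tauri provides:

// src-tauri/build.rs (sketch, untested)
fn main() {
    // Assumption: LIBTORCH points at a libtorch install matching the --target
    // architecture (e.g. an x86_64 build when cross-compiling on an M2).
    if let Ok(libtorch) = std::env::var("LIBTORCH") {
        // Let the linker find the dylibs at link time...
        println!("cargo:rustc-link-search=native={}/lib", libtorch);
        // ...and embed the same directory as an rpath so dyld can resolve
        // @rpath/libtorch_cpu.dylib at run time.
        println!("cargo:rustc-link-arg=-Wl,-rpath,{}/lib", libtorch);
        println!("cargo:rerun-if-env-changed=LIBTORCH");
    }
    tauri_build::build();
}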
I ended up finding a way to solve the problem by putting those dylib files in the directory they asked for. It is up to you if you want to keep this issue open.
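In case it helps others, here is a rough sketch of what I mean, assuming the fix is copying the x86_64 libtorch dylibs into one of the directories dyld listed in the error (/usr/local/lib here); the source path is a placeholder for wherever your x86_64 libtorch lives, and writing to /usr/local/lib needs the right permissions:

// One-off helper (sketch): copy the libtorch dylibs into a directory that
// dyld already searches, e.g. /usr/local/lib from the error message above.
use std::{fs, path::Path};

fn main() -> std::io::Result<()> {
    // Placeholder path: point this at the lib/ folder of an x86_64 libtorch.
    let src = Path::new("/path/to/libtorch-x86_64/lib");
    let dst = Path::new("/usr/local/lib");

    for entry in fs::read_dir(src)? {
        let path = entry?.path();
        // Only copy the dynamic libraries (libtorch_cpu.dylib, libc10.dylib, ...).
        if path.extension().map_or(false, |ext| ext == "dylib") {
            fs::copy(&path, dst.join(path.file_name().unwrap()))?;
        }
    }
    Ok(())
}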
@ssantoshp can you show in detail how to "put those dylib files in the directory they asked for"? I am building an iOS target and I am blocked by the same error; when the app starts it shows the error below. Thank you very much if you can show me how you solved this~
Exception Type: EXC_CRASH (SIGABRT)
Exception Codes: 0x0000000000000000, 0x0000000000000000
Termination Reason: DYLD 1 Library missing
Library not loaded: /Users/*/libchatgptenhanced.dylib
Referenced from: <CDCA8CD5-E8E6-3776-9A34-87D893A8D9B2> /Volumes/VOLUME/*/chatgptenhanced.app/chatgptenhanced
Reason: tried: '/Users/*/libchatgptenhanced.dylib' (no such file), '/private/preboot/Cryptexes/OS/Users/*/libchatgptenhanced.dylib' (no such file), '/Users/*/libchatgptenhanced.dylib' (no such file), '/usr/local/lib/libchatgptenhanced.dylib' (no such file), '/usr/lib/libchatgptenhanced.dylib' (no such file, not in dyld cache)
(terminated at launch; ignore backtrace)
Triggered by Thread: 0
Kernel Triage:
VM - (arg = 0x0) pmap_enter retried due to resource shortage
VM - (arg = 0x0) pmap_enter retried due to resource shortage