Having trouble launching llama on Linux
It compiled just fine, but when I try to run it, it fails with an error complaining that it can't find `libllamagui.so`, even though the file is right there.
I suppose I have to put `libllamagui.so` somewhere in the filesystem, so where should I put it?
Running it with `cargo run` should work, or alternatively you can use `LD_LIBRARY_PATH=. ./llama-ui`. It's just because the solib isn't in the default search paths (e.g. `/usr/lib`).
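If you'd rather not set `LD_LIBRARY_PATH` on every run, the usual Linux fix is to install the library into a directory the dynamic linker already searches and refresh the linker cache. This is a generic sketch, not something from llama's docs; `/usr/local/lib` is an assumption, and any directory listed in `/etc/ld.so.conf` would work:

```shell
# Copy the shared library somewhere the dynamic linker searches by default
# (/usr/local/lib is conventional, but an assumption here):
sudo cp libllamagui.so /usr/local/lib/
# Rebuild the linker's cache so the new library is picked up:
sudo ldconfig
# Now the binary should run without LD_LIBRARY_PATH:
./llama-ui
```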
Nice!! :smile:
Now I'm getting a panic:
I included the backtrace in case it's useful
RUST_BACKTRACE=1 LD_LIBRARY_PATH=. ./llama-ui
thread 'main' panicked at 'called `Option::unwrap()` on a `None` value', llama-ui/main.rs:263:16
stack backtrace:
0: backtrace::backtrace::libunwind::trace
at /cargo/registry/src/github.com-1ecc6299db9ec823/backtrace-0.3.46/src/backtrace/libunwind.rs:86
1: backtrace::backtrace::trace_unsynchronized
at /cargo/registry/src/github.com-1ecc6299db9ec823/backtrace-0.3.46/src/backtrace/mod.rs:66
2: std::sys_common::backtrace::_print_fmt
at src/libstd/sys_common/backtrace.rs:78
3: <std::sys_common::backtrace::_print::DisplayBacktrace as core::fmt::Display>::fmt
at src/libstd/sys_common/backtrace.rs:59
4: core::fmt::write
at src/libcore/fmt/mod.rs:1069
5: std::io::Write::write_fmt
at src/libstd/io/mod.rs:1504
6: std::sys_common::backtrace::_print
at src/libstd/sys_common/backtrace.rs:62
7: std::sys_common::backtrace::print
at src/libstd/sys_common/backtrace.rs:49
8: std::panicking::default_hook::{{closure}}
at src/libstd/panicking.rs:198
9: std::panicking::default_hook
at src/libstd/panicking.rs:218
10: std::panicking::rust_panic_with_hook
at src/libstd/panicking.rs:511
11: rust_begin_unwind
at src/libstd/panicking.rs:419
12: core::panicking::panic_fmt
at src/libcore/panicking.rs:111
13: core::panicking::panic
at src/libcore/panicking.rs:54
14: core::option::Option<T>::unwrap
at /home/grayjack/.rustup/toolchains/nightly-x86_64-unknown-linux-gnu/lib/rustlib/src/rust/src/libcore/macros/mod.rs:34
15: llama_ui::main
at llama-ui/main.rs:263
16: std::rt::lang_start::{{closure}}
at /home/grayjack/.rustup/toolchains/nightly-x86_64-unknown-linux-gnu/lib/rustlib/src/rust/src/libstd/rt.rs:67
17: std::rt::lang_start_internal::{{closure}}::{{closure}}
at src/libstd/rt.rs:52
note: Some details are omitted, run with `RUST_BACKTRACE=full` for a verbose backtrace.
Oh!!! Looks like it wanted a file as an argument.
Then I passed a `crossbar9.firm` as the parameter, and it gives me another panic with an error message of `Could not open file /home/grayjack/.config/llama/sd.fat`. What does `sd.fat` have to be? A FAT filesystem image, a config file?
There are a bunch of files that llama looks for on boot:
- `sd.fat`, a FAT image of the SD card
- `nand.bin`, a NAND dump
- `nand-cid.bin`, a 16-byte file with the CID of the NAND, used for decrypting it
- `otp.bin`, a dump of the OTP

I don't remember if they're optional or not, but also:
- `boot11.bin`
- `boot9.bin`
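If you don't have a dump of a real card, a blank `sd.fat` can be made with standard tools. This is a sketch, not taken from llama's docs: the 64 MiB size is an arbitrary choice, and `mkfs.fat` (from the dosfstools package) is assumed to be installed:

```shell
# Make sure the directory llama complained about exists:
mkdir -p ~/.config/llama
# Create a blank 64 MiB image file (pick whatever size you want):
dd if=/dev/zero of=~/.config/llama/sd.fat bs=1M count=64
# Format it as a FAT filesystem:
mkfs.fat ~/.config/llama/sd.fat
```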
Can these be New 3DS ones, or just Old 3DS?
I haven't tested anything from a New 3DS, but if you supply them, llama will at least be able to start running.