wormhole
wasm-gen: no prebuilt wasm-opt binaries are available for this platform: Unrecognized target! [Apple M1 chip MacOSX]
I am unable to build on an Apple M1 machine because wasm-opt doesn't have pre-built binaries for the M1 chip and throws an "Unrecognized target" error. The same build works fine on non-Apple-silicon Macs.
Here is the end of the log file of the wasm-gen container when run on Apple Silicon:
#19 112.6 Compiling wasm-bindgen-wasm-interpreter v0.2.76
#19 112.7 Compiling wasm-bindgen-threads-xform v0.2.76
#19 114.1 Compiling wasm-bindgen-cli-support v0.2.76
#19 126.7 Compiling wasm-bindgen-cli v0.2.76
#19 131.4 Finished release [optimized] target(s) in 1m 22s
#19 131.4 Installing /root/.cache/.wasm-pack/.wasm-bindgen-cargo-install-0.2.76/bin/wasm-bindgen
#19 131.4 Installing /root/.cache/.wasm-pack/.wasm-bindgen-cargo-install-0.2.76/bin/wasm-bindgen-test-runner
#19 131.4 Installing /root/.cache/.wasm-pack/.wasm-bindgen-cargo-install-0.2.76/bin/wasm2es6js
#19 131.4 Installed package `wasm-bindgen-cli v0.2.76` (executables `wasm-bindgen`, `wasm-bindgen-test-runner`, `wasm2es6js`)
#19 131.5 warning: be sure to add `/root/.cache/.wasm-pack/.wasm-bindgen-cargo-install-0.2.76/bin` to your PATH to be able to run the installed binaries
#19 131.6 Error: no prebuilt wasm-opt binaries are available for this platform: Unrecognized target!
#19 131.6 To disable `wasm-opt`, add `wasm-opt = false` to your package metadata in your `Cargo.toml`.
#19 ERROR: executor failed running [/bin/sh -c cd bridge/program && /usr/local/cargo/bin/wasm-pack build --target bundler -d bundler -- --features wasm]: exit code: 1
------
> [build 12/21] RUN --mount=type=cache,target=/root/.cache --mount=type=cache,target=bridge/target cd bridge/program && /usr/local/cargo/bin/wasm-pack build --target bundler -d bundler -- --features wasm:
------
executor failed running [/bin/sh -c cd bridge/program && /usr/local/cargo/bin/wasm-pack build --target bundler -d bundler -- --features wasm]: exit code: 1
Error: exit status 1
tilt docker build -- -f Dockerfile.wasm -o type=local,dest=.. . exited with exit code 1
Build Failed: Command "tilt docker build -- -f Dockerfile.wasm -o type=local,dest=.. ." failed: exit status 1
If I'm missing something obvious or have skipped installing a dependency, please let me know and I'm happy to retry with the correct configuration.
As a possible workaround, I've considered disabling wasm-opt, but it wasn't clear which Cargo.toml file to update, since each directory has its own Cargo.toml.
Please note that this works perfectly on non-Apple-silicon Macs.
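For anyone else hunting for the wasm-opt = false switch mentioned in the error message: wasm-pack reads it from the metadata of the crate it is invoked on, which per the log above is bridge/program. A minimal sketch of the entry (assuming that crate's Cargo.toml is the right one for your build):

```toml
# bridge/program/Cargo.toml (sketch) -- path taken from the failing
# wasm-pack invocation in the log above.
[package.metadata.wasm-pack.profile.release]
wasm-opt = false
```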
Hi Nikhil. You have to disable all the wasm-opts in the Solana directory. Overall we don’t support running the local testnet on M1 at this time (you won’t be able to run Solana). Is there something specific you were looking to run?
Hi Evan, I'm looking to see how we can integrate Stellar into Wormhole and connect to the networks supported by Wormhole (Solana and Ethereum for now).
My plan was to run Wormhole locally, get the necessary containers for the Stellar nodes up and running, and connect them to the guardian by making some local code changes.
I'm unable to get the guardian up and running in Tilt because it depends on Solana, which is blocked on the wasm-opt issue described here.
If I disable wasm-opt and forgo the Solana connection in my local environment for now, would it be possible to connect to the guardian and proceed with my attempt to integrate Stellar locally? i.e. maybe I can just disable the wasm-gen and solana-devnet containers in the Tilt configuration to move forward?
PS: if you have an alternative approach I can take, I'm all ears to suggestions
Yep! Totally possible. On my M1 Mac I basically do that to get myself out of a pinch. Just comment out / delete all the Solana-related stuff in the Tilt file and everything should be fine (your guardian will have a warning). Hopefully when I have some spare time I can make a diff patch for M1 (or maybe you could help me out and post one here in the comments after you get things running 🙏!)
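As a sketch of what "comment out the Solana stuff" could look like in practice: Tiltfiles are Starlark, so the relevant resources can also be gated behind a flag instead of deleted. The resource names and paths below are illustrative assumptions, not copied from Wormhole's actual Tiltfile:

```python
# Tiltfile (sketch) -- resource names and paths are illustrative
# assumptions; match them to the real entries in Wormhole's Tiltfile.
solana_enabled = False  # flip to True on x86 hosts

if solana_enabled:
    # Build and deploy the Solana devnet only when enabled.
    docker_build("solana-devnet", "solana/")
    k8s_yaml("devnet/solana-devnet.yaml")
    k8s_resource("solana-devnet", port_forwards=8899)
```

docker_build, k8s_yaml, and k8s_resource are standard Tilt built-ins, so guarding them with a plain if is enough to make Tilt skip those resources entirely.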
PS: if you have an alternative approach I can take, I'm all ears to suggestions
Deploy it to an x86 virtual machine, like a GCP or AWS instance. These instructions should work for Debian/Ubuntu and probably other distros as well: https://github.com/certusone/wormhole/blob/4f3ff8bb2ed561c872074fc6ddc51189424c6d34/DEVELOP.md#getting-started-on-a-development-vm
I figured out the issue and put a PR up here: https://github.com/rustwasm/wasm-pack/pull/1102.
Will have to wait for wasm-pack to merge it upstream, but essentially the issue is that when building the Dockerfile.wasm on an M1 Mac, wasm-pack treats the target as aarch64-linux, which it currently doesn't handle, even though wasm-opt does have pre-built binaries for aarch64-linux.
If you want to run Wormhole anyways, you can modify this line https://github.com/certusone/wormhole/blob/aa0537284fca3f5881e8a773ace7d15401cc9f0f/solana/Dockerfile.wasm#L10 into
RUN cargo install wasm-pack --git https://github.com/Nasafato/wasm-pack --rev 3122e7760051904394da3f6f79903b3036403377
Though, now that building solana-devnet can proceed, that build also errors out:
RUN make SOLANA=~/.local/share/solana/install/active_release/bin OUT_DIR=../target && \
cp ../target/oracle.so /opt/solana/deps/pyth_oracle.so
ENV RUST_LOG="solana_runtime::system_instruction_processor=trace,solana_runtime::message_processor=trace,solana_bpf_loader=debug,solana_rbpf=debug"
ENV RUST_BACKTRACE=1
Building image
resolve image config for docker.io/docker/dockerfile:1.2@sha256:e2a8561e419ab1ba6b2fe6cbdf49fd92b95912df1cf7d313c3e2230a333fdbcc
docker-image://docker.io/docker/dockerfile:1.2@sha256:e2a8561e419ab1ba6b2fe6cbdf49fd92b95912df1cf7d313c3e2230a333fdbcc
[stage-0 1/13] FROM docker.io/library/rust:1.49@sha256:a50165ea96983c21832578afb1c8c028674c965bc1ed43b607871b1f362e06a5
[stage-0 2/13] RUN apt-get update && apt-get install -y clang libssl-dev libudev-dev llvm pkg-config zlib1g-dev && rm -rf /var/lib/apt/lists/* && rustup component add rustfmt && rustup default nightly-2021-08-01 [cached]
[stage-0 3/13] RUN sh -c "$(curl -sSfL https://release.solana.com/v1.8.1/install)"
https://github.com/pyth-network/pyth-client/archive/31e3188bbf52ec1a25f71e4ab969378b27415b0a.tar.gz [done: 826ms]
→ downloading v1.8.1 installer
→ qemu-x86_64: Could not open '/lib64/ld-linux-x86-64.so.2': No such file or directory
ERROR IN: [stage-0 3/13] RUN sh -c "$(curl -sSfL https://release.solana.com/v1.8.1/install)"
Build Failed: ImageBuild: executor failed running [/bin/sh -c sh -c "$(curl -sSfL https://release.solana.com/v1.8.1/install)"]: exit code: 255
Which I believe is because it's building for aarch64-linux again. I suspect this can be debugged and fixed as well, since many of us have Solana installed on our M1 Macs.
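One thing that may be worth trying (an assumption on my part, not something confirmed in this thread): the qemu-x86_64 error above usually means Docker pulled an arm64 base image and then tried to run an x86_64 binary inside it without the x86_64 dynamic linker present. Forcing the whole image to build for linux/amd64 keeps the loader and the downloaded Solana binaries consistent, at the cost of running the build under emulation:

```shell
# Build the image as x86_64 on Apple Silicon. The Dockerfile path and
# tag are illustrative; --platform is a standard Docker BuildKit option.
docker build --platform linux/amd64 -f Dockerfile -t solana-devnet .
```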
Issue is outdated (wasm is no longer used for the Solana SDK). Please reopen if necessary.