exec: Exec format error
Hi,
I'm trying to make my jar executable by prepending a launch script to it, as specified in the README of this project.
I've added the following lines to my build.sbt:
import sbtassembly.AssemblyPlugin.defaultUniversalScript
assemblyOption in assembly := (assemblyOption in assembly).value.copy(prependShellScript = Some(defaultUniversalScript(shebang = false)))
assemblyJarName in assembly := s"${name.value}-${version.value}"
Building works fine, but actually trying to execute the generated file results in:
Failed to execute process './spark-benchmark-cli-0.1'. Reason:
exec: Exec format error
The file './spark-benchmark-cli-0.1' is marked as an executable but could not be run by the operating system.
I'm running CentOS, if that is of any help. The output above was generated by the fish shell.
Using the bash shell I simply get:
Error: Invalid or corrupt jarfile ./spark-benchmark-cli-0.1
This probably happens because the file has no shebang header. Without the shebang, exec cannot determine which program should execute the file. Either set shebang to true or execute the file with bash or sh explicitly (sh ./spark-benchmark-cli-0.1).
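Roughly, the shebang variant would look like this in your build.sbt (just a sketch, assuming the same sbt-assembly setup you already have):
import sbtassembly.AssemblyPlugin.defaultUniversalScript
// prepend a launcher that starts with a shebang line, so exec knows which interpreter to use
assemblyOption in assembly := (assemblyOption in assembly).value.copy(prependShellScript = Some(defaultUniversalScript(shebang = true)))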
@LolHens As described in the issue, I've already tried that. I get an "Invalid or corrupt jarfile" response with either of your suggested solutions.
Sorry, I might have misread your post. I was able to reproduce your issue. Prepending defaultShellScript instead of defaultUniversalScript also didn't work. In fact, even prepending just a newline to the jarfile broke it. With nothing prepended, the jarfile starts just fine.
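For reference, the defaultShellScript variant I tried was roughly this (a sketch against the same build.sbt; defaultShellScript takes no shebang parameter):
import sbtassembly.AssemblyPlugin.defaultShellScript
// Unix-only launch script instead of the universal (sh + .bat) one
assemblyOption in assembly := (assemblyOption in assembly).value.copy(prependShellScript = Some(defaultShellScript))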
I also thought the issue might be with my Java version, so I tried running another jarfile that had a script header, and it worked just fine.
I can't tell what's wrong with your jar (maybe it's too big?), but it appears to be some weird edge case.