Unsupported major.minor version 52.0 when running Spark on Angel
How do I specify Java 8 when submitting an application with spark-submit?
Environment:
- Java version:
- Scala version:
- Spark version:
- PyTorch and Python version:
- OS and version:
Checklist:
- Did you check if your bug/feature/question is answered in the FAQ?
- Did you search existing issues to see if someone has discussed your bug/feature/question before?
- If your bug/question is about installation, did you read this doc?
- If your bug/question is about parameter setting, did you read this doc?
Your Bug/Feature request/Question: Please describe the bug/enhancement/question in detail.
- For bugs, please post the submit command, error logs, and the related code snippet.
- For feature requests, please describe your scenario and why you need this feature. If the requested feature is large, please contact us via the email list.
When using Spark on Angel, how do I specify the Java version? angel.am.env and angel.worker.env don't work; Spark doesn't recognize these parameters.
I hit the same problem when setting the JDK version; angel.am.env and angel.worker.env don't seem to work.
You should use the spark.hadoop prefix in the Spark conf, for example: spark.hadoop.angel.am.env and spark.hadoop.angel.worker.env. Spark forwards any spark.hadoop.* properties into the Hadoop Configuration, which is where Angel reads its parameters from, so the prefixed form reaches the Angel master and workers.
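To make this concrete, here is a minimal sketch of a spark-submit invocation that points the Angel master and workers at a Java 8 JDK through those prefixed properties. The JDK path, class name, and jar name are placeholders for illustration, and the single KEY=VALUE form of the env string is an assumption; adjust all of them to your cluster and check Angel's config docs for the exact env-string format.

```bash
# Sketch only: paths and names below are hypothetical; substitute your own.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --conf spark.hadoop.angel.am.env="JAVA_HOME=/usr/local/jdk1.8.0" \
  --conf spark.hadoop.angel.worker.env="JAVA_HOME=/usr/local/jdk1.8.0" \
  --class com.example.YourApp \
  your-app.jar
```

With JAVA_HOME resolving to a Java 8 installation on the Angel side, the classes compiled for major.minor version 52.0 (Java 8 bytecode) should load without the "Unsupported major.minor version" error.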