
Unsupported major.minor version 52.0 when running Spark on Angel

fennuzhichui opened this issue 5 years ago • 3 comments

How do I specify Java 8 when submitting an application with spark-submit?



Environment:

  • Java version:
  • Scala version:
  • Spark version:
  • PyTorch and Python version:
  • OS and version:

Checklist:

  • Did you check whether your bug/feature/question is answered in the FAQ?
  • Did you search the issues to see whether someone has discussed your bug/feature/question before?
  • If your bug/question is about installation, did you read this doc?
  • If your bug/question is about parameter settings, did you read this doc?

Your Bug/Feature request/Question: Please describe the bug/enhancement/question in detail.

  • For bugs, please post the submit commands, error logs, and related code snippets
  • For feature requests, please describe your scenario and why you need the feature. If the requested feature is large, please contact us via the mailing list.

fennuzhichui avatar Jan 02 '20 10:01 fennuzhichui

When using Spark on Angel, how do I specify the Java version? angel.am.env and angel.worker.env don't work; Spark doesn't recognize these parameters.

fennuzhichui avatar Jan 02 '20 10:01 fennuzhichui

I hit the same problem when setting the JDK version; angel.am.env and angel.worker.env don't seem to work.

flyfoxCI avatar May 17 '20 23:05 flyfoxCI

You should add the spark.hadoop prefix to these keys in the Spark conf, for example: spark.hadoop.angel.am.env and spark.hadoop.angel.worker.env.
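A minimal sketch of what that could look like on the spark-submit command line. This assumes a YARN deployment where a Java 8 runtime is shipped alongside the job; the archive name `jdk8.zip`, the `JAVA_HOME=./jdk8` value, and the application jar path are hypothetical placeholders for your environment, not Angel defaults:

```shell
# Hedged sketch: archive name, JAVA_HOME value, and jar path are placeholders.
spark-submit \
  --master yarn \
  --archives hdfs:///path/to/jdk8.zip#jdk8 \
  --conf spark.hadoop.angel.am.env="JAVA_HOME=./jdk8" \
  --conf spark.hadoop.angel.worker.env="JAVA_HOME=./jdk8" \
  --class com.example.YourApp \
  your-app.jar
```

The spark.hadoop. prefix matters because Spark copies any spark.hadoop.* entries into the Hadoop Configuration it passes to the cluster, which is where Angel reads angel.am.env and angel.worker.env from; without the prefix, Spark rejects the keys as unknown parameters, as described in the comments above.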

ouyangwen-it avatar Oct 29 '20 07:10 ouyangwen-it