Shivaram Venkataraman
I think it would be great if we could change the default to not be the failure case -- Can you send a PR changing the default hadoop version to...
I think we should support it. Please open a PR if you have a chance to add support for this.
I don't think spark-ec2 is designed to run with the client on your machine and a cluster on EC2. It's designed to run with the client on the master machine...
@wigor would you be interested in sending a PR for this?
Thanks for the PR. I'm not sure this is the right fix. I think the right fix is to just remove the thing about ec2 dir in Spark and just...
I think Tachyon 0.8.2 is very old, and the new versions of Alluxio have their own setup, which is incompatible with existing spark-ec2 code. I am not sure this change...
@dashcode Thanks for the PR. However, I think for using S3a we need a hadoop-aws-sdk package, which needs to be installed separately from HDFS? Or in other words, did...
I think this is because the Amazon VPC cluster might not have access to the internet.
Yeah it looks like that file is missing -- @JoshRosen can you help in uploading `spark-1.6.2-bin-hadoop1.tgz` to the S3 bucket `spark-related-packages` ?