vagrant-hadoop-spark-hive
Version updates and minor bug fixes
I worked on bumping the CentOS, Spark, Hadoop, and Java versions, and also corrected a few minor bugs, including JPS not installing and YARN not working with Java 8.
I get this error when I try to run `vagrant up` (tried a few times):
```
==> node1: Setting hostname...
==> node1: Configuring and enabling network interfaces...
==> node1: Mounting shared folders...
    node1: /vagrant => /Users/aholmes/me/src/me/vagrant-spark-hadoop
Failed to mount folders in Linux guest. This is usually because
the "vboxsf" file system is not available. Please verify that
the guest additions are properly installed in the guest and
can work properly. The command attempted was:

mount -t vboxsf -o uid=`id -u vagrant`,gid=`getent group vagrant | cut -d: -f3` vagrant /vagrant
mount -t vboxsf -o uid=`id -u vagrant`,gid=`id -g vagrant` vagrant /vagrant

The error output from the last command was:

mount: unknown filesystem type 'vboxsf'
```
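The `unknown filesystem type 'vboxsf'` error means the VirtualBox Guest Additions kernel module is not installed (or no longer matches the guest kernel), so the shared-folder mount fails. The vagrant-vbguest plugin rebuilds the Guest Additions inside the guest automatically. A minimal guard along these lines (a hypothetical sketch, not code from this repo) makes the dependency explicit:

```ruby
# Hypothetical Vagrantfile guard: fail fast with a helpful message
# when the vagrant-vbguest plugin is not installed on the host.
unless Vagrant.has_plugin?("vagrant-vbguest")
  raise "vagrant-vbguest plugin is required; " \
        "install it with: vagrant plugin install vagrant-vbguest"
end
```

This only checks the host-side plugin; the plugin itself then handles installing or rebuilding the Guest Additions during `vagrant up`.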
I re-tested the master branch and `vagrant up` seems to work there. Unfortunately I don't have much time to dig into why this PR yields this error.
Ah yes, sorry about that. I had the vagrant-vbguest plugin installed locally but neglected to declare it in the Vagrantfile itself. I've now fixed this and verified that the added line installs the plugin and that the plugin fixes the error.
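For context, the kind of line meant here might look like the following (an assumption on my part — the exact line isn't shown in this thread): since Vagrant 2.2.0, a Vagrantfile can declare required plugins, and `vagrant up` will offer to install any that are missing.

```ruby
# Hypothetical Vagrantfile fragment declaring the plugin dependency,
# so Vagrant auto-installs vagrant-vbguest on `vagrant up` (Vagrant >= 2.2.0).
Vagrant.configure("2") do |config|
  config.vagrant.plugins = ["vagrant-vbguest"]
end
```

Declaring the plugin in the Vagrantfile means new users of the repo don't have to know to run `vagrant plugin install vagrant-vbguest` by hand before their first `vagrant up`.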