spark-integration-tests
Integration tests for Spark
What versions of Spark are supported? As of March 2017, it seems like the last commit was in February 2015. Is there any other open-source effort out there to do...
In the README you suggest `route -n add 172.17.0.0/16 boot2docker ip` to fix container name resolution problems from the host. So far I have had no luck getting something similar up...
Following the instructions, installing boot2docker v1.3.2 on OS X El Capitan (10.11.4) fails. Message from install.log: install:didFailWithError:Error Domain=PKInstallErrorDomain Code=112 "An error occurred while running scripts from the package “Boot2Docker-1.3.2.pkg”."...
The current approach of adding `SPARK_HOME` as an SBT subproject makes it difficult to properly change Spark versions, since it makes it very easy to wind up using assembly JARs...
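For contrast, here is a rough sketch of one possible alternative: a `build.sbt` fragment that depends on a published Spark artifact whose version is chosen at invocation time, rather than building against a `SPARK_HOME` checkout. The property name and version numbers are placeholders, not part of this project's actual build.

```scala
// Hypothetical build.sbt sketch: pick the Spark version from a system
// property, e.g. `sbt -Dspark.version=1.3.1 test`, with a default otherwise.
val sparkVersion = sys.props.getOrElse("spark.version", "1.2.0")

libraryDependencies += "org.apache.spark" %% "spark-core" % sparkVersion
```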
We should automatically check that container IP addresses are properly routed to the boot2docker VM before running tests. If this is misconfigured, then tests can mysteriously hang or fail, which...
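A rough sketch of such a pre-flight check (the object and method names are hypothetical, not part of the project): try to open a TCP connection from the host to a known service port on a container, and fail fast with an actionable message instead of letting tests hang.

```scala
import java.net.{InetSocketAddress, Socket}

import scala.util.control.NonFatal

object RoutingPreflightCheck {
  // Attempt a TCP connection from the host to a service port on a container.
  // If the connection cannot be established within the timeout, routing to
  // the boot2docker VM is probably misconfigured.
  def assertContainerReachable(containerIp: String, port: Int, timeoutMs: Int = 2000): Unit = {
    val socket = new Socket()
    try {
      socket.connect(new InetSocketAddress(containerIp, port), timeoutMs)
    } catch {
      case NonFatal(e) =>
        throw new IllegalStateException(
          s"Cannot reach $containerIp:$port from the host (${e.getMessage}); " +
          "container traffic is probably not routed to the boot2docker VM.", e)
    } finally {
      socket.close()
    }
  }
}
```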
We need to figure out how to run these tests on Linux in order to run them on EC2 / in Jenkins.
As the `NetworkFaultInjector` grows in complexity (I'm adding host container fault injection, for example), we should add standalone tests of the fault injector itself. These tests should only depend on...
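A small helper sketch that such standalone tests could build on (the names here are hypothetical and not the `NetworkFaultInjector` API): it depends only on the Docker CLI, probing whether one running container can ping another, so a test can assert connectivity before an injected fault and its absence afterwards. It assumes the container images include the `ping` utility.

```scala
import scala.sys.process._

object ContainerConnectivityProbe {
  // IP address that Docker assigned to a running container.
  def containerIp(containerName: String): String =
    Seq("docker", "inspect", "--format", "{{ .NetworkSettings.IPAddress }}", containerName).!!.trim

  // True if `fromContainer` can reach `toContainer` with a single ping (2s deadline).
  def canReach(fromContainer: String, toContainer: String): Boolean =
    Seq("docker", "exec", fromContainer,
        "ping", "-c", "1", "-w", "2", containerIp(toContainer)).! == 0
}
```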
Our tests should check that the required Docker images are installed before attempting to launch containers that use those images. I believe that Docker will automatically install missing images from...
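A sketch of what such a check could look like (object and method names are hypothetical): list the images already present locally via `docker images` and fail with a clear message if any required image is missing, rather than relying on an implicit pull at container-launch time.

```scala
import scala.sys.process._

object DockerImageCheck {
  // Parse `docker images` output into "repository:tag" strings, skipping the header row.
  def localImages(): Set[String] =
    Seq("docker", "images").!!.split("\n").drop(1).flatMap { line =>
      line.trim.split("\\s+").toList match {
        case repo :: tag :: _ => Some(s"$repo:$tag")
        case _                => None
      }
    }.toSet

  // Fail fast with an explicit message if any required image is not installed.
  def assertImagesInstalled(required: Seq[String]): Unit = {
    val missing = required.filterNot(localImages())
    require(missing.isEmpty,
      s"Missing Docker images: ${missing.mkString(", ")}. " +
      "Pull or build them before running the integration tests.")
  }
}
```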