...

  • "[test-maven]" - signals to test the pull request using maven
  • "[test-hadoop1.0]" - signals to test using Spark's Hadoop 1.0 profile (other options include Hadoop 2.0, 2.2, and 2.3)
Running Docker integration tests

To run the Docker integration tests, you must install the Docker engine on your machine. Installation instructions can be found at https://docs.docker.com/engine/installation/. Once installed, the Docker service needs to be started, if it is not already running. On Linux, this can be done with sudo service docker start.
These integration tests run as part of a regular Spark unit test run, so the Docker engine must be installed and running if you want all Spark tests to pass.
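The service-start command above is for SysV-style init; hosts using systemd start services with systemctl instead. A minimal sketch of choosing the right command on a Linux host (docker_start_command is a hypothetical helper for illustration, not part of Spark or Docker):

```shell
# Minimal sketch, assuming a Linux host: pick the command used to start
# the Docker service based on the init system in use. systemd hosts use
# systemctl; older SysV-style hosts use the service wrapper.
# docker_start_command is a hypothetical helper, not an existing tool.
docker_start_command() {
  case "$1" in
    systemd) echo "sudo systemctl start docker" ;;
    *)       echo "sudo service docker start" ;;
  esac
}

docker_start_command systemd   # on systemd hosts
docker_start_command sysv      # on SysV-init hosts
```

Either command requires root privileges; once the daemon is running, the Docker-backed suites execute as part of the normal test run.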

Organizing Imports

You can use the IntelliJ Imports Organizer plugin from Aaron Davidson to help organize the imports in your code. It can be configured to match the import ordering from the style guide.

...