...
- Break your work into small, single-purpose patches if possible. It is much harder to review and merge a large change that bundles many disjoint features.
- Create a JIRA for your patch on the Spark Project JIRA.
- Submit the patch as a GitHub pull request. For a tutorial, see the GitHub guides on forking a repo and sending a pull request. Title your pull request with the JIRA name, and include the Spark module and/or WIP if relevant, for example:
SPARK-123: Add some feature to Spark
[STREAMING] SPARK-123: Add some feature to Spark streaming
[MLLIB] [WIP] SPARK-123: Some potentially useful feature for MLLib
- Follow the Spark Code Style Guide.
- Make sure that your code passes the unit tests. You can run the tests with
sbt/sbt assembly
and then
sbt/sbt test
in the root directory of Spark. It's important to run
assembly
first, as some of the tests depend on compiled JARs.
- Add new unit tests for your code. We use ScalaTest for testing. Just add a new Suite in
core/src/test
, or methods to an existing Suite.
- Update the documentation (in the
docs
folder) if you add a new feature or configuration parameter.
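As a sketch of what a new ScalaTest Suite might look like (the class name, test name, and tested values here are hypothetical illustrations, not code from Spark itself):

```scala
import org.scalatest.FunSuite

// Hypothetical example suite. In Spark, a file like this would live
// under core/src/test alongside the existing suites.
class MyFeatureSuite extends FunSuite {

  test("my feature computes the expected value") {
    // Replace this with assertions against your actual feature.
    val result = Seq(1, 2, 3).sum
    assert(result === 6)
  }
}
```

A suite like this is picked up automatically when you run sbt/sbt test; you can also add individual test("...") methods to an existing Suite instead of creating a new file.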
...