Reducing Build Times
Spark's default build strategy is to assemble a jar including all of its dependencies. This can be cumbersome when doing iterative development. When developing locally, it is possible to create an assembly jar including all of Spark's dependencies and then re-package only Spark itself when making changes.
$ sbt/sbt clean assemble-deps
$ sbt/sbt package
# ... do some local development ... #
$ sbt/sbt package
# ... do some local development ... #
$ sbt/sbt package
# ...
# You can also use ~ to let sbt do incremental builds on file changes
# without running a new sbt session every time
$ sbt/sbt ~package
Running Individual Tests
Often it is useful to run individual tests in Maven or SBT.
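For example, a single ScalaTest suite can be selected by its fully qualified class name. The suite name below (`org.apache.spark.rdd.RDDSuite`) is only an illustration; substitute the suite you are working on. The sbt command uses the older `test-only` task name that matched the `sbt/sbt` launcher era; newer sbt versions call it `testOnly`.

```shell
# Run one suite with sbt (hypothetical suite name for illustration)
$ sbt/sbt "test-only org.apache.spark.rdd.RDDSuite"

# Run one suite with Maven via the ScalaTest plugin, skipping JUnit tests
$ mvn test -DwildcardSuites=org.apache.spark.rdd.RDDSuite -Dtest=none
```

Both commands assume they are run from the root of a Spark checkout; the Maven form relies on the ScalaTest Maven plugin's `wildcardSuites` property to filter suites.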
...
Moved permanently to http://spark.apache.org/developer-tools.html