Reducing Build Times

Spark's default build strategy is to assemble a jar that includes all of its dependencies, which makes iterative development slow. When developing locally, you can instead build the dependency assembly jar once and then re-package only Spark itself when making changes.

Code Block (bash): Fast Local Builds
$ sbt/sbt clean assemble-deps
$ sbt/sbt package
# ... do some local development ... #
$ sbt/sbt package
# ... do some local development ... #
$ sbt/sbt package
# ...
 
# You can also use ~ to let sbt do incremental builds on file changes without running a new sbt session every time
$ sbt/sbt ~package
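
You can also keep a single interactive sbt session open and run tasks at its prompt, which avoids paying JVM and sbt startup cost on every invocation. A minimal sketch (lines starting with > are typed at the sbt prompt):

Code Block
$ sbt/sbt          # start one interactive session
> package          # re-run after each change, or:
> ~package         # rebuild automatically on file changes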

Running Individual Tests

It is often useful to run individual tests in Maven or sbt. The sbt test-only task and Maven's wildcardSuites property both accept a fully qualified suite name or a wildcard pattern:

Code Block
$ # sbt
$ sbt/sbt "test-only org.apache.spark.io.ComprsionCodecSuite"
$ sbt/sbt "test-only org.apache.spark.io.*"

$ # Maven
$ mvn clean test -DwildcardSuites=org.apache.spark.io.CompressionCodecSuite
$ mvn clean test -DwildcardSuites=org.apache.spark.io.*
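
When you need finer granularity than a whole suite, ScalaTest accepts runner arguments after -- in sbt; for example, -z runs only the tests whose names contain a given substring. A sketch, where the suite and substring are illustrative:

Code Block
$ # sbt: run only tests in the suite whose names contain "codec"
$ sbt/sbt "test-only org.apache.spark.io.CompressionCodecSuite -- -z codec"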

Generating Dependency Graphs

Code Block
$ # sbt
$ sbt/sbt dependency-tree

$ # Maven
$ mvn -DskipTests install
$ mvn dependency:tree
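
When tracking down where a conflicting dependency version comes from, Maven's dependency plugin can filter the tree to a single artifact via its includes parameter. A sketch, with an illustrative groupId:artifactId:

Code Block
$ # Maven: show only the paths that pull in Guava
$ mvn dependency:tree -Dincludes=com.google.guava:guava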

Running Build Targets For Individual Projects

...
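
Maven can build a single module with -pl (add -am to also build the modules it depends on), and sbt can scope any task to one project. A minimal sketch, assuming a module/project named core (the names are illustrative):

Code Block
$ # Maven: build only one module, plus the modules it depends on
$ mvn -pl core -am -DskipTests package

$ # sbt: run a task scoped to a single project
$ sbt/sbt core/package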

Moved permanently to http://spark.apache.org/developer-tools.html