...
```
$ ./build/sbt clean assembly        # Create a normal assembly
$ ./bin/spark-shell                 # Use spark with the normal assembly
$ export SPARK_PREPEND_CLASSES=true
$ ./bin/spark-shell                 # Now it's using compiled classes
# ... do some local development ... #
$ ./build/sbt compile
# ... do some local development ... #
$ ./build/sbt compile
$ unset SPARK_PREPEND_CLASSES
$ ./bin/spark-shell                 # Back to normal, using Spark classes from the assembly jar

# You can also use ~ to let sbt do incremental builds on file changes
# without running a new sbt session every time
$ ./build/sbt ~compile
```
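Because the only visible difference between the two modes is an environment variable, it is easy to forget which one a terminal is in. A minimal helper like the sketch below (not part of Spark; the script name and messages are purely illustrative) reports the mode before launching the shell:

```shell
#!/usr/bin/env bash
# Illustrative helper: report which classpath mode a new spark-shell
# session will use, based on whether SPARK_PREPEND_CLASSES is set.
if [ -n "${SPARK_PREPEND_CLASSES:-}" ]; then
  echo "Using freshly compiled classes (SPARK_PREPEND_CLASSES is set)"
else
  echo "Using classes from the assembly jar (SPARK_PREPEND_CLASSES is unset)"
fi
# ./bin/spark-shell "$@"   # then launch the shell as usual
```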
Note: in some earlier versions of Spark, fast local builds used an sbt task called …
Checking Out Pull Requests
...
```
# sbt
$ ./build/sbt "test-only org.apache.spark.io.CompressionCodecSuite"
$ ./build/sbt "test-only org.apache.spark.io.*"

# Maven, run Scala test
$ mvn test -DwildcardSuites=org.apache.spark.io.CompressionCodecSuite -Dtest=none
$ mvn test -DwildcardSuites=org.apache.spark.io.* -Dtest=none

# Maven, run Java test
$ mvn test -DwildcardSuites=none -Dtest=org.apache.spark.streaming.JavaAPISuite
```
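To narrow things further than a whole suite, ScalaTest's runner accepts a `-z` argument that selects only tests whose names contain a given substring, and sbt forwards runner arguments after `--`. A hedged sketch (the substring `"lz4"` is just an example of a test-name fragment):

```
$ ./build/sbt "test-only org.apache.spark.io.CompressionCodecSuite -- -z \"lz4\""
```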
Generating Dependency Graphs
```
# sbt
$ ./build/sbt dependency-tree

# Maven
$ mvn -DskipTests install
$ mvn dependency:tree
```
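When hunting for where one particular artifact enters the build, the full Maven tree can be filtered: the dependency plugin's `-Dincludes` parameter takes a `groupId:artifactId` pattern. The Guava coordinates below are just an example:

```
$ mvn dependency:tree -Dincludes=com.google.guava:guava
```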
Running Build Targets For Individual Projects
```
# sbt
$ ./build/sbt assembly/assembly

# Maven
$ mvn package -DskipTests -pl assembly
```
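If the project you target depends on other modules in the same source tree, Maven's `-am` (`--also-make`) flag builds those in-reactor dependencies first rather than resolving stale copies from the local repository. A sketch using `core` as an example module name:

```
$ mvn package -DskipTests -pl core -am
```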