
Code Block
# Build with sbt
$ sbt/sbt assembly/assembly
# Build with Maven
$ mvn package -DskipTests -pl assembly

Building Spark in IntelliJ IDEA

Many Spark developers use IntelliJ for day-to-day development and testing. Importing Spark into IntelliJ requires a few special steps due to the complexity of the Spark build.

  1. Download IntelliJ IDEA and install its Scala plugin.
  2. Go to "File -> Import Project", locate the Spark source directory, and select "Maven Project".
  3. Click through to the profiles selection, and select the following profiles: yarn, scala-2.10, hadoop-2.4, hive-thriftserver, hive-0.13.1. Click through to create the project.
  4. At the top of the leftmost pane, make sure the "Project/Packages" selector is set to "Packages".
  5. Right-click any package and choose "Open Module Settings" - from there you can modify any of the modules.
  6. A few of the modules need to be modified slightly from the default import.
    1. Add extra source directories to the following modules (in Module Settings, under the "Sources" tab, mark each listed directory as a source root):
      1. spark-hive: add v0.13.1/src/main/scala
      2. spark-hive-thriftserver: add v0.13.1/src/main/scala
      3. spark-repl: add scala-2.10/src/main/scala and scala-2.10/src/test/scala
    2. For spark-yarn, click "Add Content Root" and navigate in the filesystem to the yarn/common directory of the Spark source tree.
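For reference, the profile set chosen during the IntelliJ import can also be activated in a command-line Maven build. The sketch below is an assumption, not a verified invocation: it simply prints the Maven command built from the profile names listed in the import step above, and which of those profiles actually exist depends on your Spark version.

```shell
# Hedged sketch: composes (and prints, rather than runs) a Maven build
# command using the same profiles selected during the IntelliJ import.
# Profile names are taken from the import step above and may not all
# exist in other Spark versions.
PROFILES="-Pyarn -Pscala-2.10 -Phadoop-2.4 -Phive-thriftserver -Phive-0.13.1"
echo "mvn $PROFILES -DskipTests clean package"
# To run the build for real (requires Maven on PATH), drop the echo.
```

Keeping the command-line profiles in sync with the profiles selected in IntelliJ helps avoid the IDE and the terminal compiling against different Hadoop/Hive versions.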