...
```
$ # sbt
$ sbt/sbt assembly/assembly

$ # Maven
$ mvn package -DskipTests -pl assembly
```
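If the build succeeds, the assembly jar ends up under the assembly module's target directory. A minimal check is sketched below; the scala-2.10 path segment and the jar name pattern are assumptions that depend on the Scala, Spark, and Hadoop versions configured in your build.

```
$ # Assumption: a Scala 2.10 build; the exact jar name varies with the Spark and Hadoop versions
$ ls assembly/target/scala-2.10/spark-assembly-*.jar
```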
Building Spark in IntelliJ IDEA
Many Spark developers use IntelliJ IDEA for day-to-day development and testing of Spark. Importing Spark into IntelliJ requires a few special steps due to the complexity of the Spark build.
- Download IntelliJ IDEA and install the Scala plug-in.
- Go to "File -> Import Project", locate the spark source directory, and select "Maven Project".
- Click through to the profiles selection and select the following profiles: yarn, scala-2.10, hadoop-2.4, hive-thriftserver, hive-0.13.1 (the same profiles can also be activated in a command-line build; see the sketch after this list). Click through to create the project.
- At the top of the leftmost pane, make sure the "Project/Packages" selector is set to "Packages".
- Right-click on any package and click "Open Module Settings"; you will be able to modify any of the modules here.
- A few of the modules need to be modified slightly from the default import.
- Add source directories to the following modules. For each module, open the "Sources" tab and add the directory listed below as a source root:
  - spark-hive: add v0.13.1/src/main/scala
  - spark-hive-thriftserver: add v0.13.1/src/main/scala
  - spark-repl: add scala-2.10/src/main/scala and scala-2.10/src/test/scala
- For spark-yarn, click "Add content root" and navigate in the filesystem to the yarn/common directory of the Spark source tree.
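The profiles selected during the import can also be exercised from the command line, which is a quick way to confirm that the source tree compiles with that combination before relying on the IDE. This is only a sketch: the profile IDs below simply mirror the ones listed in the import step and may differ across Spark versions, so check them against your checkout's pom.xml.

```
$ # Assumption: these profile IDs (yarn, scala-2.10, hadoop-2.4, hive-thriftserver, hive-0.13.1) exist in pom.xml
$ mvn -Pyarn -Pscala-2.10 -Phadoop-2.4 -Phive-thriftserver -Phive-0.13.1 -DskipTests package
```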