
...

While many Spark developers use SBT or Maven on the command line, the most common IDE among us is IntelliJ IDEA. You can get the Community Edition for free (Apache committers can get free IntelliJ Ultimate Edition licenses), then install the JetBrains Scala plugin from Preferences > Plugins. To generate an IDEA workspace for Spark, run

Code Block
sbt/sbt update gen-idea

To create a Spark project for IntelliJ, check out the repository and use IntelliJ's import functionality to import Spark as a Maven project. When you build the project, you might get a warning about the "test and compile output paths" being the same for the "root-build" module. To fix it, open File -> Project Structure and change the output path of the root-build module to <spark-home>/project/target/idea-test-classes instead of idea-classes.
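The checkout and import steps above can be sketched as follows. This is a minimal sketch assuming a Unix-like shell; the repository URL shown is the standard Apache GitHub mirror, and the sbt invocation is the one given earlier on this page:

```shell
# Clone the Spark source tree (assumes the standard Apache GitHub mirror)
git clone https://github.com/apache/spark.git
cd spark

# Option 1: generate IntelliJ project files with sbt,
# then open the resulting folder in IDEA
sbt/sbt update gen-idea

# Option 2: skip gen-idea and instead use IntelliJ's import dialog,
# pointing it at the top-level pom.xml to import Spark as a Maven project
```

Either path ends with the same project open in IDEA; the Project Structure fix for the root-build output path described above applies in both cases.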

Eclipse

Eclipse can be used to develop and test Spark. The following configuration is known to work:

...