Eclipse setup (without SBT eclipse plugin)
Install and prepare Eclipse
- Download and install Eclipse (Indigo or Juno is recommended);
- Install the Scala IDE plugin as described here. Make sure to get the right bundle / update site according to your Eclipse version;
- Install the Apache IvyDE plugin as described here. This allows Eclipse to automatically create classpath containers from the generated ivy files;
Checkout Kafka source
- git clone http://git-wip-us.apache.org/repos/asf/kafka.git <kafka.project.dir>
Update the libraries and generate ivy configuration files
- cd <kafka.project.dir>
- ./sbt deliver-local
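This should generate, for each module, an Ivy descriptor under its target directory; the Eclipse steps below reference these files. A quick way to check that they were produced (a sketch; the scala-2.8.0 directory name used later depends on the Scala version of the build):

    find . -path '*/target/*' -name 'ivy-*.xml'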
Create the Eclipse workspace
- Open Eclipse and create a new workspace
Eclipse will probably prompt you to upgrade the Scala plugin. You can ignore that warning.
- Create a new Scala Project (File -> New -> Scala Project). This will start a new project wizard.
- Type a name for the project and uncheck the "Use default location" option. Next, press the browse button and select the <kafka.project.dir> created above.
If the selected directory is right, you will see something like “The wizard will automatically configure the JRE and the project layout...”
- After pressing the "Next" button, on the "Source" tab, right click on the src folder and remove it from build path. In the same tab, change the default output folder from bin to target/classes.
- If in the "Libraries" tab you don't have Scala Library (honestly I don't know why Juno doesn't that for you) press the "Add Library" button and select it.
- Close the wizard by pressing the "Finish" button
Sources and dependencies
At this point your Eclipse project doesn't have any source folder configured. The <kafka.project.dir> contains several sub-projects: a core module and other related modules. Each of them has source folders and dependencies that you must add in order to compile and run tests. The core module is the most important because it is the broker itself; as a consequence, its section (below) is longer than the others.
core
- Assuming you have the Package or Project Explorer view visible (if not Window → Show View → Other and select that view), select the source directories core/src/main/scala and core/src/test/scala/unit and add them to build path (right click -> Build Path -> Use as source folder);
- In the project tree (Project or Package Explorer) browse to /<eclipse.project.name>/core/target/scala-2.8.0/ivy-0.8-SNAPSHOT.xml, right-click it and select “Add Ivy library”. Press Finish on the dialog that appears. This will add the project's managed dependencies;
- In the project tree (Project or Package Explorer) select all the jars under /<eclipse.project.name>/core/lib and add them to the build path (right click -> Build Path -> Add to Build Path). This will add the project's unmanaged dependencies.
Open the project properties again and select Java Build Path: make sure that, in the “Order and Export” tab, zkclient-20120522.jar appears in the list before the generated Ivy container entry.
Now you should have two source folders, several classpath containers and no compilation errors. If so, you can run the unit tests by right-clicking the core/src/test/scala/unit source folder and selecting Run As -> Scala JUnit Test.
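If you prefer the command line, the test suite can also be run through sbt, independently of the Eclipse setup (a sketch using the sbt launcher shipped with the checkout):

    cd <kafka.project.dir>
    ./sbt test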
You can also run a Kafka broker directly within Eclipse. To do that, you first need two configuration files (server.properties and log4j.properties).
You already have these configuration files in your Eclipse project, under the config directory. However, they are meant to be used as templates and are under version control, so if you are going to play with them, it is better to make a copy elsewhere.
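For example (the destination directory is only an illustration; any location outside the working copy will do):

    mkdir -p ~/kafka-local-config
    cp <kafka.project.dir>/config/server.properties ~/kafka-local-config/
    cp <kafka.project.dir>/config/log4j.properties ~/kafka-local-config/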
For running a broker (a concrete example follows this list):
- Open a shell and start a Zookeeper instance as described here.
- Create a new Run configuration by right clicking on the project, Run as -> Run configurations. The Run configurations dialog appears;
- Create a new Scala Application;
- On the "Main" tab the project name should be already populated with the name of your Eclipse project;
- On the same tab, insert into "Program Arguments" textarea the full absolute path to server.properties;
- On the same tab, insert into "VM Arguments" textarea the following system property: -Dlog4j.configuration=file://<full path of log4j.properties>
- Run the configuration.
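Putting those steps together, a minimal sketch of the setup is shown below. All paths are illustrative, the Zookeeper script and properties file are the ones shipped in the checkout, and kafka.Kafka is assumed as the main class (it is the broker entry point used by bin/kafka-server-start.sh):

    # in a shell, from <kafka.project.dir>: start a local Zookeeper
    bin/zookeeper-server-start.sh config/zookeeper.properties

    # Eclipse "Scala Application" run configuration (example values):
    #   Main class:        kafka.Kafka
    #   Program arguments: /home/me/kafka-local-config/server.properties
    #   VM arguments:      -Dlog4j.configuration=file:///home/me/kafka-local-config/log4j.properties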
You should see output like the following in the Eclipse console:

    ... INFO New leader is 0 (kafka.server.ZookeeperLeaderElector$LeaderChangeListener)
    INFO Registered broker 0 at path /brokers/ids/0 with address gxserver:9092. (kafka.utils.ZkUtils$)
    INFO [Kafka Server 0], started (kafka.server.KafkaServer)
contrib
The contrib modules depend on the kafka core module, and therefore the generated ivy files contain a dependency that doesn't exist in any Ivy repository (the kafka jar). So first of all you need to edit those files (ivy-0.8-SNAPSHOT.xml) under the target directory of each contrib module and remove the following line:
<dependency org="org.apache" name="kafka_2.8.0" rev="0.8-SNAPSHOT" conf="compile->default(compile)"/>
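If you prefer to do this from the command line, something like the following should work (a sketch: it assumes the dependency sits on a single line as shown above, the same scala-2.8.0 directory layout, and GNU sed's in-place -i flag):

    sed -i '/<dependency org="org.apache" name="kafka_2.8.0"/d' contrib/hadoop-consumer/target/scala-2.8.0/ivy-0.8-SNAPSHOT.xml
    sed -i '/<dependency org="org.apache" name="kafka_2.8.0"/d' contrib/hadoop-producer/target/scala-2.8.0/ivy-0.8-SNAPSHOT.xml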
- Assuming you have the Package or Project Explorer view visible (if not Window → Show View → Other and select that view), select the source directories contrib/hadoop-consumer/src/main/java and contrib/hadoop-producer/src/main/java and add them to build path (right click -> Build Path -> Use as source folder);
- In the project tree (Project or Package Explorer) browse to /<eclipse.project.name>/contrib/hadoop-consumer/target/scala-2.8.0/ivy-0.8-SNAPSHOT.xml, right-click it and select “Add Ivy library”. Press Finish on the dialog that appears. This will add the project's managed dependencies for both modules;
- In the project tree (Project or Package Explorer) select all the jars under /<eclipse.project.name>/contrib/hadoop-consumer/lib and add them to the build path (right click -> Build Path -> Add to Build Path). This will add the project's unmanaged dependencies for both modules.
examples and perf
If you want to set up those modules too, you can follow the same steps as for the contrib modules, since they have a similar structure. One important difference is that the examples and perf modules don't declare any unmanaged dependencies.
Eclipse setup (with SBT eclipse plugin)
The following steps assume you completed the first two steps of the setup described above (install and prepare Eclipse, and check out the Kafka source).
Generate Eclipse project files
- cd <kafka.project.dir>
- Edit the project/plugins.sbt file by adding the sbteclipse-plugin from Typesafe (last line in the snippet below). Once modified, the file should look like this:

    ...
    addSbtPlugin("com.github.mpeltonen" % "sbt-idea" % "1.2.0")
    addSbtPlugin("com.typesafe.sbteclipse" % "sbteclipse-plugin" % "2.1.1")
- ./sbt update
- Generate the Eclipse projects with ./sbt eclipse. This command will create Eclipse project files for every project defined in Kafka. You should see the following output on your console:

    [info] About to create Eclipse project files for your project(s).
    [info] Successfully created Eclipse project files for project(s):
    [info] kafka-perf
    [info] hadoop-consumer
    [info] kafka-java-examples
    [info] kafka
    [info] hadoop-producer
Create the Eclipse workspace
- Open Eclipse and create a new workspace;
- Import the generated projects (File -> Import -> General -> Existing Projects into Workspace);
- Navigate to the <kafka.project.dir>. Eclipse will find the projects generated by the previous command;
- Select the projects you want to import;
You should now see the projects you imported. For running unit tests and a Kafka broker, refer to the previous section.
You will need to regenerate the projects and refresh Eclipse every time the projects' dependencies change. In other words, every time you run ./sbt update, you need to run ./sbt eclipse and refresh Eclipse.
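The whole refresh cycle, then, is just the following (the Eclipse refresh step is done in the IDE, e.g. by selecting the projects and pressing F5 or right click -> Refresh):

    ./sbt update
    ./sbt eclipse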
IntelliJ Setup
Install and prepare IntelliJ
- Download and install IntelliJ;
- Install the IntelliJ IDEA Scala Plugin (Preferences -> Plugins -> Browse Repositories -> Search for Scala);
Checkout Kafka source
- git clone http://git-wip-us.apache.org/repos/asf/kafka.git <kafka.project.dir>
Update libraries and generate IntelliJ project files
- cd <kafka.project.dir>
- ./sbt update
- ./sbt idea
Create IntelliJ workspace
- Open IntelliJ and create a new project pointing to <kafka.project.dir>