h3. Eclipse setup (without SBT eclipse plugin)

h4. Install and prepare Eclipse

* Download and install Eclipse (Indigo or Juno is recommended);
* Install the Scala IDE plugin as described [here|http://scala-ide.org/download/current.html]. Make sure to get the right bundle / update site according to your Eclipse version;
* Install the IvyDE plugin as described [here|http://ant.apache.org/ivy/ivyde/download.cgi]. This will allow you to automatically create classpath containers from ivy files;

h4. Checkout Kafka source

* git clone [http://git-wip-us.apache.org/repos/asf/kafka.git|http://git-wip-us.apache.org/repos/asf/kafka.git] _<kafka.project.dir>_

h4. Update the libraries and generate ivy configuration files

* cd _<kafka.project.dir>_
* ./sbt deliver-local
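
For convenience, the checkout and library steps above can also be run as a single shell session (here the clone directory is called _kafka_, but any _<kafka.project.dir>_ works):
{code}
# clone the Kafka source into <kafka.project.dir>
git clone http://git-wip-us.apache.org/repos/asf/kafka.git kafka
cd kafka

# update the libraries and generate the ivy configuration files (used later by IvyDE)
./sbt deliver-local
{code}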

h4. Create the Eclipse workspace

* Open Eclipse and create a new workspace

{info}

Eclipse will probably prompt you to upgrade the Scala plugin. You can ignore that warning.
{info}
* Create a new Scala Project (File \-> New \-> Scala Project). This will start a new project wizard.
* Type a name for the project and uncheck the "Use default location" checkbox. Next, press the "Browse" button and select the _<kafka.project.dir>_ created above.

{info}
If the selected directory is correct, you will see a message like "_The wizard will automatically configure the JRE and the project layout..._"
{info}
* After pressing the "Next" button, on the "Source" tab, right click on the src folder and remove it from the build path. In the same tab, change the default output folder from _bin_ to _target/classes_.
* If the "Libraries" tab does not list the Scala Library (Juno does not always add it for you), press the "Add Library" button and select it.
* Close the wizard by pressing the "Finish" button.

h4. Sources and dependencies

At this point your Eclipse project doesn't have any source folder configured. The _<kafka.project.dir>_ contains several sub-projects: the core module and other related modules. Each of them has source folders and dependencies that you must add in order to compile and run tests. The core module is the most important because it is the broker itself; as a consequence, its section below is longer than the others.


h5. core

* Assuming you have the Package or Project Explorer view visible (if not, Window → Show View → Other and select that view), select the source directories _core/src/main/scala_ and _core/src/test/scala/unit_ and add them to the build path (right click \-> Build Path \-> Use as source folder);
* In the project tree (Project or Package explorer) browse and select _/<eclipse.project.name>/core/target/scala-2.8.0/ivy-0.8-SNAPSHOT.xml_, right click and select "Add Ivy library". Press Finish on the dialog that appears. This will add the project's managed dependencies;
* In the project tree (Project or Package explorer) select all the jars under _/<eclipse.project.name>/core/lib_ and add them to the build path (right click, Build path \-> Add to build path). This will add the project's unmanaged dependencies.

{note}
Open the project properties again and select Java Build Path: make sure that, in the "Order and Export" tab, _zkclient-20120522.jar_ is listed before the generated ivy container entry.
{note}

Now you should have two source folders, several classpath containers and no compilation errors. If so, you can run the unit tests by right clicking the _core/src/test/scala/unit_ source folder and selecting Run as \-> Scala JUnit Test.
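
If you want to double-check the build outside Eclipse, the same tests can also be run from the shell with sbt (the standard sbt {{test}} task, not something specific to this setup):
{code}
cd <kafka.project.dir>
./sbt test
{code}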

You can also run a Kafka instance directly within Eclipse. To do that, you first need two configuration files (_server.properties_ and _log4j.properties_).
{note}

You already have those configuration files in your Eclipse project, under the _config_ directory. However, they are meant to be used as templates and are under version control, so if you are going to play with them, it is better to make a copy elsewhere.


{note}

For running a broker:

* Open a shell and start a Zookeeper instance as described [here|http://kafka.apache.org/08/quickstart.html].
* Create a new Run configuration by right clicking on the project, Run as \-> Run configurations. The Run configurations dialog appears;
* Create a new Scala Application;
* On the "Main" tab the project name should be already populated with the name of your Eclipse project;
* On the same tab, insert the full absolute path to _server.properties_ into the "Program arguments" text area;
* On the same tab, insert the following system property into the "VM arguments" text area: _\-Dlog4j.configuration=\[file://\]<full path of log4j.properties>_
* Run the configuration.
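
As a concrete (hypothetical) example, with the configuration files copied to _/home/user/kafka-config_, the run configuration fields could look like this; the broker's entry point is {{kafka.Kafka}}, the same class started by _bin/kafka-server-start.sh_:
{code}
Main class:        kafka.Kafka
Program arguments: /home/user/kafka-config/server.properties
VM arguments:      -Dlog4j.configuration=file:///home/user/kafka-config/log4j.properties
{code}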

You should see output like the following in the Eclipse console:

{code}
...
INFO New leader is 0 (kafka.server.ZookeeperLeaderElector$LeaderChangeListener)
INFO Registered broker 0 at path /brokers/ids/0 with address gxserver:9092. (kafka.utils.ZkUtils$)
INFO [Kafka Server 0], started (kafka.server.KafkaServer)
{code}

h5. contrib

The contrib modules depend on the kafka core module, and therefore the generated ivy files contain a dependency that doesn't exist in the Ivy repository (the kafka jar). So, first of all, you need to edit those files (_ivy-0.8-SNAPSHOT.xml_) under the target directory of each contrib module and remove the following line:

{code}
<dependency org="org.apache" name="kafka_2.8.0" rev="0.8-SNAPSHOT" conf="compile->default(compile)"/>
{code}
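
One way to do this from the shell (a sketch assuming GNU sed, the hadoop-consumer ivy file location used in the steps below, and a parallel layout for hadoop-producer):
{code}
# drop the kafka dependency line from the generated ivy files of both contrib modules
sed -i '/name="kafka_2.8.0"/d' \
    contrib/hadoop-consumer/target/scala-2.8.0/ivy-0.8-SNAPSHOT.xml \
    contrib/hadoop-producer/target/scala-2.8.0/ivy-0.8-SNAPSHOT.xml
{code}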

* Assuming you have the Package or Project Explorer view visible (if not, Window → Show View → Other and select that view), select the source directories _contrib/hadoop-consumer/src/main/java_ and _contrib/hadoop-producer/src/main/java_ and add them to the build path (right click \-> Build Path \-> Use as source folder);
* In the project tree (Project or Package explorer) browse and select _/<eclipse.project.name>/contrib/hadoop-consumer/target/scala-2.8.0/ivy-0.8-SNAPSHOT.xml_, right click and select "Add Ivy library". Press Finish on the dialog that appears. This will add the project managed dependencies for both modules;
* In the project tree (Project or Package explorer) select all the jars under _/<eclipse.project.name>/contrib/hadoop-consumer/lib_ and add them to the build path (right click, Build path \-> Add to build path). This will add the project unmanaged dependencies for both modules.

h5. examples and perf

If you want to set up those modules too, you can follow the same steps as for the contrib module; they have a similar structure. One important difference is that the examples and perf modules don't declare unmanaged dependencies.
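
For reference (assuming these modules follow the same naming convention as core and contrib), the generated ivy files to add should be found at paths like:
{code}
examples/target/scala-2.8.0/ivy-0.8-SNAPSHOT.xml
perf/target/scala-2.8.0/ivy-0.8-SNAPSHOT.xml
{code}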

h3. Eclipse setup (with SBT plugin)

The following steps assume you have checked out the Kafka source and installed the Eclipse Scala IDE, as described in the Eclipse setup section above.

# Edit the _project/plugins.sbt_ file by adding the {{sbteclipse-plugin}} from Typesafe (last line in the snippet below). Once modified, the file should look like this:
{code}
resolvers += Resolver.url("artifactory", url("http://scalasbt.artifactoryonline.com/scalasbt/sbt-plugin-releases"))(Resolver.ivyStylePatterns)

addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.8.5")

addSbtPlugin("com.github.mpeltonen" % "sbt-idea" % "1.2.0")

addSbtPlugin("com.typesafe.sbteclipse" % "sbteclipse-plugin" % "2.1.1")

...


{code}
# run: {{./sbt update}}
# generate the Eclipse projects with: {{./sbt eclipse}}. This command will create an Eclipse project for every project defined in Kafka. You should see the following output on your console:

{code}
[info] About to create Eclipse project files for your project(s).
[info] Successfully created Eclipse project files for project(s):
[info] kafka-perf
[info] hadoop-consumer
[info] kafka-java-examples
[info] kafka
[info] hadoop-producer

...
{code}
# In Eclipse, use Import \-> General \-> Existing Projects into Workspace
## Navigate to the kafka source directory; it should find the projects generated by the previous command.
## Select the projects you want to import and click Finish.
## You should see the projects you have imported.

Once you have your projects in place, you will be able to run/debug any of the applications from Eclipse.


{info}
You will need to regenerate the projects and refresh Eclipse every time there is a change in the project dependencies. In other words, every time you run {{./sbt update}}, you need to run {{./sbt eclipse}} and refresh Eclipse.
{info}
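
For example, after a dependency change:
{code}
./sbt update     # re-resolve the dependencies
./sbt eclipse    # regenerate the Eclipse project files
{code}
Then refresh the imported projects in Eclipse (right click \-> Refresh, or F5).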



h3. IntelliJ Setup


# Checkout the kafka source:
## git clone [http://git-wip-us.apache.org/repos/asf/kafka.git] kafka
# Update the libraries:
## ./sbt update
# Create the IDEA project files:
## ./sbt
## idea
# Install the IntelliJ IDEA Scala Plugin (Preferences \-> Plugins \-> Browse Repositories \-> search for Scala)
# Open IntelliJ, open a new project, and point it to your kafka source location.
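
The command-line part of these steps as a single shell session (using _kafka_ as the clone directory, and the {{idea}} task provided by the sbt-idea plugin as described in the steps above):
{code}
git clone http://git-wip-us.apache.org/repos/asf/kafka.git kafka
cd kafka

./sbt update    # resolve the dependencies
./sbt           # open the sbt prompt, then run the idea task:
> idea
{code}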
