Apache Kafka allows clients to authenticate using SASL/Kerberos.

1. Prerequisites

 1.1 Install Kerberos

 

For example, on CentOS 6.4 you can use the following commands to install and start Kerberos:

sudo unzip -o -j -q /var/lib/ambari-server/resources/jce_policy-8.zip  -d /usr/jdk64/jdk1.8.0_40/jre/lib/security/
sudo yum install krb5-server krb5-libs krb5-auth-dialog -y
yum install rng-tools -y
sudo cp /vagrant/rngd /etc/sysconfig/rngd
sudo /etc/init.d/rngd start
sudo kdb5_util create -s
sudo /sbin/service krb5kdc start
sudo /sbin/service kadmin start

Note: On Oracle Java, make sure you download the JCE unlimited strength policy files and install them as shown in the first command.
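The kdb5_util and KDC service commands above assume the realm is already defined in /etc/krb5.conf and the KDC configuration. A minimal sketch of /etc/krb5.conf (the realm DOMAIN.COM and the KDC host kdc.hostname.com are placeholders in the same style as the examples below) could look like this:

[libdefaults]
  default_realm = DOMAIN.COM

[realms]
  DOMAIN.COM = {
    kdc = kdc.hostname.com
    admin_server = kdc.hostname.com
  }

[domain_realm]
  .hostname.com = DOMAIN.COM
  hostname.com = DOMAIN.COM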


 1.2 Create Kerberos Principals

Always make sure the keytabs are readable only by the service user; nobody else should be able to read or access them.


sudo /usr/sbin/kadmin.local -q 'addprinc -randkey kafka/hostname@domainname'
sudo /usr/sbin/kadmin.local -q "ktadd -k /etc/security/keytabs/kafka.keytab kafka/hostname@domainname"

Note: Make sure you replace hostname with the FQDN of the machine where the Kafka broker is running, and domainname with your Kerberos realm. A keytab with its own principal should be created for each host; lock down the keytab file permissions as shown below.
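For example, assuming the broker runs as a local kafka user (the user name and path are placeholders), the keytab can be restricted like this:

sudo chown kafka:kafka /etc/security/keytabs/kafka.keytab
sudo chmod 400 /etc/security/keytabs/kafka.keytab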


 1.3 Make sure all hosts are reachable using hostnames

It is important for Kerberos that all your hosts can be resolved with their FQDNs.
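A quick way to check this on each host (the hostname below is a placeholder) is:

hostname -f
getent hosts kafka1.hostname.com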


 1.4 Creating JAAS config file

Each node in the cluster should have the following JAAS file. Add this file to the kafka/config directory.

KafkaServer {
   com.sun.security.auth.module.Krb5LoginModule required
   useKeyTab=true
   storeKey=true
   serviceName="kafka"
   keyTab="/etc/security/keytabs/kafka1.keytab"
   principal="kafka/kafka1.hostname.com@DOMAIN.COM";
};

Client {
   com.sun.security.auth.module.Krb5LoginModule required
   useKeyTab=true
   storeKey=true
   serviceName="zookeeper"
   keyTab="/etc/security/keytabs/kafka1.keytab"
   principal="kafka/kafka1.hostname.com@DOMAIN.COM";
};

KafkaClient {
   com.sun.security.auth.module.Krb5LoginModule required
   useTicketCache=true
   serviceName="kafka";
};
  1. KafkaServer is the section name in the JAAS file used by the Kafka broker. It tells the broker which principal to use and which keytab that principal is stored in, and allows the broker to log in using that keytab. You can verify that the keytab and principal work as shown after this list.
  2. The Client section is used to authenticate a SASL connection with ZooKeeper. It also allows the broker to set SASL ACLs on ZooKeeper nodes, which locks those nodes down so that only the broker can modify them. It is important to have the same principal name across all brokers.
  3. The KafkaClient section describes how clients such as the producer and consumer connect to the Kafka broker. Here we specified useTicketCache=true rather than a keytab; this allows a user to run kinit and then use kafka-console-consumer or kafka-console-producer to connect to the broker. For a long-running process one should create a KafkaClient section similar to KafkaServer (see section 1.5).
  4. In the KafkaServer and KafkaClient sections we have serviceName; this must match the principal name with which the Kafka broker is running. In the above example principal="kafka/kafka1.hostname.com@DOMAIN.COM", so serviceName="kafka" matches the primary part of the principal.
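To verify that the broker keytab and principal work (the path and principal below are the placeholders from the example above), you can run:

kinit -kt /etc/security/keytabs/kafka1.keytab kafka/kafka1.hostname.com@DOMAIN.COM
klist
kdestroy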

 1.5 Creating Client-side JAAS config

For a long-running client, create a keytab with its own principal name.

For example:

sudo /usr/sbin/kadmin.local -q 'addprinc -randkey kafkaproducer/hostname@domainname'
sudo /usr/sbin/kadmin.local -q "ktadd -k /etc/security/keytabs/kafkaproducer.keytab kafkaproducer/hostname@domainname

Create the following JAAS file:

KafkaClient {
   com.sun.security.auth.module.Krb5LoginModule required
   useKeyTab=true
   storeKey=true
   serviceName="kafka"
   keyTab="/etc/security/keytabs/kafkaproducer.keytab"
   principal="kafkaproducer/hostname@DOMAIN.COM";
};

 

2. Configuring Broker

  1. Pass the JAAS file from section 1.4 as a JVM parameter to the Kafka broker: -Djava.security.auth.login.config=/etc/kafka/kafka_jaas.conf (see the example after this list).
  2. Make sure the keytabs configured in kafka_jaas.conf are readable by the Linux user who is starting the Kafka broker.
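For example, one way to pass this JVM parameter when using the standard startup scripts (which read extra JVM options from the KAFKA_OPTS environment variable; the path below is a placeholder) is:

export KAFKA_OPTS="-Djava.security.auth.login.config=/etc/kafka/kafka_jaas.conf"
bin/kafka-server-start.sh config/server.properties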

We also need to configure the following property in server.properties; it can take one or more comma-separated values:

listeners=SASL_PLAINTEXT://host.name:port
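For example, to expose both a PLAINTEXT and a SASL_PLAINTEXT listener (host name and ports are placeholders), the value could look like this:

listeners=PLAINTEXT://host.name:9092,SASL_PLAINTEXT://host.name:9093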

 

 

If you are configuring only a SASL port, make sure you set the same SASL protocol for inter-broker communication:

security.inter.broker.protocol=SASL_PLAINTEXT

 

3. Configuring Kafka Producer & Kafka Consumer

SASL authentication is only supported for the new Kafka producer and consumer; the older APIs are not supported. On the client side:

  1. Pass the JAAS file from section 1.5 as a JVM parameter to the client JVM: -Djava.security.auth.login.config=/etc/kafka/kafka_client_jaas.conf (see the example at the end of this section).
  2. Make sure the keytabs configured in kafka_client_jaas.conf are readable by the Linux user who is starting the Kafka client.

Configure the following property in producer.properties or consumer.properties:

 

security.protocol=SASL_PLAINTEXT
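As a usage sketch (broker host, port, topic, and file paths below are placeholders, and command-line options may vary slightly between Kafka versions), a console producer could then be started like this:

export KAFKA_OPTS="-Djava.security.auth.login.config=/etc/kafka/kafka_client_jaas.conf"
bin/kafka-console-producer.sh --broker-list host.name:port --topic test --producer.config config/producer.properties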

 

 



 

