http://henning.kropponline.de/2016/02/21/secure-kafka-java-producer-with-kerberos/

The most recent release of Kafka, 0.9, has reached an important milestone with its comprehensive security implementation. In his blog post, Ismael from Confluent describes the security features that are part of the release very well.

As part II of the previously published post, this article discusses a sample implementation of a Java Kafka producer with authentication; it is part of a mini series of related posts. At the end of this post we will also create a Kafka servlet that publishes messages to a secured broker.

Kafka provides SSL and Kerberos authentication. Only Kerberos is discussed here.

Kafka from now on supports four different communication protocols between consumers, producers, and brokers. Each protocol addresses different security aspects, while PLAINTEXT is the old, insecure communication protocol.

  • PLAINTEXT (non-authenticated, non-encrypted)
  • SSL (SSL authentication, encrypted)
  • PLAINTEXTSASL (SASL/Kerberos authentication, non-encrypted)
  • SASL_SSL (SASL/Kerberos authentication, encrypted transport)

A Kafka client needs to be configured to use the protocol of the corresponding broker. This tells the client to use authentication for communication with the broker:

Properties props = new Properties();
props.put("security.protocol", "PLAINTEXTSASL");
Kerberos authentication in Java is provided through JAAS (the Java Authentication and Authorization Service), a pluggable authentication framework similar to PAM that supports multiple authentication methods. In this case the authentication method being used is GSS-API/Kerberos.
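The JAAS configuration file is typically passed to the JVM with the java.security.auth.login.config system property, as the command lines later in this post show. For completeness, the same properties can also be set programmatically before the first producer is created. A minimal sketch, with example paths, not part of the original post:

package hdp.sample;

// Sets the JAAS and Kerberos configuration programmatically; this is
// equivalent to passing the corresponding -D flags on the command line.
public class JaasSetup {

    public static void configure() {
        // Path to the JAAS file used throughout this post (example path).
        System.setProperty("java.security.auth.login.config", "/home/kafka-user/kafka-jaas.conf");
        // Kerberos client configuration with realm and KDC settings.
        System.setProperty("java.security.krb5.conf", "/etc/krb5.conf");
        // Let the Kafka client perform the JAAS login itself instead of
        // requiring credentials from the calling subject.
        System.setProperty("javax.security.auth.useSubjectCredsOnly", "false");
    }
}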

Demo Setup

For JAAS a proper GSS configuration is needed, in addition to being in possession of proper credentials, obviously. Such credentials can be created with MIT Kerberos like this:

(as root)
$ kadmin.local -q "addprinc -pw hadoop kafka-user"
$ kadmin.local -q "xst -k /home/kafka-user/kafka-user.keytab kafka-user@MYCORP.NET"
(creating a keytab will invalidate the existing password. To change the password back to hadoop, run as root:)
$ kadmin.local -q "cpw -pw hadoop kafka-user"

Exporting the keytab is not strictly required at this point. A keytab is basically the encrypted password of the user stored in a file, which allows password-less authentication, for example for automated services. We will make use of that here as well.

First we need to prepare a test topic to publish messages to, with proper privileges for our kafka-user:

# become kafka admin
$ kinit -kt /etc/security/keytabs/kafka.service.keytab kafka/one.hdp@MYCORP.NET
# set privileges for kafka-user
$ /usr/hdp/current/kafka-broker/bin/kafka-acls.sh --add --allow-principals User:kafka-user \
--operation All --topic test \
--authorizer-properties zookeeper.connect=one.hdp:2181
Adding following acls for resource: Topic:test
User:kafka-user has Allow permission for operations: All from hosts: *

Following is list of acls for resource: Topic:test
User:kafka-user has Allow permission for operations: All from hosts: *

As a sample producer we will use this:

package hdp.sample;

import java.util.Date;
import java.util.Properties;

import kafka.javaapi.producer.Producer;
import kafka.producer.KeyedMessage;
import kafka.producer.ProducerConfig;

public class KafkaProducer {

    public static void main(String... args) {
        String topic = args[1];

        Properties props = new Properties();
        props.put("metadata.broker.list", args[0]);
        props.put("serializer.class", "kafka.serializer.StringEncoder");
        props.put("request.required.acks", "1");
        props.put("security.protocol", "PLAINTEXTSASL");

        ProducerConfig config = new ProducerConfig(props);
        Producer<String, String> producer = new Producer<String, String>(config);

        // send ten sample messages to the given topic
        for (int i = 0; i < 10; i++) {
            producer.send(new KeyedMessage<String, String>(topic, "test date: " + new Date()));
        }
    }
}
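The sample above uses the old Scala producer API shipped with the broker libraries, matching the original post. As a side note, a roughly equivalent sketch using the new Java producer API introduced with 0.9 could look like the following; upstream Apache Kafka names the SASL protocol SASL_PLAINTEXT, whereas this post's HDP setup uses PLAINTEXTSASL, so adjust accordingly (this class is an illustration, not part of the original post):

package hdp.sample;

import java.util.Date;
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerRecord;

// Same behaviour as the sample above, expressed with the new 0.9 producer API.
public class NewApiSampleProducer {

    public static void main(String... args) {
        String brokers = args[0];
        String topic = args[1];

        Properties props = new Properties();
        props.put("bootstrap.servers", brokers);
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("acks", "1");
        // Upstream protocol name; the HDP setup in this post uses PLAINTEXTSASL instead.
        props.put("security.protocol", "SASL_PLAINTEXT");

        Producer<String, String> producer = new KafkaProducer<String, String>(props);
        for (int i = 0; i < 10; i++) {
            producer.send(new ProducerRecord<String, String>(topic, "test date: " + new Date()));
        }
        producer.close();
    }
}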

With this setup we can go ahead and demonstrate two ways to use a JAAS context to authenticate with the Kafka broker. First we will configure a context that uses the existing Kerberos ticket of the executing user. Next we use a keytab to demonstrate a password-less login for automated producer processes. Finally we will look at a servlet implementation of the producer.

Authentication with User Login

We configure a JAAS config with useKeyTab set to false and useTicketCache set to true, so that the Kerberos ticket of the currently logged-in user is being used.

jaas-cache.conf

KafkaClient {
com.sun.security.auth.module.Krb5LoginModule required
useKeyTab=false
useTicketCache=true
serviceName="kafka";
};

We store this in a file under /home/kafka-user/kafka-jaas.conf and execute the producer like this:

# list current user context
$ klist
Ticket cache: FILE:/tmp/krb5cc_0
Default principal: kafka-user@MYCORP.NET

Valid starting       Expires              Service principal
21.02.2016 16:13:13  22.02.2016 16:13:13  krbtgt/MYCORP.NET@MYCORP.NET

# execute java producer
$ java -Djava.security.auth.login.config=/home/kafka-user/kafka-jaas.conf \
-Djava.security.krb5.conf=/etc/krb5.conf \
-Djavax.security.auth.useSubjectCredsOnly=false \
-cp hdp-kafka-sample-1.0-SNAPSHOT.jar:/usr/hdp/current/kafka-broker/libs/* \
hdp.sample.KafkaProducer one.hdp:6667 test

# consume sample messages for test
$ /usr/hdp/current/kafka-broker/bin/kafka-simple-consumer-shell.sh \
--broker-list one.hdp:6667 \
--topic test \
--security-protocol PLAINTEXTSASL \
--partition 0
{metadata.broker.list=one.hdp:6667, request.timeout.ms=1000, client.id=SimpleConsumerShell, security.protocol=PLAINTEXTSASL}
test date: Sun Feb 21 16:12:05 UTC 2016
test date: Sun Feb 21 16:12:06 UTC 2016
test date: Sun Feb 21 16:12:06 UTC 2016
test date: Sun Feb 21 16:12:06 UTC 2016
test date: Sun Feb 21 16:12:06 UTC 2016
test date: Sun Feb 21 16:12:06 UTC 2016
test date: Sun Feb 21 16:12:06 UTC 2016
test date: Sun Feb 21 16:12:06 UTC 2016
test date: Sun Feb 21 16:12:06 UTC 2016
test date: Sun Feb 21 16:12:06 UTC 2016

Using a Keytab to Login

Next we will configure the JAAS context to use a generated keytab file instead of the ticket cache of the executing user. Before we can do this we need to create the keytab, storing it under /home/kafka-user/kafka-user.keytab.

$ kadmin.local -q "xst -k /home/kafka-user/kafka-user.keytab kafka-user@MYCORP.NET"
Authenticating as principal kafka-user/admin@MYCORP.NET with password.
Entry for principal kafka-user@MYCORP.NET with kvno 2, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/home/kafka-user/kafka-user.keytab.
Entry for principal kafka-user@MYCORP.NET with kvno 2, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/home/kafka-user/kafka-user.keytab.
Entry for principal kafka-user@MYCORP.NET with kvno 2, encryption type des3-cbc-sha1 added to keytab WRFILE:/home/kafka-user/kafka-user.keytab.
Entry for principal kafka-user@MYCORP.NET with kvno 2, encryption type arcfour-hmac added to keytab WRFILE:/home/kafka-user/kafka-user.keytab.
Entry for principal kafka-user@MYCORP.NET with kvno 2, encryption type des-hmac-sha1 added to keytab WRFILE:/home/kafka-user/kafka-user.keytab.
Entry for principal kafka-user@MYCORP.NET with kvno 2, encryption type des-cbc-md5 added to keytab WRFILE:/home/kafka-user/kafka-user.keytab.

$ chown kafka-user. /home/kafka-user/kafka-user.keytab

The JAAS configuration can now be changed to look like this:

KafkaClient {
com.sun.security.auth.module.Krb5LoginModule required
doNotPrompt=true
useTicketCache=true
principal="kafka-user@MYCORP.NET"
useKeyTab=true
serviceName="kafka"
keyTab="/home/kafka-user/kafka-user.keytab"
client=true;
};

This will use the keytab stored under /home/kafka-user/kafka-user.keytab, while the user executing the producer does not need to hold a valid Kerberos ticket:

$ klist
klist: Credentials cache file '/tmp/krb5cc_0' not found

$ java -Djava.security.auth.login.config=/home/kafka-user/kafka-jaas.conf \
-Djava.security.krb5.conf=/etc/krb5.conf \
-Djavax.security.auth.useSubjectCredsOnly=true \
-cp hdp-kafka-sample-1.0-SNAPSHOT.jar:/usr/hdp/current/kafka-broker/libs/* \
hdp.sample.KafkaProducer one.hdp:6667 test
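If the keytab-based login itself is in doubt, it can be verified independently of Kafka with a small standalone program that performs the same JAAS login the producer would. This is only a sketch under the configuration shown above; class name and path are examples, not from the original post:

package hdp.sample;

import javax.security.auth.Subject;
import javax.security.auth.login.LoginContext;
import javax.security.auth.login.LoginException;

// Performs the Kerberos login defined by the "KafkaClient" JAAS entry
// and prints the resulting principal, without touching Kafka at all.
public class JaasLoginCheck {

    public static void main(String... args) throws LoginException {
        // Same property the producer is started with (example path).
        System.setProperty("java.security.auth.login.config", "/home/kafka-user/kafka-jaas.conf");

        LoginContext login = new LoginContext("KafkaClient");
        login.login(); // obtains a ticket from the keytab, no password prompt

        Subject subject = login.getSubject();
        System.out.println("Logged in principals: " + subject.getPrincipals());
    }
}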

Kafka Producer Servlet

In a last example we will add a Kafka servlet to the web project described previously. Our servlet will get the topic and message as GET parameters. The servlet looks as follows:

package hdp.webapp;

import java.io.IOException;
import java.io.PrintWriter;
import java.util.Properties;

import javax.servlet.Servlet;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

import kafka.javaapi.producer.Producer;
import kafka.producer.KeyedMessage;
import kafka.producer.ProducerConfig;


public class KafkaServlet extends HttpServlet implements Servlet {

    protected void doGet(HttpServletRequest request, HttpServletResponse response) throws ServletException, IOException {
        String topic = request.getParameter("topic");
        String msg = request.getParameter("msg");

        Properties props = new Properties();
        props.put("metadata.broker.list", "one.hdp:6667");
        props.put("serializer.class", "kafka.serializer.StringEncoder");
        props.put("request.required.acks", "1");
        props.put("security.protocol", "PLAINTEXTSASL");

        ProducerConfig config = new ProducerConfig(props);
        Producer<String, String> producer = new Producer<String, String>(config);

        producer.send(new KeyedMessage<String, String>(topic, msg));

        // return a small confirmation page
        PrintWriter out = response.getWriter();
        out.println("<html><body>");
        out.println("Write to topic: " + topic);
        out.println("</body></html>");
        out.close();
    }

}
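One remark on the servlet above: it creates a new Producer, and with it a new JAAS login, on every request. For anything beyond a demo it is usually preferable to create the producer once in init() and reuse it. The following variant is only an illustrative sketch of that idea, not part of the original post:

package hdp.webapp;

import java.io.IOException;
import java.io.PrintWriter;
import java.util.Properties;

import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

import kafka.javaapi.producer.Producer;
import kafka.producer.KeyedMessage;
import kafka.producer.ProducerConfig;

// Variant of KafkaServlet that creates the producer once at startup and
// reuses it, so the Kerberos login is not repeated for every request.
public class ReusableKafkaServlet extends HttpServlet {

    private Producer<String, String> producer;

    @Override
    public void init() throws ServletException {
        Properties props = new Properties();
        props.put("metadata.broker.list", "one.hdp:6667");
        props.put("serializer.class", "kafka.serializer.StringEncoder");
        props.put("request.required.acks", "1");
        props.put("security.protocol", "PLAINTEXTSASL");
        producer = new Producer<String, String>(new ProducerConfig(props));
    }

    @Override
    protected void doGet(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException {
        String topic = request.getParameter("topic");
        producer.send(new KeyedMessage<String, String>(topic, request.getParameter("msg")));

        PrintWriter out = response.getWriter();
        out.println("Wrote to topic: " + topic);
        out.close();
    }

    @Override
    public void destroy() {
        producer.close(); // release broker connections on undeploy
    }
}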

Again we are changing the JAAS config of the Tomcat service to make use of the previously generated keytab; Tomcat is pointed at this file via the java.security.auth.login.config system property in its JVM options (for example in setenv.sh). The jaas.conf of Tomcat will now contain this:

KafkaClient {
com.sun.security.auth.module.Krb5LoginModule required
doNotPrompt=true
useTicketCache=true
principal="kafka-user@MYCORP.NET"
useKeyTab=true
serviceName="kafka"
keyTab="/home/kafka-user/kafka-user.keytab"
client=true;
};
com.sun.security.jgss.krb5.initiate {
    com.sun.security.auth.module.Krb5LoginModule required
    doNotPrompt=true
    principal="tomcat/one.hdp@MYCORP.NET"
    useKeyTab=true
    keyTab="/etc/tomcat/tomcat.keytab"
    storeKey=true;
};

After deploying the web app and restarting Tomcat with this newly adapted JAAS config, you should be able to publish messages to the secured broker by requesting the following GET URL from a browser: http://one.hdp:8099/hdp-web/kafka?topic=test&msg=test1 . The response should be a 200 OK.
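If you prefer to trigger the servlet without a browser, a quick smoke test from plain Java could look like this (a sketch using the URL from above; not part of the original post):

package hdp.webapp;

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

// Calls the servlet once and prints the HTTP status plus the response body.
public class KafkaServletSmokeTest {

    public static void main(String... args) throws Exception {
        URL url = new URL("http://one.hdp:8099/hdp-web/kafka?topic=test&msg=test1");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("GET");

        System.out.println("HTTP " + conn.getResponseCode()); // expect 200

        try (BufferedReader in = new BufferedReader(new InputStreamReader(conn.getInputStream()))) {
            String line;
            while ((line = in.readLine()) != null) {
                System.out.println(line);
            }
        }
        conn.disconnect();
    }
}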

You might run into issues, in particular the following exception:

SEVERE: Servlet.service() for servlet [KafkaServlet] in context with path [/hdp-web] threw exception [Servlet execution threw an exception] with root cause
javax.security.auth.login.LoginException: Unable to obtain password from user

at com.sun.security.auth.module.Krb5LoginModule.promptForPass(Krb5LoginModule.java:897)
at com.sun.security.auth.module.Krb5LoginModule.attemptAuthentication(Krb5LoginModule.java:760)
at com.sun.security.auth.module.Krb5LoginModule.login(Krb5LoginModule.java:617)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at javax.security.auth.login.LoginContext.invoke(LoginContext.java:755)
at javax.security.auth.login.LoginContext.access$000(LoginContext.java:195)
at javax.security.auth.login.LoginContext$4.run(LoginContext.java:682)
at javax.security.auth.login.LoginContext$4.run(LoginContext.java:680)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.login.LoginContext.invokePriv(LoginContext.java:680)
at javax.security.auth.login.LoginContext.login(LoginContext.java:587)
at org.apache.kafka.common.security.kerberos.Login.login(Login.java:298)
at org.apache.kafka.common.security.kerberos.Login.<init>(Login.java:104)
at kafka.common.security.LoginManager$.init(LoginManager.scala:36)
at kafka.producer.Producer.<init>(Producer.scala:50)
at kafka.producer.Producer.<init>(Producer.scala:73)
at kafka.javaapi.producer.Producer.<init>(Producer.scala:26)
at hdp.webapp.KafkaServlet.doGet(KafkaServlet.java:33)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:620)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:220)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:122)
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:501)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:171)
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:102)
at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:950)
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:408)
at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1040)
at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:607)
at org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:314)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
at java.lang.Thread.run(Thread.java:745)

If you are seeing the message javax.security.auth.login.LoginException: Unable to obtain password from user, it most likely refers to your keytab file, which acts as the user's password. So make sure that the Tomcat user is able to read the file stored under /home/kafka-user/kafka-user.keytab, for example.

Further Readings
