Kafka dynamic authentication and authorization (SASL/SCRAM + ACL)
Create three test users
bin/kafka-configs.sh --zookeeper 192.168.x.x:2181 --alter --add-config 'SCRAM-SHA-256=[iterations=8192,password=admin],SCRAM-SHA-512=[password=admin]' --entity-type users --entity-name admin
PS: the admin user is used here for inter-broker communication.
Test user writer
bin/kafka-configs.sh --zookeeper 192.168.x.x:2181 --alter --add-config 'SCRAM-SHA-256=[iterations=8192,password=writer],SCRAM-SHA-512=[password=writer]' --entity-type users --entity-name writer
Test user reader
bin/kafka-configs.sh --zookeeper 192.168.x.x:2181 --alter --add-config 'SCRAM-SHA-256=[iterations=8192,password=reader],SCRAM-SHA-512=[password=reader]' --entity-type users --entity-name reader
View the created users
bin/kafka-configs.sh --zookeeper 192.168.2.6:2181 --describe --entity-type users (a single user can be queried by adding --entity-name writer)
Create the JAAS configuration file kafka-broker-jaas.conf
Save it under /opt/kafka/config on every broker host
KafkaServer {
org.apache.kafka.common.security.scram.ScramLoginModule required
username="admin"
password="admin";
};
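As an alternative to the KAFKA_OPTS export used further below, newer broker versions (an assumption; roughly 0.10.2 and later) allow the same login module to be supplied inline in server.properties via the listener.name prefix, in which case no separate JAAS file is needed:

```properties
# Per-listener JAAS config for the SASL_PLAINTEXT listener with SCRAM-SHA-512
listener.name.sasl_plaintext.scram-sha-512.sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required \
    username="admin" \
    password="admin";
```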
Configure server.properties on each broker
# Enable the ACL authorizer
authorizer.class.name=kafka.security.auth.SimpleAclAuthorizer
# In this example, admin is designated as a super user
super.users=User:admin
sasl.enabled.mechanisms=SCRAM-SHA-512
# Enable SCRAM for inter-broker communication, using the SCRAM-SHA-512 mechanism
sasl.mechanism.inter.broker.protocol=SCRAM-SHA-512
# Inter-broker communication uses SASL_PLAINTEXT
security.inter.broker.protocol=SASL_PLAINTEXT
# Configure listeners to use SASL_PLAINTEXT (use each broker's own hostname)
listeners=SASL_PLAINTEXT://n6.aa-data.cn:9092
# Configure advertised.listeners accordingly (again, each broker's own hostname)
advertised.listeners=SASL_PLAINTEXT://n6.aa-data.cn:9092
Configure the environment variable so the JAAS file is loaded (on every broker), then restart the brokers
export KAFKA_OPTS='-Djava.security.auth.login.config=/opt/kafka/config/kafka-broker-jaas.conf'
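The JAAS file only takes effect once each broker is restarted with KAFKA_OPTS set in its environment; a sketch, assuming the standard distribution scripts and config path:

```shell
export KAFKA_OPTS='-Djava.security.auth.login.config=/opt/kafka/config/kafka-broker-jaas.conf'
bin/kafka-server-stop.sh
bin/kafka-server-start.sh -daemon config/server.properties
```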
Producing
Create producer.conf
security.protocol=SASL_PLAINTEXT
sasl.mechanism=SCRAM-SHA-512
sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required username="writer" password="writer";
Grant writer write permission on the topic
bin/kafka-acls.sh --authorizer kafka.security.auth.SimpleAclAuthorizer --authorizer-properties zookeeper.connect=192.168.2.6:2181 --add --allow-principal User:writer --operation Write --topic testcon
Produce messages
bin/kafka-console-producer.sh --broker-list n6.aa-data.cn:9092 --topic testcon --producer.config /opt/kafka/config/producer.conf
Consuming
Create consumer.conf
security.protocol=SASL_PLAINTEXT
sasl.mechanism=SCRAM-SHA-512
sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required username="reader" password="reader";
Grant reader read permission on the topic
bin/kafka-acls.sh --authorizer kafka.security.auth.SimpleAclAuthorizer --authorizer-properties zookeeper.connect=192.168.2.6:2181 --add --allow-principal User:reader --operation Read --topic testcon
Grant reader access to the consumer group
bin/kafka-acls.sh --authorizer kafka.security.auth.SimpleAclAuthorizer --authorizer-properties zookeeper.connect=n6.aa-data.cn:2181 --add --allow-principal User:reader --operation Read --group test
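Note that the Read ACL above is scoped to consumer group test, so the consumer must join exactly that group; otherwise the console consumer picks a generated group id and is denied. One way to pin it (an assumption, not shown in the original) is in consumer.conf:

```properties
# Must match the group the Read ACL was granted on
group.id=test
```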
Consume messages from testcon (the consumer must join the group the ACL was granted on, here test)
bin/kafka-console-consumer.sh --bootstrap-server 192.168.2.6:9092 --topic testcon --group test --from-beginning --consumer.config /opt/kafka/config/consumer.conf
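The ACLs in effect can be inspected at any time with kafka-acls.sh; a sketch reusing the same ZooKeeper address as above:

```shell
bin/kafka-acls.sh --authorizer kafka.security.auth.SimpleAclAuthorizer --authorizer-properties zookeeper.connect=192.168.2.6:2181 --list --topic testcon
```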
Dynamically adding users
Test user writer1
bin/kafka-configs.sh --zookeeper 192.168.2.6:2181 --alter --add-config 'SCRAM-SHA-256=[iterations=8192,password=writer1],SCRAM-SHA-512=[password=writer1]' --entity-type users --entity-name writer1
Test user reader1
bin/kafka-configs.sh --zookeeper 192.168.2.6:2181 --alter --add-config 'SCRAM-SHA-256=[iterations=8192,password=reader1],SCRAM-SHA-512=[password=reader1]' --entity-type users --entity-name reader1
Grant permissions
bin/kafka-acls.sh --authorizer kafka.security.auth.SimpleAclAuthorizer --authorizer-properties zookeeper.connect=192.168.2.6:2181 --add --allow-principal User:writer1 --operation Write --topic testcon
bin/kafka-acls.sh --authorizer kafka.security.auth.SimpleAclAuthorizer --authorizer-properties zookeeper.connect=192.168.2.6:2181 --add --allow-principal User:reader1 --operation Read --topic testcon
bin/kafka-acls.sh --authorizer kafka.security.auth.SimpleAclAuthorizer --authorizer-properties zookeeper.connect=n6.aa-data.cn:2181 --add --allow-principal User:reader1 --operation Read --group test1
Delete the original users (repeat for the reader user as needed)
bin/kafka-configs.sh --zookeeper 192.168.2.6:2181 --alter --delete-config 'SCRAM-SHA-256' --entity-type users --entity-name writer
bin/kafka-configs.sh --zookeeper 192.168.2.6:2181 --alter --delete-config 'SCRAM-SHA-512' --entity-type users --entity-name writer
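Deleting a user's SCRAM credentials does not remove its ACLs; those would be cleaned up separately with --remove. A sketch mirroring the grant command above (--force skips the confirmation prompt):

```shell
bin/kafka-acls.sh --authorizer kafka.security.auth.SimpleAclAuthorizer --authorizer-properties zookeeper.connect=192.168.2.6:2181 --remove --allow-principal User:writer --operation Write --topic testcon --force
```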
PS: after a user is deleted, producer.conf and consumer.conf must be updated to use the new credentials (writer1/reader1 here).