Kafka and Kafka Client communication protected by SASL
Kafka SASL configurations
Prerequisites
Kafka SASL requires the Ambari Cluster to be Kerberized.
Enable SASL_PLAINTEXT
- Add the following listener to the Kafka listeners list: SASL_PLAINTEXT://localhost:6669
- Set security.inter.broker.protocol=SASL_PLAINTEXT. The default value after the Kerberos wizard is PLAINTEXTSASL; it should be changed to SASL_PLAINTEXT.
Open question: should this be changed to PLAINTEXT for performance?
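Taken together, the listener and protocol changes above correspond to a server.properties fragment like the following (host and port values are examples from this document):

```properties
# keep the existing PLAINTEXT listener alongside the new SASL listener
listeners=PLAINTEXT://localhost:6667,SASL_PLAINTEXT://localhost:6669
security.inter.broker.protocol=SASL_PLAINTEXT
```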
Test SASL_PLAINTEXT
Test from the command line
- Turn on the Ranger Kafka plugin.
- Check the current user (make sure you are not in a sudo shell):
klist
- In Ranger, check that the current user has access to the topic.
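For the command-line test, the console clients need a client configuration pointing at the SASL listener. A minimal sketch (the file name is hypothetical):

```properties
# client-sasl.properties -- pass via --producer.config / --consumer.config
security.protocol=SASL_PLAINTEXT
sasl.kerberos.service.name=kafka
```

Run kinit first so the client has a valid Kerberos ticket before starting the console producer or consumer.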
Kafka and Kafka Client communication protected by SSL
Enable SSL for Kafka and Kafka Client Communication
Scripts for self-signed certificates, keystores, and truststores
keytool -keystore kafka.server.keystore.jks -alias localhost -validity 365 -genkey -dname "CN=broker, OU=kafka" -keypass SuperTrust11 -storepass SuperTrust11
Add the following configuration to the Kafka config:
ssl.keystore.location = /etc/security/certificates/kafka/kafka.server.keystore.jks
Test one-way SSL (default)
Java client connecting to Kafka via SSL
//configure the following three settings for SSL Encryption
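The three client-side settings the comment refers to are, as a sketch (paths and password are examples matching the keystore section above):

```properties
security.protocol=SSL
ssl.truststore.location=/etc/security/certificates/kafka/kafka.client.truststore.jks
ssl.truststore.password=SuperTrust11
```

In Java, these are set on the producer/consumer Properties object before constructing the client.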
Command-line client connecting to the Kafka server via SSL
https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.6.1/bk_security/content/ch_wire-kafka.html
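A sketch of a console-producer test against the SSL listener (host, port, and file name are assumptions; the client properties file carries the three SSL settings shown above):

```
kafka-console-producer.sh --broker-list broker01:6668 --topic test \
  --producer.config client-ssl.properties
```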
Nifi UI Authentication integrated with LDAP
Configure NiFi to use LDAP for Web UI authentication
Configure from Ambari Console
Ambari -> Nifi -> Configs -> Advanced nifi-login-identity-providers-env
Using LDAP
- If using LDAP, the server's existing keystore and truststore can be used with no further configuration.
<loginIdentityProviders>
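A sketch of the corresponding login-identity-providers entry for plain LDAP (the identifier, DNs, URL, and password are examples; property names follow NiFi's LdapProvider):

```xml
<loginIdentityProviders>
    <provider>
        <identifier>ldap-provider</identifier>
        <class>org.apache.nifi.ldap.LdapProvider</class>
        <property name="Authentication Strategy">SIMPLE</property>
        <property name="Manager DN">CN=nifiadmin,OU=IT Accounts,DC=hortonworks,DC=net</property>
        <property name="Manager Password">password</property>
        <property name="Url">ldap://ad.hortonworks.net:389</property>
        <property name="User Search Base">OU=IT Accounts,DC=hortonworks,DC=net</property>
        <property name="User Search Filter">sAMAccountName={0}</property>
        <property name="Authentication Expiration">12 hours</property>
    </provider>
</loginIdentityProviders>
```

Also set nifi.security.user.login.identity.provider=ldap-provider in nifi.properties so NiFi uses this provider.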
Using LDAPS
- If using LDAPS, we should import the AD's certificates into the server's truststore:
[root@nifi-server01 ~]# /usr/lib/jvm/java-1.8.0-oracle/bin/keytool -import -file /usr/hdf/current/ranger-usersync/conf/symantec-intermediate-ca.cer -alias symantec-intermediate-ca -keystore /etc/security/certificates/nifi/nifi.server.truststore.jks
<loginIdentityProviders>
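For LDAPS, the same provider is used with the LDAPS strategy plus NiFi's TLS truststore properties (values are examples; the truststore is the one the AD certificate was imported into above):

```xml
<loginIdentityProviders>
    <provider>
        <identifier>ldap-provider</identifier>
        <class>org.apache.nifi.ldap.LdapProvider</class>
        <property name="Authentication Strategy">LDAPS</property>
        <property name="Manager DN">CN=nifiadmin,OU=IT Accounts,DC=hortonworks,DC=net</property>
        <property name="Manager Password">password</property>
        <property name="TLS - Truststore">/etc/security/certificates/nifi/nifi.server.truststore.jks</property>
        <property name="TLS - Truststore Password">password</property>
        <property name="TLS - Truststore Type">JKS</property>
        <property name="Url">ldaps://ldaps.hortonworks.net:636</property>
        <property name="User Search Base">OU=IT Accounts,DC=hortonworks,DC=net</property>
        <property name="User Search Filter">sAMAccountName={0}</property>
        <property name="Authentication Expiration">12 hours</property>
    </provider>
</loginIdentityProviders>
```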
Ranger: sync user and group information from LDAP for use in authentication
Ranger configuration
Configure Ranger to sync user/group from LDAP
Connection parameters
- LDAP URL: ldaps://ldaps.hortonworks.net:636
- Binding user (sample distinguished name): CN=ADMINUSERTOPULLUSER,OU=IT Accounts,DC=hortonworks,DC=net

Parameters for user sync configuration
- User Attribute: sAMAccountName
- User Object Class: person
- User Search Base: OU=IT Accounts,DC=hortonworks,DC=net
- User Search Filter: cn=*
- User Search Scope: sub
- User Group Name Attribute: memberof

Parameters for group sync configuration
- Group Member Attribute: member
- Group Name Attribute: cn
- Group Object Class: group
- Group Search Base: DC=hortonworks,DC=net
- Group Search Filter: cn=*
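In ranger-ugsync-site these parameters map roughly to the following properties (a sketch; values mirror the list above):

```properties
ranger.usersync.ldap.url=ldaps://ldaps.hortonworks.net:636
ranger.usersync.ldap.binddn=CN=ADMINUSERTOPULLUSER,OU=IT Accounts,DC=hortonworks,DC=net
ranger.usersync.ldap.user.searchbase=OU=IT Accounts,DC=hortonworks,DC=net
ranger.usersync.ldap.user.objectclass=person
ranger.usersync.ldap.user.searchfilter=cn=*
ranger.usersync.ldap.user.nameattribute=sAMAccountName
ranger.usersync.ldap.user.groupnameattribute=memberof
ranger.usersync.group.searchbase=DC=hortonworks,DC=net
ranger.usersync.group.searchfilter=cn=*
ranger.usersync.group.objectclass=group
ranger.usersync.group.nameattribute=cn
ranger.usersync.group.memberattributename=member
```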
Ranger Truststore configuration
As we are using LDAPS, we need to import the AD’s certificate into Ranger’s Truststore.
Check the following configuration in the Ambari admin console:
ranger.usersync.truststore.file
Make sure the truststore file exists and the password is correct.
If you have an existing truststore file, you can import the certificate manually if needed.
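A sketch of a manual import (the alias and certificate path are hypothetical; the keystore path should be whatever ranger.usersync.truststore.file points to):

```
keytool -import -alias ad-ca -file /tmp/ad-ca.cer \
  -keystore /etc/ranger/usersync/conf/ranger-usersync-truststore.jks \
  -storepass changeit -noprompt
```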
Enable SSL for the Ranger Web UI
Turn on Ranger Admin UI HTTPs using self-signed certificate
[root@ranger-server01 conf]# keytool -genkey -keyalg RSA -alias rangeradmin -keystore ranger-admin-keystore.jks -storepass xasecure -validity 360 -keysize 2048
Reset the environment for Ranger
If the Ranger configuration is wrong, we can:
- back up the Ranger database
- stop Ranger
- drop the database, recreate it, and grant the corresponding privileges
- start Ranger
The tables will be recreated and configured automatically on startup.
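For a MySQL-backed Ranger, the drop/recreate/grant step looks roughly like this (database name, user, and password are examples; match them to your ranger-admin installation):

```sql
-- run after stopping Ranger and backing up the database
DROP DATABASE ranger;
CREATE DATABASE ranger;
GRANT ALL PRIVILEGES ON ranger.* TO 'rangeradmin'@'%' IDENTIFIED BY 'password';
FLUSH PRIVILEGES;
```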
Nifi Data migration
Requirement: migrate an existing NiFi flow into a clustered environment without data loss.
https://community.hortonworks.com/questions/63745/migrating-nifi-flow-files-between-servers.html
Nifi Dump
# get 10 thread dumps
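A sketch of the dump loop (the interval and output paths are examples; nifi.sh dump writes a thread dump to the given file):

```
# collect 10 thread dumps, 5 seconds apart
for i in $(seq 1 10); do
  /usr/hdf/current/nifi/bin/nifi.sh dump /tmp/nifi-thread-dump-$i.txt
  sleep 5
done
```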
Kafka Security
Reference Link
https://www.confluent.io/blog/apache-kafka-security-authorization-authentication-encryption/
For broker/ZooKeeper communication, we will only require Kerberos authentication, as TLS is only supported in ZooKeeper 3.5, which is still at the alpha release stage.
Note: the current Hortonworks ZooKeeper version in HDF 3.0 is 3.4.6.
#!/bin/bash
Ambari SNMP Alert Setting
Test Environment
Description
Prepare the SNMP Test Server
- Install the SNMP packages:
yum install net-snmp net-snmp-utils net-snmp-libs -y
- Change authorization (uncomment/add the authCommunity line in snmptrapd.conf):
authCommunity log,execute,net public
- add Ambari MIB definition
The current version of Ambari (2.4.2) does not contain the MIB definition file. Manually copy the content from the Ambari JIRA:
https://issues.apache.org/jira/secure/attachment/12761892/APACHE-AMBARI-MIB.txt
vi /usr/share/snmp/mibs/APACHE-AMBARI-MIB.txt
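To verify that snmptrapd is receiving traps, a generic test trap can be sent from the same host (the coldStart OID is used purely as a test value; the community must match the authCommunity line above):

```
snmptrap -v 2c -c public localhost '' 1.3.6.1.6.3.1.1.5.1
```

The trap should then appear in snmptrapd's log output.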