
Ambari UI Protected by SSL

Posted on 2017-07-10 |

https://docs.hortonworks.com/HDPDocuments/Ambari-2.5.1.0/bk_ambari-security/content/optional_set_up_ssl_for_ambari.html


Kafka Env Debugging Tools

Posted on 2017-07-10 |

Nifi Security

https://docs.hortonworks.com/HDPDocuments/HDF3/HDF-3.0.0/bk_security/content/enabling-ssl-without-ca.html


Kafka and Kafka Client communication protected by SASL

Posted on 2017-07-10 |

Kafka SASL configurations

Prerequisites

Kafka SASL requires the Ambari Cluster to be Kerberized.

Enable SASL_PLAINTEXT

  • add the listener below to the Kafka listeners list,

    SASL_PLAINTEXT://localhost:6669

  • security.inter.broker.protocol=SASL_PLAINTEXT

    The default value after the Kerberos wizard is PLAINTEXTSASL; it should be changed to SASL_PLAINTEXT.

    Open question: should this be set to PLAINTEXT instead for performance?
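Taken together, the settings above would look roughly like this in the broker's server.properties (hostnames, ports, and the existing PLAINTEXT listener are placeholders; the Kerberos service name is an assumption for a typical Kerberized cluster):

```
# server.properties (sketch; hostnames and ports are placeholders)
listeners=PLAINTEXT://localhost:6667,SASL_PLAINTEXT://localhost:6669
security.inter.broker.protocol=SASL_PLAINTEXT
# Kerberized clusters also need the service name of the broker principal
sasl.kerberos.service.name=kafka
```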

Test the SASL_PLAINTEXT

Test from the command line

  • Turn on Ranger-Kafka Plugin
  • Check the current user (make sure we are not running under sudo)
klist
Ticket cache: FILE:/tmp/krb5cc_239506898_CkQLn0
Default principal: myusername@COMPANY.DOMAIN.COM

Valid starting Expires Service principal
17/08/17 09:31:23 17/08/17 19:31:23 krbtgt/COMPANY.DOMAIN.COM@COMPANY.DOMAIN.COM
renew until 24/08/17 09:31:23
  • In Ranger, check that the current user has access to the topic
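With a valid ticket and a matching Ranger policy, a quick smoke test might look like the sketch below (the broker host, port, topic name, and HDP install path are all assumptions, not values from this post):

```shell
# Hypothetical smoke test for the SASL_PLAINTEXT listener, run after kinit.
cat > /tmp/client-sasl.properties <<'EOF'
security.protocol=SASL_PLAINTEXT
sasl.kerberos.service.name=kafka
EOF

# Send a test message; failure here usually means a ticket or policy problem.
/usr/hdp/current/kafka-broker/bin/kafka-console-producer.sh \
  --broker-list broker01:6669 \
  --topic test-topic \
  --producer.config /tmp/client-sasl.properties
```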

Kafka and Kafka Client communication protected by SSL

Posted on 2017-07-10 |

Enable SSL for Kafka and Kafka Client Communication

Script to create self-signed certificates, keystores, and truststores

keytool -keystore kafka.server.keystore.jks -alias localhost -validity 365 -genkey -dname "CN=broker, OU=kafka" -keypass SuperTrust11 -storepass SuperTrust11
openssl req -new -x509 -keyout ca-key -out ca-cert -days 365 -passout pass:"SuperTrust11" -subj "/C=AU/ST=WA/L=Perth/O=kafka/CN=broker"

keytool -keystore kafka.server.truststore.jks -alias CARoot -import -file ca-cert -storepass SuperTrust11
keytool -keystore kafka.client.truststore.jks -alias CARoot -import -file ca-cert -storepass SuperTrust11

keytool -keystore kafka.server.keystore.jks -alias localhost -certreq -file cert-file -storepass SuperTrust11
openssl x509 -req -CA ca-cert -CAkey ca-key -in cert-file -out cert-signed -days 365 -CAcreateserial -passin pass:SuperTrust11

keytool -keystore kafka.server.keystore.jks -alias CARoot -import -file ca-cert -storepass SuperTrust11
keytool -keystore kafka.server.keystore.jks -alias localhost -import -file cert-signed -storepass SuperTrust11

keytool -keystore kafka.client.keystore.jks -alias localhost -validity 365 -genkey -dname "CN=client, OU=kafka" -keypass SuperTrust11 -storepass SuperTrust11


keytool -keystore kafka.client.keystore.jks -alias localhost -certreq -file client-cert-file -storepass SuperTrust11
openssl x509 -req -CA ca-cert -CAkey ca-key -in client-cert-file -out client-cert-signed -days 365 -CAcreateserial -passin pass:SuperTrust11
keytool -keystore kafka.client.keystore.jks -alias CARoot -import -file ca-cert -storepass SuperTrust11
keytool -keystore kafka.client.keystore.jks -alias localhost -import -file client-cert-signed -storepass SuperTrust11

mkdir -p /etc/security/certificates/kafka
cp /home/company.net/rachel/cert-kafka/kafka.server.*.jks /etc/security/certificates/kafka
chown -R kafka:kafka /etc/security/certificates/kafka
ls -l /etc/security/certificates/kafka

mkdir -p /etc/security/certificates/kafkaClient
cp /home/company.net/rachel/cert-kafka/kafka.client.*.jks /etc/security/certificates/kafkaClient
chown -R kafka:kafka /etc/security/certificates/kafkaClient
ls -l /etc/security/certificates/kafkaClient

Add the configuration below to the Kafka config

ssl.keystore.location = /etc/security/certificates/kafka/kafka.server.keystore.jks
ssl.keystore.password = SuperTrust11
ssl.key.password = SuperTrust11
ssl.truststore.location = /etc/security/certificates/kafka/kafka.server.truststore.jks
ssl.truststore.password = SuperTrust11
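After the broker restarts with these settings, the SSL endpoint can be sanity-checked with openssl (the broker host is a placeholder, and port 9093 is only an assumption for the SSL listener):

```shell
# Print the subject and validity dates of the certificate the broker presents.
openssl s_client -connect broker01:9093 -showcerts </dev/null 2>/dev/null \
  | openssl x509 -noout -subject -dates
```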

Test one-way SSL (default)

Java Client Connecting to Kafka via SSL

//configure the following three settings for SSL Encryption
props.put(CommonClientConfigs.SECURITY_PROTOCOL_CONFIG, "SSL");
props.put(SslConfigs.SSL_TRUSTSTORE_LOCATION_CONFIG, "/opt/certificates/kafka/kafka.client.truststore.jks");
props.put(SslConfigs.SSL_TRUSTSTORE_PASSWORD_CONFIG, "SuperTrust11");

// configure the following three settings for SSL Authentication
props.put(SslConfigs.SSL_KEYSTORE_LOCATION_CONFIG, "/opt/certificates/kafka/kafka.client.keystore.jks");
props.put(SslConfigs.SSL_KEYSTORE_PASSWORD_CONFIG, "SuperTrust11");
props.put(SslConfigs.SSL_KEY_PASSWORD_CONFIG, "SuperTrust11");

Command-Line Client Connecting to Kafka via SSL

https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.6.1/bk_security/content/ch_wire-kafka.html
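A command-line client can mirror the Java settings above; a sketch using the client keystore and truststore copied to /etc/security/certificates/kafkaClient earlier (the bin path, broker host, port 9093, and topic name are assumptions):

```shell
# Hypothetical SSL client config reusing the client JKS files created above.
cat > /tmp/client-ssl.properties <<'EOF'
security.protocol=SSL
ssl.truststore.location=/etc/security/certificates/kafkaClient/kafka.client.truststore.jks
ssl.truststore.password=SuperTrust11
ssl.keystore.location=/etc/security/certificates/kafkaClient/kafka.client.keystore.jks
ssl.keystore.password=SuperTrust11
ssl.key.password=SuperTrust11
EOF

/usr/hdp/current/kafka-broker/bin/kafka-console-producer.sh \
  --broker-list broker01:9093 \
  --topic test-topic \
  --producer.config /tmp/client-ssl.properties
```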


Nifi UI Authentication integrated with LDAP

Posted on 2017-07-10 |

Configure Nifi to Use LDAP for Web UI Authentication

Configure from Ambari Console

Ambari -> Nifi -> Configs -> Advanced nifi-login-identity-providers-env

using LDAP

  • If using LDAP, the truststore and keystore can simply reuse the server’s keystore and truststore with no further configuration
<loginIdentityProviders>
  <provider>
    <identifier>ldap-provider</identifier>
    <class>org.apache.nifi.ldap.LdapProvider</class>
    <property name="Authentication Strategy">SIMPLE</property>
    <property name="Manager DN">CN=TheUserUsedToConnectToAD,OU=IT Accounts,OU=Iron Ore,DC=hortonworks,DC=net</property>
    <property name="Manager Password">PasswordForTheUserUsedToConnectToAD</property>
    <property name="TLS - Keystore">/etc/security/certificates/nifi/nifi.server.keystore.jks</property>
    <property name="TLS - Keystore Password">KeyStorePass</property>
    <property name="TLS - Keystore Type">JKS</property>
    <property name="TLS - Truststore">/etc/security/certificates/nifi/nifi.server.truststore.jks</property>
    <property name="TLS - Truststore Password">TrustStorePass</property>
    <property name="TLS - Truststore Type">JKS</property>
    <property name="TLS - Client Auth"></property>
    <property name="TLS - Protocol">TLS</property>
    <property name="TLS - Shutdown Gracefully"></property>
    <property name="Referral Strategy">FOLLOW</property>
    <property name="Connect Timeout">10 secs</property>
    <property name="Read Timeout">10 secs</property>
    <property name="Url">ldap://ldap.ent.hortonworks.net:389</property>
    <property name="User Search Base">OU=Iron Ore,DC=hortonworks,DC=net</property>
    <property name="User Search Filter">sAMAccountName={0}</property>
    <property name="Identity Strategy">USE_USERNAME</property>
    <property name="Authentication Expiration">12 hours</property>
  </provider>
</loginIdentityProviders>

using LDAPS

  • If using LDAPS, we should import the AD’s certificates into the server’s truststore
[root@nifi-server01 ~]# /usr/lib/jvm/java-1.8.0-oracle/bin/keytool -import -file /usr/hdf/current/ranger-usersync/conf/symantec-intermediate-ca.cer -alias symantec-intermediate-ca -keystore /etc/security/certificates/nifi/nifi.server.truststore.jks
[root@nifi-server01 ~]# /usr/lib/jvm/java-1.8.0-oracle/bin/keytool -import -file /usr/hdf/current/ranger-usersync/conf/symantec-root-ca.cer -alias symantec-root-ca -keystore /etc/security/certificates/nifi/nifi.server.truststore.jks
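A quick way to confirm both CA certificates landed in the truststore (aliases match the imports above; the storepass shown is the placeholder used elsewhere in this post):

```shell
# List truststore entries and filter for the Symantec CA aliases.
keytool -list \
  -keystore /etc/security/certificates/nifi/nifi.server.truststore.jks \
  -storepass TrustStorePass | grep -i symantec
```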
<loginIdentityProviders>
  <provider>
    <identifier>ldap-provider</identifier>
    <class>org.apache.nifi.ldap.LdapProvider</class>
    <property name="Authentication Strategy">LDAPS</property>
    <property name="Manager DN">CN=TheUserUsedToConnectToAD,OU=IT Accounts,OU=Iron Ore,DC=hortonworks,DC=net</property>
    <property name="Manager Password">PasswordForTheUserUsedToConnectToAD</property>
    <property name="TLS - Keystore">/etc/security/certificates/nifi/nifi.server.keystore.jks</property>
    <property name="TLS - Keystore Password">KeyStorePass</property>
    <property name="TLS - Keystore Type">JKS</property>
    <property name="TLS - Truststore">/etc/security/certificates/nifi/nifi.server.truststore.jks</property>
    <property name="TLS - Truststore Password">TrustStorePass</property>
    <property name="TLS - Truststore Type">JKS</property>
    <property name="TLS - Client Auth"></property>
    <property name="TLS - Protocol">TLS</property>
    <property name="TLS - Shutdown Gracefully"></property>
    <property name="Referral Strategy">FOLLOW</property>
    <property name="Connect Timeout">10 secs</property>
    <property name="Read Timeout">10 secs</property>
    <property name="Url">ldaps://ldaps.ent.hortonworks.net:636</property>
    <property name="User Search Base">OU=Iron Ore,DC=hortonworks,DC=net</property>
    <property name="User Search Filter">sAMAccountName={0}</property>
    <property name="Identity Strategy">USE_USERNAME</property>
    <property name="Authentication Expiration">12 hours</property>
  </provider>
</loginIdentityProviders>

Ranger: Sync User and Group Information from LDAP to Be Used in Authentication

Posted on 2017-07-10 |

Ranger configuration

Configure Ranger to sync user/group from LDAP

Connection Parameters

  • LDAP Url

    • ldaps://ldaps.hortonworks.net:636
  • Binding User (sample of distinguished name)

    • CN=ADMINUSERTOPULLUSER,OU=IT Accounts,DC=hortonworks,DC=net
  • Parameters for user sync configuration

    • User Attribute: sAMAccountName
    • User Object Class: person
    • User Search Base: OU=IT Accounts,DC=hortonworks,DC=net
    • User Search Filter: cn=*
    • User Search Scope: sub
    • User Group Name Attribute: memberof
  • Parameters for group sync configuration

    • Group Member Attribute: member
    • Group Name Attribute: cn
    • Group Object Class: group
    • Group Search Base: DC=hortonworks,DC=net
    • Group Search Filter: cn=*
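Before wiring these values into Ranger, the search base and filter can be validated directly with ldapsearch (bind DN, URL, and base are the sample values from this section; the tool prompts for the bind password):

```shell
# Dry-run the user sync query against AD; a non-empty result confirms
# the base, filter, and object class before Ranger usersync is restarted.
ldapsearch -H ldaps://ldaps.hortonworks.net:636 \
  -D "CN=ADMINUSERTOPULLUSER,OU=IT Accounts,DC=hortonworks,DC=net" -W \
  -b "OU=IT Accounts,DC=hortonworks,DC=net" \
  "(&(objectClass=person)(sAMAccountName=*))" sAMAccountName memberOf
```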

Ranger Truststore configuration

As we are using LDAPS, we need to import the AD’s certificate into Ranger’s Truststore.

Check below configuration from Ambari admin console,

ranger.usersync.truststore.file
ranger.usersync.truststore.password

Make sure the truststore file exists and the password is correct.

If you have an existing truststore file, you can import the certificate manually if needed.
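A manual import might look like the sketch below (the certificate file, alias, and truststore path are placeholders; substitute the value of ranger.usersync.truststore.file from Ambari):

```shell
# Import the AD CA certificate into the usersync truststore, then verify.
# /tmp/ad-ca.cer and the alias "ad-ca" are hypothetical names.
keytool -import -alias ad-ca -file /tmp/ad-ca.cer \
  -keystore /path/to/ranger-usersync-truststore.jks
keytool -list -keystore /path/to/ranger-usersync-truststore.jks
```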


Enable SSL for Ranger Web UI

Posted on 2017-07-10 |

Turn on Ranger Admin UI HTTPs using self-signed certificate

https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.6.1/bk_security/content/configure_ambari_ranger_ssl_self_signed_cert_admin.html

[root@ranger-server01 conf]# keytool -genkey -keyalg RSA -alias rangeradmin -keystore ranger-admin-keystore.jks -storepass xasecure -validity 360 -keysize 2048
What is your first and last name?
[Unknown]: ranger-server01.hortonworks.net
What is the name of your organizational unit?
[Unknown]: IT
What is the name of your organization?
[Unknown]: HORTONWORKS
What is the name of your City or Locality?
[Unknown]: Perth
What is the name of your State or Province?
[Unknown]: WA
What is the two-letter country code for this unit?
[Unknown]: 61
Is CN=ranger-server01.hortonworks.net, OU=IT, O=HORTONWORKS, L=Perth, ST=WA, C=61 correct?
[no]: yes

Enter key password for <rangeradmin>
(RETURN if same as keystore password):
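Before pointing Ranger at the keystore, the generated key pair can be inspected (alias and storepass are the values used in the session above):

```shell
# Show the alias, owner DN, and validity window of the new key pair.
keytool -list -v -keystore ranger-admin-keystore.jks -storepass xasecure \
  | grep -E "Alias name|Owner|Valid"
```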

Reset the environment for Ranger

If the Ranger configuration is wrong, we can:

  1. back up the Ranger database
  2. stop Ranger
  3. drop the database, recreate it, and grant the corresponding privileges
  4. start Ranger
    The tables will be recreated and configured automatically.
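As a sketch of steps 1 and 3, assuming a MySQL backend with a database and user both named for Ranger (all names and credentials below are placeholders, not values from this post):

```shell
# 1. Back up the Ranger database.
mysqldump -u root -p ranger > /tmp/ranger-backup.sql

# 2. Stop Ranger Admin from Ambari, then:

# 3. Drop, recreate, and re-grant privileges on the database.
mysql -u root -p <<'EOF'
DROP DATABASE ranger;
CREATE DATABASE ranger;
GRANT ALL PRIVILEGES ON ranger.* TO 'rangeradmin'@'%';
FLUSH PRIVILEGES;
EOF

# 4. Start Ranger Admin; it recreates and populates the tables on startup.
```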

Nifi Data migration

Posted on 2017-07-10 |

Nifi Data migration

Requirement: migrate an existing Nifi flow into a clustered environment without data loss.

https://community.hortonworks.com/questions/63745/migrating-nifi-flow-files-between-servers.html

Nifi Dump

nifiDump.sh

# get 10 thread dumps
for i in {1..10}
do
  echo "start dump"
  /usr/hdf/current/nifi/bin/nifi.sh dump /tmp/nifi_Dump_$(date +"%Y_%m_%d_%I_%M_%p")
  echo "finished, sleep 60s"
  sleep 60s
done

Kafka Security

Posted on 2017-07-10 |

Kafka Security

Reference Link

https://www.confluent.io/blog/apache-kafka-security-authorization-authentication-encryption/

For broker/ZooKeeper communication, we will only require Kerberos authentication as TLS is only supported in ZooKeeper 3.5, which is still at the alpha release stage.

Note: the current Hortonworks ZooKeeper version in HDF 3.0 is 3.4.6.

#!/bin/bash
PASSWORD=test1234
VALIDITY=365
# generate keystore for localhost ; valid for 365 days
keytool -keystore kafka.server.keystore.jks -alias localhost -validity $VALIDITY -genkey
# generate client keystore ; valid for 365 days
keytool -keystore kafka.client.keystore.jks -alias localhost -validity $VALIDITY -genkey

####### generate key store for server and client; valid for 365 days ######

# generate new X509 certificate with cert and keys ; valid for 365 days
openssl req -new -x509 -keyout ca-key -out ca-cert -days $VALIDITY


####### generate x509 CA certificate ; valid for 365 days #####

# generate truststore for server, trust ca-cert as CARoot
keytool -keystore kafka.server.truststore.jks -alias CARoot -import -file ca-cert
# generate truststore for client, trust ca-cert as CARoot
keytool -keystore kafka.client.truststore.jks -alias CARoot -import -file ca-cert

###### both server and client trust ca-cert by importing the ca-cert into their truststores #####


# generate a cert-file for server
keytool -keystore kafka.server.keystore.jks -alias localhost -certreq -file cert-file
# using ca-cert and password to sign server's cert-file ; valid for 365 days
openssl x509 -req -CA ca-cert -CAkey ca-key -in cert-file -out cert-signed -days $VALIDITY -CAcreateserial -passin pass:$PASSWORD
# import ca-cert as CARoot into server's keystore
keytool -keystore kafka.server.keystore.jks -alias CARoot -import -file ca-cert
# import ca signed cert-signed into server's keystore
keytool -keystore kafka.server.keystore.jks -alias localhost -import -file cert-signed

###### using ca credential to sign certificate for server; import both ca-cert and ca-signed certificate into server keystore ######

# generate a cert-file for client
keytool -keystore kafka.client.keystore.jks -alias localhost -certreq -file cert-file
# using ca-cert and password to sign client's cert-file; valid for 365 days
openssl x509 -req -CA ca-cert -CAkey ca-key -in cert-file -out cert-signed -days $VALIDITY -CAcreateserial -passin pass:$PASSWORD
# import ca-cert as CARoot into client's keystore
keytool -keystore kafka.client.keystore.jks -alias CARoot -import -file ca-cert
# import ca signed cert-signed into client's keystore
keytool -keystore kafka.client.keystore.jks -alias localhost -import -file cert-signed


###### using ca credential to sign certificate for client; import both ca-cert and ca-signed certificate into client keystore ######

Ambari SNMP Alert Setting

Posted on 2017-07-05 |

Ambari SNMP Alert Setting

Test Environment

Description

Prepare the SNMP Test Server

  • install snmp
yum install net-snmp net-snmp-utils net-snmp-libs -y
  • change authorization
# authCommunity   log,execute,net public
# traphandle SNMPv2-MIB::coldStart /usr/bin/bin/my_great_script cold
disableAuthorization yes
  • add Ambari MIB definition
    The current version of Ambari (2.4.2) does not include the MIB definition file. Manually copy its content from the Ambari JIRA attachment:
    https://issues.apache.org/jira/secure/attachment/12761892/APACHE-AMBARI-MIB.txt
vi /usr/share/snmp/mibs/APACHE-AMBARI-MIB.txt
chmod 777 /usr/share/snmp/mibs/APACHE-AMBARI-MIB.txt
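The setup can be verified end to end by running the trap daemon in the foreground and sending a test trap to it (the community string "public" matches the commented config above; SNMPv2-MIB::coldStart is just a convenient standard notification for testing):

```shell
# In one terminal: run the trap daemon in the foreground, logging to stdout.
snmptrapd -f -Lo

# In another terminal: send a test v2c trap; it should appear in the
# snmptrapd output if the daemon and community string are configured.
snmptrap -v 2c -c public localhost '' SNMPv2-MIB::coldStart
```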
© 2021 Rachel Rui Liu