1. Kerberos is a computer network authentication protocol that lets parties prove their identities to each other securely over an insecure network. See the official documentation for details.
2. Packages to install (on CentOS; note that libsasl2-dev is the Debian/Ubuntu package name — the CentOS equivalent, cyrus-sasl-devel, is included below)
yum install gcc-c++ python-devel.x86_64 cyrus-sasl-devel.x86_64
yum install python-devel
yum install krb5-devel
yum install python-krbV
pip install krbcontext==0.9
pip install thrift==0.9.3
pip install thrift-sasl==0.2.1
pip install impyla==0.14.1
pip install hdfs[kerberos]
pip install pykerberos==1.2.1
3. Configure /etc/krb5.conf: set the realm that your servers belong to in this file.
4. Configure /etc/hosts: add entries for the cluster machines and the host of the realm (KDC).
5. Use kinit to generate a ccache_file, or use a keytab_file exported from the KDC.
6. Code to connect to Hive:
import os
from impala.dbapi import connect
from krbcontext import krbcontext

keytab_path = os.path.split(os.path.realpath(__file__))[0] + '/xxx.keytab'
principal = 'xxx'
with krbcontext(using_keytab=True, principal=principal, keytab_file=keytab_path):
    # ip: the HiveServer2 host; 10000 is the default HiveServer2 port
    conn = connect(host=ip, port=10000, auth_mechanism='GSSAPI', kerberos_service_name='hive')
    cursor = conn.cursor()
    cursor.execute('SELECT * FROM default.books')
    for row in cursor:
        print(row)
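impyla's cursor yields rows as plain tuples. When dict-shaped rows are handier, the column names are available through the DB-API `description` attribute. A small helper (a sketch, not part of the original code):

```python
def rows_as_dicts(cursor):
    # DB-API: cursor.description is a sequence of 7-item tuples,
    # where item 0 is the column name
    cols = [d[0] for d in cursor.description]
    return [dict(zip(cols, row)) for row in cursor]
```

This works with any DB-API-compliant cursor, so it is not tied to impyla specifically.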
7. Code to connect to HDFS:
from hdfs.ext.kerberos import KerberosClient
from krbcontext import krbcontext

hdfs_url = 'http://' + host + ':' + port  # host/port of the WebHDFS endpoint
data = self._get_keytab(sso_ticket)
self._save_keytab(data)
with krbcontext(using_keytab=True, keytab_file=self.keytab_file, principal=self.user):
    self.client = KerberosClient(hdfs_url)
# List the files and directories under path; _list_status is a private helper,
# the public equivalent is self.client.list(path, status=True)
self.client._list_status(path).json()['FileStatuses']['FileStatus']
8. Note: the krbcontext package officially claims Python 2 support, but it works on Python 3 as well.
The hdfs_url must include the "http://" prefix, otherwise an error is raised.
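Since a bare hostname triggers that error, the scheme prefix can be guarded programmatically. A minimal sketch (the function name is my own, not from the original code):

```python
def with_http_scheme(host, port):
    # KerberosClient needs an explicit scheme in the URL, so prepend
    # "http://" when the caller passed a bare hostname
    base = host if host.startswith(('http://', 'https://')) else 'http://' + host
    return '{0}:{1}'.format(base, port)
```

For example, `with_http_scheme('name-1', 50070)` returns `'http://name-1:50070'`.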
9. I added some extra configuration; the details are as follows.
Python 3.6.5 Kerberos-authenticated HDFS and Hive connections (including base environment setup)
1. Environment to prepare
yum packages (install the yum packages before the Python packages, otherwise you will run into problems)
yum install openldap-clients -y
yum install krb5-workstation krb5-libs -y
yum install gcc-c++ python-devel.x86_64 cyrus-sasl-devel.x86_64
yum install python-devel
yum install krb5-devel
yum install python-krbV
yum install cyrus-sasl-plain cyrus-sasl-devel cyrus-sasl-gssapi
Python package installation (use pip or pip3 as appropriate for your setup)
pip install krbcontext==0.9
pip install thrift==0.9.3
pip install thrift-sasl==0.2.1
pip install impyla==0.14.1
pip install hdfs[kerberos]
pip install pykerberos==1.2.1
Configure /etc/hosts (map the machines of the big-data platform to their domain names)
10.xxx.xxx.xxx name-1 panel.test.com
10.xxx.xxx.xxx name-1
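The hosts entries above can be sanity-checked from Python by parsing hosts-format text into a name-to-IP mapping. This is just a verification sketch, not a required setup step:

```python
def parse_hosts(text):
    # Parse /etc/hosts-style text: "IP name1 name2 ..." per line,
    # ignoring blank lines and '#' comments
    mapping = {}
    for line in text.splitlines():
        line = line.split('#', 1)[0].strip()
        if not line:
            continue
        parts = line.split()
        ip, names = parts[0], parts[1:]
        for name in names:
            mapping[name] = ip
    return mapping
```

Running it over the contents of /etc/hosts quickly shows whether every cluster hostname resolves to the IP you expect.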
Configure /etc/krb5.conf (see your Kerberos service's configuration for the exact values)
Reference configuration (for reference only; adapt it to your actual setup)
[libdefaults]
  renew_lifetime = 9d
  forwardable = true
  default_realm = PANEL.COM
  ticket_lifetime = 24h
  dns_lookup_realm = false
  dns_lookup_kdc = false
  default_ccache_name = /tmp/krb5cc_%{uid}

[logging]
  default = FILE:/var/log/krb5kdc.log
  admin_server = FILE:/var/log/kadmind1.log
  kdc = FILE:/var/log/krb5kdc1.log

[realms]
  PANEL.COM = {
    admin_server = panel.test1.com
    kdc = panel.test1.com
  }
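In the `default_ccache_name` setting above, krb5 expands `%{uid}` to the numeric uid of the calling process. When Python code needs to locate the resulting cache file, that expansion can be mirrored like this (a sketch covering only the `%{uid}` token, which is the only one used in the config above):

```python
import os

def expand_ccache_name(template):
    # Mirror krb5's %{uid} expansion so Python code can compute
    # the credential-cache path for the current user
    return template.replace('%{uid}', str(os.getuid()))
```

For example, for uid 1000 the template `/tmp/krb5cc_%{uid}` yields `/tmp/krb5cc_1000`.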
Connection code:
hdfs:
import json, os
from hdfs.ext.kerberos import KerberosClient
from krbcontext import krbcontext

def _connect(self, host, port, sso_ticket=None):
    try:
        hdfs_url = 'http://' + host + ':' + port
        active_str = 'kinit -kt {0} {1}'.format(self.keytab_file, self.user)
        # Refresh the Kerberos credentials for the current user. Because of
        # Python's caching behavior, the credential cache does not switch
        # automatically when the user changes, so re-run kinit by hand.
        os.system(active_str)
        with krbcontext(using_keytab=True, keytab_file=self.keytab_file, principal=self.user):
            self.client = KerberosClient(hdfs_url)
    except Exception as e:
        raise e
hive:
import os
from krbcontext import krbcontext
from impala.dbapi import connect
from auto_model_platform.settings import config

def _connect(self, host, port, sso_ticket=None):
    try:
        active_str = 'kinit -kt {0} {1}'.format(self.keytab_file, self.user)
        # Same manual kinit refresh as in the hdfs case
        os.system(active_str)
        with krbcontext(using_keytab=True, principal=self.user, keytab_file=self.keytab_file):
            self.conn = connect(host=host, port=port, auth_mechanism='GSSAPI', kerberos_service_name='hive')
            self.cursor = self.conn.cursor()
    except Exception as e:
        raise e
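Both `_connect` methods build the same kinit command string. Factoring it into a shared helper that returns an argv list also makes it easy to switch from `os.system` to `subprocess.check_call`, which avoids shell parsing of the keytab path and raises an exception when kinit fails. A sketch (the helper name is my own):

```python
def kinit_argv(keytab_file, principal):
    # Argument list for refreshing a ticket from a keytab, suitable for
    # subprocess.check_call(kinit_argv(...)) instead of os.system()
    return ['kinit', '-kt', keytab_file, principal]
```

With `os.system`, a keytab path containing spaces or shell metacharacters would be misparsed; passing an argv list sidesteps that entirely.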
Summary
I hit quite a few pitfalls while doing this. The key is really understanding the underlying principles, such as how Kerberos works and its corresponding commands.
If you are building a base platform that has to switch between multiple users, I would advise against Python: it is not friendly for this at all and the official packages have many problems, so I ended up switching to Java JDBC for operating on both HDFS and Hive.
If you are just testing on your own or doing algorithm research, Python is fine, since the code is simple and easy to get working.
Appendix
kinit commands
kinit -kt xxxx.keytab   # activate the credential cache for the given user from a keytab
klist                   # list the currently cached credentials (the command is klist, not "kinit list")