I want to store Elasticsearch snapshots on HDFS. On a cluster I set up myself without Kerberos this works fine, but against the company's Kerberos-secured HDFS service it fails with a permission error.
PUT _snapshot/my_hdfs_repository
{
  "type": "hdfs",
  "settings": {
    "uri": "hdfs://10.149.164.225:8020",
    "path": "/data/nta/repositories/my_hdfs_repository",
    "security.principal": "sys_cq@HADOOP.XXX.COM"
  }
}
The key error message:
"reason": "Permission denied: user=sys_cq, access=WRITE, inode="/":hdfs:hadoop:drwxr-xr-x"
"status": 500
Keytab path: /etc/elasticsearch/repository-hdfs/krb5.keytab
/etc/krb5.conf:
includedir /etc/krb5.conf.d/
[logging]
default = FILE:/var/log/krb5libs.log
kdc = FILE:/var/log/krb5kdc.log
admin_server = FILE:/var/log/kadmind.log
[libdefaults]
dns_lookup_realm = false
ticket_lifetime = 24h
renew_lifetime = 7d
forwardable = true
rdns = false
default_realm = HADOOP.XXX.COM
[realms]
HADOOP.XXX.COM = {
kdc = hadoop-kdc01
kdc = hadoop-kdc02
admin_server = hadoop-kdc01
}
[domain_realm]
/etc/hosts:
10.149.164.30 hadoop-kdc01
10.149.164.40 hadoop-kdc02
I have verified with kinit that the keytab file and the principal are correct, and the user has permissions on /data/nta in HDFS:
$ hadoop fs -ls -d /data/nta
drwxr-x--- - sys_cq sys_cq 0 2019-01-30 17:11 /data/nta
Software versions:
elasticsearch Version: 6.5.3, Build: default/rpm/159a78a/2018-12-06T20:11:28.826501Z, JVM: 1.8.0_201
hadoop version: Hadoop 2.6.0-cdh5.11.0
1 reply
printf_uck - 1024
"reason": "Permission denied: user=sys_cq, access=WRITE, inode="/":hdfs:hadoop:drwxr-xr-x" "status": 500
User sys_cq has no write permission on the root directory "/": it is owned by hdfs:hadoop with mode drwxr-xr-x. Two ways to fix this:
1. Grant mode 777 on the root directory.
2. Add the user to the hadoop group; the cost is that the user then effectively becomes a superuser.
Either of the two will resolve the problem.
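The denial in the error message follows the POSIX-style check HDFS applies to each inode: pick the owner, group, or other permission triad, then test the `w` bit. A minimal sketch (the function name and signature below are illustrative, not the HDFS API; note that members of the configured superuser group, `dfs.permissions.superusergroup`, bypass this check entirely, which is why option 2 works):

```python
def can_write(user, groups, owner, group, mode):
    """POSIX-style write check.

    mode is a 9-character string such as 'rwxr-xr-x'
    (owner / group / other permission triads).
    """
    if user == owner:
        bits = mode[0:3]      # owner triad
    elif group in groups:
        bits = mode[3:6]      # group triad
    else:
        bits = mode[6:9]      # "other" triad
    return 'w' in bits

# The failing case from the error: inode "/" is hdfs:hadoop, drwxr-xr-x.
# sys_cq is neither the owner nor in group hadoop, so "other" = 'r-x'.
print(can_write('sys_cq', ['sys_cq'], 'hdfs', 'hadoop', 'rwxr-xr-x'))  # False

# The owner hdfs does have the write bit.
print(can_write('hdfs', ['hdfs'], 'hdfs', 'hadoop', 'rwxr-xr-x'))      # True
```

This also shows why the group triad `r-x` alone would not help: group membership only grants write access through the superuser-group bypass, not through these mode bits.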