
Logstash consuming from Kafka and exporting to webhdfs fails: Kerberos authentication error

Logstash | Author: sanshi123 | Posted 2021-04-07 | Views: 2249

I'm running Logstash 7.6.2 against CDH 6.3.2 (with Kerberos enabled), and the GSSAPI gem is 1.3.1. Today I used Logstash to consume data from Kafka and import it into HDFS. Reading from Kafka works fine, but the write to HDFS fails. The configuration and error message are below; if anyone passing by could take a look, thanks:

output {
    stdout { codec => rubydebug }
    webhdfs {
        host => "10.16.1.17"
        standby_host => "10.16.1.16"
        port => 9870
        standby_port => 9870
        path => "/origin_data/test.log"
        user => "czj@DEV.COM"
        use_kerberos_auth => "true"
        kerberos_keytab => "/etc/czj.keytab"
        retry_interval => 30
        codec => plain {
            format => "%{message}"
        }
    }
}
Error from the log:
[2021-04-07T17:11:52,428][WARN ][logstash.outputs.webhdfs ][main] webhdfs write caused an exception: gss_init_sec_context did not return GSS_S_COMPLETE: Unspecified GSS failure.  Minor code may provide more information
Ticket expired
. Maybe you should increase retry_interval or reduce number of workers. Retrying...
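The "Ticket expired" message indicates the GSSAPI layer tried to use a Kerberos ticket whose lifetime had lapsed. A common first step (a sketch only, reusing the principal `czj@DEV.COM` and keytab path `/etc/czj.keytab` from the config above) is to verify the keytab works outside Logstash and keep the ticket cache fresh with a periodic `kinit`:

```shell
# Sketch, not a confirmed fix: validate the keytab manually first.
# Obtain a fresh TGT using the keytab and principal from the config above.
kinit -kt /etc/czj.keytab czj@DEV.COM

# Inspect the resulting ticket and its expiry time.
klist

# If the ticket lifetime is shorter than the Logstash run, renew it
# periodically (e.g. every 30 minutes via cron) so the cached ticket
# seen by the GSSAPI library never expires mid-run:
# */30 * * * * kinit -kt /etc/czj.keytab czj@DEV.COM
```

If `kinit` itself fails, the keytab or the KDC's ticket-lifetime policy is the problem rather than the Logstash plugin.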
