
Filebeat fails to ship data to a Kafka cluster with SASL/PLAIN authentication enabled; running in the foreground reports errors

Beats | by welkin | published 2021-11-16 | views: 1068

Filebeat version: 6.4.3
Kafka version: 1.0.0
For Kafka authentication, the broker startup script adds the following parameter:
if [ "x$KAFKA_OPTS" = "x" ]; then
    export KAFKA_OPTS="-Djava.security.auth.login.config=/opt/kafka/config/kafka_server_jaas.conf"
fi
 
The JAAS file is as follows:
# cat /opt/kafka/config/kafka_server_jaas.conf

KafkaServer {
    org.apache.kafka.common.security.plain.PlainLoginModule required
    username="kafka"
    password="kafkapswd"
    user_kafka="kafkapswd"
    user_mooc="moocpswd";
};
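
For completeness: with SASL/PLAIN the broker also needs a SASL listener enabled in server.properties, otherwise plaintext clients get disconnected during the metadata handshake. A minimal sketch, assuming the listener address/port used in this thread (adjust to your deployment; these lines are not from the original post):

```properties
# Assumed broker settings enabling SASL/PLAIN on the listener
listeners=SASL_PLAINTEXT://192.168.56.10:8123
advertised.listeners=SASL_PLAINTEXT://192.168.56.10:8123
security.inter.broker.protocol=SASL_PLAINTEXT
sasl.mechanism.inter.broker.protocol=PLAIN
sasl.enabled.mechanisms=PLAIN
```

The `user_<name>="<password>"` entries in the JAAS file above define the accounts the broker will accept, so `mooc`/`moocpswd` should match `user_mooc`.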
------------------------------------------

For Kafka authentication, the producer/consumer startup scripts add the following parameter:
if [ "x$KAFKA_OPTS" = "x" ]; then
    export KAFKA_OPTS="-Djava.security.auth.login.config=/opt/kafka/config/kafka_client_jaas.conf"
fi
The producer/consumer JAAS file is as follows:
# cat /opt/kafka/config/kafka_client_jaas.conf 
KafkaClient {
        org.apache.kafka.common.security.plain.PlainLoginModule required
        username="mooc"
        password="moocpswd";
};
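
To verify the broker side independently of Filebeat, the console clients can be pointed at this JAAS file plus a small properties file selecting SASL/PLAIN. A sketch, assuming a Kafka 1.0.0 install directory and a made-up properties file name (`client-sasl.properties` is not from the original post):

```shell
# Assumed client properties file telling the console clients to use SASL/PLAIN
cat > /opt/kafka/config/client-sasl.properties <<'EOF'
security.protocol=SASL_PLAINTEXT
sasl.mechanism=PLAIN
EOF

# Credentials come from the JAAS file via KAFKA_OPTS, as configured above
export KAFKA_OPTS="-Djava.security.auth.login.config=/opt/kafka/config/kafka_client_jaas.conf"
bin/kafka-console-producer.sh --broker-list 192.168.56.10:8123 \
  --topic nginx-test --producer.config /opt/kafka/config/client-sasl.properties
```

If this producer works but Filebeat does not, the problem is on the Filebeat side rather than the broker.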
------------------------------------------
The filebeat.yml configuration is as follows:
# grep -Ev '^$|#' filebeat.yml 
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /var/log/nginx/access.log
filebeat.config.modules:
  path: ${path.config}/modules.d/*.yml
  reload.enabled: false
setup.template.settings:
  index.number_of_shards: 3
setup.kibana:
output.kafka:
  enabled: true
  hosts: ["192.168.56.10:8123","192.168.56.20:8123","192.168.56.30:8123"]
  version: "1.0.0"
  topic: 'nginx-test'
  user: 'mooc'
  password: 'moocpswd'
  max_message_bytes: 1000000
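
One detail worth checking: as far as I know, Filebeat's kafka output takes its SASL credentials from `username` and `password`, not `user`. If `user` is silently ignored, Filebeat never attempts the SASL handshake and the broker drops the connection, which would match the EOF during metadata fetch below. A sketch of the output section with that key renamed (otherwise identical to the config above):

```yaml
output.kafka:
  enabled: true
  hosts: ["192.168.56.10:8123","192.168.56.20:8123","192.168.56.30:8123"]
  version: "1.0.0"
  topic: 'nginx-test'
  username: 'mooc'     # assumed fix: `username`, not `user`
  password: 'moocpswd'
  max_message_bytes: 1000000
```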
 
-----------------------------------------------
Started in the foreground; it reports the following errors:

2021-11-16T19:07:17.429+0800 INFO pipeline/output.go:95 Connecting to kafka(192.168.56.10:8123,192.168.56.20:8123,192.168.56.30:8123)
2021-11-16T19:07:17.429+0800 INFO kafka/log.go:53 kafka message: Initializing new client
2021-11-16T19:07:17.429+0800 INFO kafka/log.go:53 client/metadata fetching metadata for all topics from broker 192.168.56.10:8123
2021-11-16T19:07:17.431+0800 INFO kafka/log.go:53 Connected to broker at 192.168.56.10:8123 (unregistered)
2021-11-16T19:07:17.432+0800 INFO kafka/log.go:53 kafka message: client/metadata got error from broker while fetching metadata:%!(EXTRA *errors.errorString=EOF)
2021-11-16T19:07:17.433+0800 INFO kafka/log.go:53 Closed connection to broker 192.168.56.10:8123
2021-11-16T19:07:17.433+0800 INFO kafka/log.go:53 client/metadata fetching metadata for all topics from broker 192.168.56.30:8123
2021-11-16T19:07:17.433+0800 INFO kafka/log.go:53 Connected to broker at 192.168.56.30:8123 (unregistered)
...

2021-11-16T19:07:17.959+0800 INFO kafka/log.go:53 kafka message: client/metadata no available broker to send metadata request to
2021-11-16T19:07:17.959+0800 INFO kafka/log.go:53 client/brokers resurrecting 3 dead seed brokers
2021-11-16T19:07:17.959+0800 INFO kafka/log.go:53 client/metadata retrying after 250ms... (1 attempts remaining)
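
An EOF during the first metadata request is typical of a client that is not doing the SASL handshake against a SASL-only listener. A quick way to confirm the broker itself accepts these credentials is kafkacat (librdkafka); a sketch, assuming kafkacat is installed on the Filebeat host:

```shell
# Assumed check: with valid SASL/PLAIN credentials, -L should print cluster metadata
kafkacat -L -b 192.168.56.10:8123 \
  -X security.protocol=sasl_plaintext \
  -X sasl.mechanisms=PLAIN \
  -X sasl.username=mooc \
  -X sasl.password=moocpswd
```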
 

welkin


Kafka and ZooKeeper are both running normally; without the authentication settings, consuming works fine.

kin122


Judging from the config, my setup is almost identical to yours, and Filebeat connects to Kafka without problems. Double-check the details.
