When Logstash consumes from Kafka and the broker it is currently connected to goes down, Logstash does not retry the other Kafka brokers; it simply stops.
Kafka version: 2.10
Logstash version: 6.5.4
Logstash configuration:
kafka {
  bootstrap_servers => ["172.20.213.105:9092,172.20.213.106:9092,172.20.213.107:9092"]
  client_id => "stock-account-api"
  topics => ["stock-account-api"]
  codec => "plain"
  auto_offset_reset => "latest"
  consumer_threads => 1
  decorate_events => true
  type => "stock-account-api"
}
I have no idea what is causing this; any pointers would be appreciated!
3 replies
chencandong:
Try changing it to this: bootstrap_servers => "172.20.213.105:9092,172.20.213.106:9092,172.20.213.107:9092"
In Logstash 6.5.4, the kafka input's bootstrap_servers option is expected to be a string, not an array, although it does seem to run for you as is.
See the docs: https://www.elastic.co/cn/supp ... ugins
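
For reference, a sketch of the input block with the suggested change applied, i.e. bootstrap_servers passed as a single comma-separated string rather than an array. The broker addresses and all other settings are copied from the original post; treat this as a starting point to test, not a verified fix for the failover behavior.

kafka {
  # comma-separated broker list as one string, so the consumer knows
  # about all three brokers and can fall back if one of them goes down
  bootstrap_servers => "172.20.213.105:9092,172.20.213.106:9092,172.20.213.107:9092"
  client_id => "stock-account-api"
  topics => ["stock-account-api"]
  codec => "plain"
  auto_offset_reset => "latest"
  consumer_threads => 1
  decorate_events => true
  type => "stock-account-api"
}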