Background:
With a stdin input in Logstash, typing a log line in by hand works: Logstash receives it and indexes it into Elasticsearch successfully.
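For reference, the working stdin test was presumably run with a pipeline along the following lines (the exact test config is not given in the post; the filter and output would be the same as in the full config quoted further down):

input {
  # Hand-typed test lines come from the console instead of from Filebeat.
  stdin {}
}
# filter { ... }  # same grok/date filter as in the full config below
output {
  elasticsearch {
    hosts => ["192.168.239.131:9200"]
    manage_template => false
    index => "user_info"
  }
}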
Problem:
After switching the Logstash input to beats, however, both Filebeat and Logstash produce errors, as follows.
Filebeat log:
2018-12-08T21:44:11.636+0800 INFO log/harvester.go:254 Harvester started for file: /usr/local/filebeat-6.5.1-linux-x86_64/test-logs/test.log
2018-12-08T21:44:12.639+0800 ERROR logstash/async.go:256 Failed to publish events caused by: write tcp 192.168.239.131:48572->192.168.239.131:5016: write: connection reset by peer
2018-12-08T21:44:13.640+0800 ERROR pipeline/output.go:121 Failed to publish events: write tcp 192.168.239.131:48572->192.168.239.131:5016: write: connection reset by peer
2018-12-08T21:44:13.640+0800 INFO pipeline/output.go:95 Connecting to backoff(async(tcp://192.168.239.131:5016))
2018-12-08T21:44:13.641+0800 INFO pipeline/output.go:105 Connection to backoff(async(tcp://192.168.239.131:5016)) established
Logstash log:
[2018-12-08T21:15:30,100][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"user_info", :_type=>"doc", :_routing=>nil}, #<LogStash::Event:0x4cdfd79b>], :response=>{"index"=>{"_index"=>"user_info", "_type"=>"doc", "_id"=>"6ar1jWcBpEJbzJ4SSiho", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [host] of type [text]", "caused_by"=>{"type"=>"illegal_state_exception", "reason"=>"Can't get text on a START_OBJECT at 1:210"}}}}}
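The mapper_parsing_exception above means the incoming event carries [host] as a JSON object, while the user_info index has host mapped as text (Filebeat 6.3+ typically ships host as an object such as host.name, whereas the stdin test produced a plain host string, which is presumably how the text mapping got created). As an illustration only, not taken from the post, one common workaround is to flatten the field in the Logstash filter before it reaches the elasticsearch output:

filter {
  mutate {
    # Hypothetical workaround: keep the Beats host name as a plain string
    # and drop the host object so it no longer clashes with the text mapping.
    rename => { "[host][name]" => "[hostname]" }
    remove_field => [ "host" ]
  }
}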
The configuration is as follows.
Logstash config:
input {
  beats {
    port => 5016
  }
}
filter {
  grok {
    match => { "message" => "%{UUID:id} %{WORD:user_id} %{DATA:user_name} %{DATA:dept_name} %{INT:state} %{DATA:locate_name} %{TIMESTAMP_ISO8601:datetime}" }
  }
  date {
    match => ["datetime", "yyyy-MM-dd HH:mm:ss"]
  }
}
output {
  elasticsearch {
    hosts => ["192.168.239.131:9200"]
    manage_template => false
    index => "user_info"
  }
}
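As a side note (a debugging suggestion, not part of the original config), a temporary console output next to the elasticsearch output makes it easy to see exactly what the beats input hands to the filter, in particular whether [host] arrives as a string or an object:

output {
  elasticsearch {
    hosts => ["192.168.239.131:9200"]
    manage_template => false
    index => "user_info"
  }
  # Temporary debug output: print each event to the console so the structure
  # of fields such as [host] can be inspected.
  stdout { codec => rubydebug }
}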
Filebeat config:
- type: log
  enabled: true
  paths:
    - /usr/local/filebeat-6.5.1-linux-x86_64/test-logs/test.log
  tags: user_info
  clean_*: true

output.logstash:
  # The Logstash hosts
  hosts: ["192.168.239.131:5016"]