
Logstash sends custom log data to Elasticsearch and fails with: Could not index event to Elasticsearch... Does anyone know what the problem is?

Logstash | Author: sun_changlong | Posted on 2018-07-03 | Views: 27183

Logs are shipped by Filebeat to Logstash; after some simple processing, Logstash forwards the events to Elasticsearch, and the errors below appear along the way.
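The post does not include the pipeline configuration; a minimal sketch of the setup described above might look like the following (the port, grok pattern, and Elasticsearch address are assumptions; the "_grokparsefailure" tag in the failing event suggests a grok filter is present):

input {
  beats {
    port => 5044
  }
}
filter {
  grok {
    # placeholder pattern; the real custom-log pattern is not shown in the post
    match => { "message" => "%{GREEDYDATA:raw}" }
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "logstash-%{+YYYY.MM.dd}"
  }
}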
The error message on the Logstash side is:


[WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"logstash-2018.07.03", :_type=>"doc", :_routing=>nil}, #<LogStash::Event:0x791b1e45>], :response=>{"index"=>{"_index"=>"logstash-2018.07.03", "_type"=>"doc", "_id"=>"LL1XX2QBE8Yn7rwIb5s2", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [host]", "caused_by"=>{"type"=>"illegal_state_exception", "reason"=>"Can't get text on a START_OBJECT at 1:9"}}}}}
 


The error message on the Elasticsearch side is:
 


[DEBUG][o.e.a.b.TransportShardBulkAction] [logstash-2018.07.03][4] failed to execute bulk item (index) BulkShardRequest [[logstash-2018.07.03][4]] containing [index {[logstash-2018.07.03][doc][LL1XX2QBE8Yn7rwIb5s2], source[{"host":{"name":"server2"},"source":"/home/testlog/aads.log","beat":{"name":"server2","hostname":"server2","version":"6.3.0"},"tags":["beats_input_codec_plain_applied","_grokparsefailure"],"input":{"type":"log"},"prospector":{"type":"log"},"offset":688,"message":"d=ngtos version=1.0 time=\"2017-06-05 15:20:25\" dev=\"WAF01.PUB.BEIJING-B\" pri=\"0\" type=\"topwaf\" recorder=\"topwaf\" vsid=\"0\" client_ip=\"202.108.87.119\" sport=\"61939\" server_ip=\"123.125.127.27\" dport=\"80\" protocol=\"http\" server=\"涓叡涓ぎ鐩村睘5\" host=\"zzcg.ccgp.gov.cn\" url=\"/zzcg/lib/images/IndexLeftTitle.gif\" http_args=\"-\" http_method=\"GET\" http_useragent=\"Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 6.1; Trident/4.0; SLCC2; .NET CLR 2.0.50727; .NET CLR 3.5.30729; .NET CLR 3.0.30729; .NET4.0C; .NET4.0E; Tablet PC 2.0; UFAPP 1.0)\" http_status=\"304\" http_referer=\"http://zzcg.ccgp.gov.cn/zzcg/c ... 3.htm\" upstream=\"798\" downstream=\"033\"","@version":"1","@timestamp":"2018-07-03T08:51:45.797Z"}]}]
org.elasticsearch.index.mapper.MapperParsingException: failed to parse [host]
at org.elasticsearch.index.mapper.FieldMapper.parse(FieldMapper.java:302) ~[elasticsearch-6.3.0.jar:6.3.0]
at org.elasticsearch.index.mapper.DocumentParser.parseObjectOrField(DocumentParser.java:481) ~[elasticsearch-6.3.0.jar:6.3.0]
at org.elasticsearch.index.mapper.DocumentParser.parseObject(DocumentParser.java:496) ~[elasticsearch-6.3.0.jar:6.3.0]
at org.elasticsearch.index.mapper.DocumentParser.innerParseObject(DocumentParser.java:390) ~[elasticsearch-6.3.0.jar:6.3.0]
at org.elasticsearch.index.mapper.DocumentParser.parseObjectOrNested(DocumentParser.java:380) ~[elasticsearch-6.3.0.jar:6.3.0]
at org.elasticsearch.index.mapper.DocumentParser.internalParseDocument(DocumentParser.java:95) ~[elasticsearch-6.3.0.jar:6.3.0]
at org.elasticsearch.index.mapper.DocumentParser.parseDocument(DocumentParser.java:69) ~[elasticsearch-6.3.0.jar:6.3.0]
at org.elasticsearch.index.mapper.DocumentMapper.parse(DocumentMapper.java:261) ~[elasticsearch-6.3.0.jar:6.3.0]
at org.elasticsearch.index.shard.IndexShard.prepareIndex(IndexShard.java:700) ~[elasticsearch-6.3.0.jar:6.3.0]
at org.elasticsearch.index.shard.IndexShard.applyIndexOperation(IndexShard.java:677) ~[elasticsearch-6.3.0.jar:6.3.0]
.......


Has anyone run into this or knows what causes it?


 

sun_changlong


The main cause is that the host field contains a nested name field: Filebeat 6.3 sends host as an object (host.name), while the existing logstash-* mapping expects host to be a plain string, so Elasticsearch rejects the document with "Can't get text on a START_OBJECT". A temporary workaround is to add a filter that renames the nested field back to a plain value:

mutate {
  rename => { "[host][name]" => "host" }
}
 
Reference: https://discuss.elastic.co/t/d ... 36768
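If the same pipeline also receives events whose host field is already a plain string (for example from older Beats versions), the rename can be guarded with a conditional. A minimal sketch, not part of the original answer:

filter {
  if [host][name] {
    mutate {
      rename => { "[host][name]" => "host" }
    }
  }
}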
 

zqc0512 - andy zhou


Adjusting the mapping used for indexing should also work.
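The answer does not spell out the change; one reading is to update the index template so that newly created logstash-* indices map host as an object. A minimal sketch against the Elasticsearch 6.x template API (the template name, order value, and field type are assumptions; indices that already exist keep their old mapping and would still need the rename above or a reindex):

PUT _template/logstash-host-as-object
{
  "index_patterns": ["logstash-*"],
  "order": 1,
  "mappings": {
    "doc": {
      "properties": {
        "host": {
          "properties": {
            "name": { "type": "keyword" }
          }
        }
      }
    }
  }
}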

liu61922190


I've run into this problem as well. Has it been resolved?

sailershen


I added the following to the Logstash config file that holds my output section, and it fixed the same problem for me.

filter {
  mutate {
    rename => { "[host][name]" => "host" }
  }
}
