A stumbling block can be a stepping stone.

Elasticsearch is not receiving data from Logstash

Logstash | Author: tokenquestion | Posted on 2017-05-21 | Views: 9877

All ELK components are version 5.4.0. Logstash configuration:
input {
  beats {
    port => 5044
  }
}

output {
  elasticsearch {
    hosts => "localhost:9200"
    index => "logstash-%{+YYYY.MM.dd}"
  }
}
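The `%{+YYYY.MM.dd}` in the `index` setting is a Joda-time date pattern that Logstash expands from each event's `@timestamp` (in UTC), so one index is created per day. A minimal Python sketch of that expansion, using a hypothetical event timestamp:

```python
from datetime import datetime, timezone

# Hypothetical event @timestamp; Logstash applies the date pattern
# to each event's @timestamp in UTC to pick the target index.
event_time = datetime(2017, 5, 21, 3, 0, tzinfo=timezone.utc)

# %{+YYYY.MM.dd} in the Logstash config roughly corresponds to
# strftime("%Y.%m.%d") in Python.
index_name = "logstash-" + event_time.strftime("%Y.%m.%d")
print(index_name)  # logstash-2017.05.21
```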
Elasticsearch configuration:
cluster.name: my-application
node.name: node-1
path.data: E:\elk\elasticsearch-5.4.0\my-application\node-1\data
path.logs: E:\elk\elasticsearch-5.4.0\my-application\node-1\logs
I have confirmed that Logstash is receiving the data sent by Filebeat.
Request: http://localhost:9200/_cat/indices?v
The response is as follows:
health status index   uuid                   pri rep docs.count docs.deleted store.size pri.store.size
yellow open .kibana KPY1ZwRPTaK_K-acB-yPBg 1 1 1 0 3.1kb 3.1kb
The logstash-* index does not appear.
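The `_cat/indices` response is plain text with one row per index, so it is easy to check programmatically whether the expected index exists. A small sketch parsing the exact output pasted above (in practice you would fetch it from `http://localhost:9200/_cat/indices?v`):

```python
# Output of GET /_cat/indices?v as pasted above; the index name
# is the third whitespace-separated column of each data row.
cat_output = """health status index   uuid                   pri rep docs.count docs.deleted store.size pri.store.size
yellow open .kibana KPY1ZwRPTaK_K-acB-yPBg 1 1 1 0 3.1kb 3.1kb"""

indices = [line.split()[2] for line in cat_output.splitlines()[1:] if line.strip()]
has_logstash = any(name.startswith("logstash-") for name in indices)
print(indices)        # ['.kibana']
print(has_logstash)   # False -- the expected daily index is missing
```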
According to the official documentation, linked below:
https://www.elastic.co/guide/e ... setup
Setting Up Logstash
In this setup, the Beat sends events to Logstash. Logstash receives these events by using the Beats input plugin for Logstash and then sends the transaction to Elasticsearch by using the Elasticsearch output plugin for Logstash. The Elasticsearch output plugin uses the bulk API, making indexing very efficient.
Do these two plugins have to be installed separately, or are they bundled by default? I'm a complete beginner and have only just started working with this stack. I'd appreciate any ideas for troubleshooting. Thanks very much!

tokenquestion

Upvoted by: medcl

I tried a few more times today and, oddly enough, could not reproduce the problem.
However, if you have installed the X-Pack plugin, you need to add the following property to elasticsearch.yml:
action.auto_create_index: .security,.monitoring*,.watches,.triggered_watches,.watcher-history*,filebeat*
filebeat* here is the index prefix you defined in your own Logstash configuration file; for my configuration it should be replaced with logstash-*.
This is because the action.auto_create_index property is what grants Logstash permission to auto-create indices matching the listed patterns.
See the following article for reference:
https://enginx.cn/2016/11/08/l ... .html
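The allow-list behavior described above can be sketched with simple glob matching: an index is auto-created only if its name matches one of the configured patterns, which is why `logstash-*` (or `filebeat*`) must be appended. This is a simplified sketch; the real setting also supports `+`/`-` prefixes for explicit allow/deny rules, which are omitted here.

```python
from fnmatch import fnmatch

# The pattern list from the elasticsearch.yml line above.
patterns = [".security", ".monitoring*", ".watches",
            ".triggered_watches", ".watcher-history*", "filebeat*"]

def auto_create_allowed(index: str, patterns: list) -> bool:
    """Return True if the index name matches any allowed pattern."""
    return any(fnmatch(index, p) for p in patterns)

print(auto_create_allowed("filebeat-2017.05.21", patterns))   # True
print(auto_create_allowed("logstash-2017.05.21", patterns))   # False -> add "logstash-*"
```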

medcl - Fighting tigers tonight.

Upvoted by:

Please paste your Filebeat configuration.

wyntergreg

Upvoted by:

Watch your field names: if your data contains fields such as type or index, Elasticsearch will silently drop those documents when Logstash sends the bulk request, with no error message.
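One defensive approach (a sketch of the idea above, not the poster's exact fix; the field names and the `field_` prefix are illustrative) is to rename such colliding fields before events reach the Elasticsearch output:

```python
# Field names that can collide with bulk-request metadata or mappings,
# per the comment above; extend the set as needed.
RISKY_FIELDS = {"type", "index"}

def sanitize(event: dict) -> dict:
    """Rename risky keys with an illustrative 'field_' prefix."""
    return {("field_" + k if k in RISKY_FIELDS else k): v
            for k, v in event.items()}

event = {"message": "hello", "type": "nginx-access", "index": 3}
print(sanitize(event))
# {'message': 'hello', 'field_type': 'nginx-access', 'field_index': 3}
```

In a real pipeline the equivalent renaming would be done with a Logstash mutate filter rather than in Python.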

xiaodog

Upvoted by:

I have the same situation, hoping someone can help! When Filebeat outputs directly to Elasticsearch there is no problem, and the data can be queried and displayed in Kibana. But when Filebeat outputs to Logstash, Logstash does receive the logs (excerpt below), yet nothing can be found in Elasticsearch.
 
[2017-05-24T13:26:18,863][DEBUG][logstash.pipeline        ] filter received {"event"=>{"@timestamp"=>2017-05-24T05:26:19.784Z, "offset"=>8913802, "@version"=>"1", "input_type"=>"log", "beat"=>{"hostname"=>"web1.nnkj.com", "name"=>"web1.nnkj.com", "version"=>"5.2.2"}, "host"=>"web1.nnkj.com", "source"=>"/var/servlogs/nginx/access_app.log", "message"=>"112.126.75.174 - - [24/May/2017:13:26:13 +0800] xxx.com GET \"/\" \"-\" 200 6325 \"-\" 200 10.24.235.19:8381 0.001 0.001 \"Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; Win64; x64; Trident/5.0)\" \"-\"", "type"=>"nginx-access-handler", "tags"=>["beats_input_codec_plain_applied"]}}
[2017-05-24T13:26:18,864][DEBUG][logstash.filters.grok    ] Running grok filter {:event=>2017-05-24T05:26:19.784Z web1.nnkj.com 112.126.75.174 - - [24/May/2017:13:26:13 +0800] xxx.com GET "/" "-" 200 6325 "-" 200 10.24.235.19:8381 0.001 0.001 "Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; Win64; x64; Trident/5.0)" "-"}
[2017-05-24T13:26:18,867][DEBUG][logstash.filters.grok    ] Event now:  {:event=>2017-05-24T05:26:19.784Z web1.nnkj.com 112.126.75.174 - - [24/May/2017:13:26:13 +0800] xxx.com GET "/" "-" 200 6325 "-" 200 10.24.235.19:8381 0.001 0.001 "Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; Win64; x64; Trident/5.0)" "-"}
[2017-05-24T13:26:18,871][DEBUG][logstash.pipeline        ] filter received {"event"=>{"@timestamp"=>2017-05-24T05:26:19.784Z, "offset"=>8913942, "@version"=>"1", "input_type"=>"log", "beat"=>{"hostname"=>"web1.nnkj.com", "name"=>"web1.nnkj.com", "version"=>"5.2.2"}, "host"=>"web1.nnkj.com", "source"=>"/var/servlogs/nginx/access_app.log", "message"=>"106.15.14.60 - - [24/May/2017:13:26:15 +0800] xxx.com GET \"/\" \"-\" 200 6325 \"-\" 200 10.46.64.127:8382 0.001 0.001 \"Chrome/57\" \"-\"", "type"=>"nginx-access-handler", "tags"=>["beats_input_codec_plain_applied"]}}
 
 
