Filebeat errors when using the mysql module

By sailershen | Published October 19, 2018 | Views: 116

Contents of /etc/filebeat/filebeat.yml:
filebeat.config.inputs:
  enabled: true
  path: configs/*.yml

filebeat.config.modules:
  enabled: true
  path: /etc/filebeat/modules.d/*.yml

filebeat.modules:
- module: system
- module: mysql

output.redis:
  enabled: true
  hosts: ["192.168.100.9:6379"]
  port: 6379
  datatype: list
  db: 0
  key: br-sh-db-1

Contents of /etc/filebeat/configs/mysql.yml:
- module: mysql
  error:
    enabled: true
    paths:
      - /var/log/mysqld.log

After starting Filebeat, /var/log/filebeat/filebeat contains the following log entries:
INFO    instance/beat.go:321    filebeat stopped.
ERROR instance/beat.go:691 Exiting: No paths were defined for input accessing '0' (source:'/etc/filebeat/configs/mysql.yml')
Filebeat fails to start. The yml file is written the way the official documentation shows, so I am not sure where the problem is.
https://www.elastic.co/guide/e ... .html

rochy - rochy_he@jointsky


Try the following configuration:
- module: mysql
  error:
    enabled: true
    var.paths:
      - /var/log/mysqld.log

The official sample configuration:
- module: mysql
  error:
    enabled: true
    var.paths: ["/path/to/log/mysql/error.log*"]
  slowlog:
    enabled: true
    var.paths: ["/path/to/log/mysql/mysql-slow.log*"]
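The key difference from the original config is `var.paths` instead of `paths`: fileset settings are passed as `var.`-prefixed variables, so a bare `paths` key under `error:` is not a recognized option and the fileset ends up with no paths at all, which matches the "No paths were defined" error. Roughly speaking, the module layer hands `var.paths` down to the log input it generates, something like this (an illustrative sketch, not the module's literal internals):

```yaml
# Roughly what the mysql "error" fileset expands into (sketch only)
- type: log
  paths:
    - /var/log/mysqld.log   # supplied via var.paths
```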

sailershen


Thanks for the reply. With the following configuration, Filebeat starts without errors.
The /etc/filebeat/filebeat.yml configuration is unchanged.

Contents of mysqldlog.yml:
filebeat.inputs:
- module: mysql
  error:
    enabled: true
    var.paths: ["/var/log/mysqld.log"]
  slowlog:
    enabled: true
    var.paths: ["/var/log/mysql_slow.log"]

But now there is a new problem: the mysql logs above do not show up in Kibana.

Contents of /etc/logstash/conf.d/mysql-log-input.conf on the ELK server:
input {
  redis {
    host => "127.0.0.1"
    port => 6379
    data_type => "list"
    key => "br-sh-db-1"
    type => "redis-input"
    db => 0
  }
}

input {
  redis {
    host => "127.0.0.1"
    port => 6379
    data_type => "list"
    key => "br-sh-db-2"
    type => "redis-input"
    db => 0
  }
}

Contents of /etc/logstash/conf.d/elasticsearch_output.conf:
output {
  elasticsearch {
    hosts => ["127.0.0.1:9200"]
    action => "index"
    codec => line
    index => "br-logs"
  }
}

On the ELK server, entries like the following appear repeatedly in /var/log/logstash/logstash-plain.log:
[2018-10-20T09:35:50,450][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"br-logs", :_type=>"doc", :_routing=>nil}, #<LogStash::Event:0x31f1d2d1>], :response=>{"index"=>{"_index"=>"br-logs", "_type"=>"doc", "_id"=>"tNgdj2YBWv4odVdmPuyv", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [host]", "caused_by"=>{"type"=>"illegal_state_exception", "reason"=>"Can't get text on a START_OBJECT at 1:9"}}}}}}
[2018-10-20T09:35:51,454][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"br-logs", :_type=>"doc", :_routing=>nil}, #<LogStash::Event:0x7f8beac6>], :response=>{"index"=>{"_index"=>"br-logs", "_type"=>"doc", "_id"=>"uNgdj2YBWv4odVdmQuyZ", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [host]", "caused_by"=>{"type"=>"illegal_state_exception", "reason"=>"Can't get text on a START_OBJECT at 1:9"}}}}}

It feels like a misconfiguration somewhere in Logstash is keeping the mysql logs out of Kibana.
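For what it's worth, the error text points at a field-type conflict rather than a broken pipeline: Filebeat 6.3 changed `host` from a plain string to an object (with `host.name` and other sub-fields), while the existing `br-logs` mapping expects `host` to be text, so Elasticsearch rejects the object ("Can't get text on a START_OBJECT"). Schematically (the hostname below is made up):

```
Event from Filebeat 6.3+:        "host": {"name": "db-host-1", ...}
What the br-logs mapping wants:  "host": "db-host-1"
```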

sailershen


I found the answer in the community. I added a filter to the Logstash configuration file:
filter {
  mutate {
    rename => { "[host][name]" => "host" }
  }
}

output {
  elasticsearch {
    hosts => ["127.0.0.1:9200"]
    action => "index"
    codec => line
    index => "br-logs"
  }
}

That solved the problem.

I also referred to the official documentation:
https://www.elastic.co/guide/e ... .html
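For anyone wondering what the rename actually does to each event, here is a minimal Python sketch of the transformation (illustrative only; the function name and sample event are made up, and this is not Logstash internals):

```python
# Sketch of what mutate { rename => { "[host][name]" => "host" } } does
# to a single event: the host object from Filebeat 6.3+ is replaced by
# its name string, so the field matches the old text mapping again.
def flatten_host(event):
    """Replace a Filebeat 6.3+ host object with its name string."""
    host = event.get("host")
    if isinstance(host, dict) and "name" in host:
        event["host"] = host["name"]
    return event

event = {"host": {"name": "db-host-1"}, "message": "mysql error line"}
flatten_host(event)
print(event["host"])  # db-host-1
```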
