
RabbitMQ data sent to ES via Logstash: the message field does not show up

Logstash | Author: logstashHard | Posted 2017-07-26 | Views: 6437

The data in RabbitMQ looks like this:
{"host":{"serviceName":"streamService2","address":"192.168.127.1","port":9983},"spans":[{"begin":1500877798555,"end":1500877798821,"name":"http:/service1","traceId":777698211216806359,"parents":[777698211216806359],"spanId":4799424895103297227,"remote":true,"exportable":true,"logs":[{"timestamp":1500877798557,"event":"sr"},{"timestamp":1500877798821,"event":"ss"}]},{"begin":1500877798556,"end":1500877798821,"name":"http:/service1","traceId":777698211216806359,"parents":[4799424895103297227],"spanId":-3738027106310948226,"exportable":true,"tags":{"http.url":"http://localhost:9983/service1","http.host":"localhost","http.path":"/service1","http.method":"GET"}}]}
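For context, this is a Spring Cloud Sleuth span payload. A minimal Python sketch (illustrative only, using an abbreviated copy of the payload above) shows how the nested structure parses:

```python
import json

# Abbreviated copy of the Sleuth payload shown above.
payload = ('{"host":{"serviceName":"streamService2","address":"192.168.127.1",'
           '"port":9983},"spans":[{"begin":1500877798555,"end":1500877798821,'
           '"name":"http:/service1"},{"begin":1500877798556,'
           '"end":1500877798821,"name":"http:/service1"}]}')

doc = json.loads(payload)
print(doc["host"]["serviceName"])
for span in doc["spans"]:
    # Span duration in milliseconds: end - begin.
    print(span["name"], span["end"] - span["begin"])
```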


 
 
The Logstash config is as follows:
input {
  rabbitmq {
    host => "10.129.41.82"
    subscription_retry_interval_seconds => "5"
    vhost => "/"
    exchange => "sleuth"
    queue => "sleuth.sleuth"
    durable => "true"
    key => "#"
    user => "test"
    password => "test"
  }
}
filter {
  json {
    source => "message"
    target => "mqdata"
  }
}

output {
  # stdout {
  #   codec => rubydebug
  # }

  elasticsearch {
    hosts => "10.129.39.154"
    index => "logstash-mq-%{+YYYY.MM.dd}"
    document_type => "mq"
    workers => 10
    template_overwrite => true
  }
}
The result displayed in ES is shown in the attached screenshots (01.jpg, 02.jpg).

logstashHard


Later I pulled the data out of MQ directly and put it into a file, then had Logstash read the file and send it to ES. That worked, as shown below.
Logstash config file:
input {
  file {
    type => "boot"
    path => "/home/elk/logstash-2.4.0/logtest/mq01.log"
  }
}
filter {
  json {
    source => "message"
    target => "mq"
  }
}

output {
  stdout {
    codec => rubydebug
  }

  elasticsearch {
    hosts => ["192.168.10.14:9200"]
    index => "logstash-%{type}-%{+YYYY.MM.dd}"
    document_type => "%{type}"
    workers => 10
    template_overwrite => true
  }
}
The result displayed in ES is as follows:
 

logstashHard


I still can't figure out why the data sent over from this MQ has no message field.
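One likely explanation (an assumption based on the Logstash plugin documentation, not stated in this thread): the rabbitmq input's default codec is json, so each payload is decoded straight into top-level event fields and no message field is ever created; the file input used earlier defaults to a plain line codec, which puts the raw text into message. A toy Python sketch of the difference:

```python
import json

raw = '{"host": {"serviceName": "streamService2"}, "spans": []}'

# file input (plain/line codec): the raw text lands in "message".
event_from_file = {"message": raw}

# rabbitmq input (json codec by default): the payload is decoded
# directly into top-level fields, so there is no "message" field.
event_from_mq = json.loads(raw)

print("message" in event_from_file)
print("message" in event_from_mq)
```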

logstashHard


I later modified the Logstash config file:
input {
  rabbitmq {
    host => "10.129.41.82"
    subscription_retry_interval_seconds => "5"
    vhost => "/"
    exchange => "sleuth"
    queue => "sleuth.sleuth"
    durable => "true"
    key => "#"
    user => "test"
    password => "test"
  }
}
filter {
}

output {
  # stdout {
  #   codec => rubydebug
  # }

  elasticsearch {
    hosts => "10.129.39.154"
    index => "logstash-mq-%{+YYYY.MM.dd}"
    document_type => "mq"
    workers => 10
    template_overwrite => true
  }
}
But it made no difference; the data still arrives in its original format:
{
  "host": {
    "address": "192.168.127.1",
    "port": 9983,
    "serviceName": "streamService2"
  },
  "spans": [
    {
      "begin": 1500877798555,
      "end": 1500877798821,
      "exportable": true,
      "logs": [
        { "event": "sr", "timestamp": 1500877798557 },
        { "event": "ss", "timestamp": 1500877798821 }
      ],
      "name": "http:/service1",
      "parents": [777698211216806359],
      "remote": true,
      "spanId": 4799424895103297227,
      "traceId": 777698211216806359
    },
    {
      "begin": 1500877798556,
      "end": 1500877798821,
      "exportable": true,
      "name": "http:/service1",
      "parents": [4799424895103297227],
      "spanId": -3738027106310948226,
      "tags": {
        "http.host": "localhost",
        "http.method": "GET",
        "http.path": "/service1",
        "http.url": "http://localhost:9983/service1"
      },
      "traceId": 777698211216806359
    }
  ]
}

logstashHard


The MQ payload contains a host field, and I don't want it to overwrite the event's own host. What I want is to put the entire MQ payload into message, so that I can deserialize it as JSON myself and still do grouping and aggregation in ES later on.
But now that the data format has changed, I have no idea how to approach the downstream processing.
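One possible way to get that behavior (a sketch only, not verified against this setup, and assuming the rabbitmq input's default json codec is what removes the message field): force the plain codec so the raw payload stays in message, then parse it into a separate target so the payload's host cannot clobber the event's own host:

```
input {
  rabbitmq {
    host  => "10.129.41.82"
    queue => "sleuth.sleuth"
    # Keep the raw body as text in "message" instead of letting
    # the default json codec decode it into top-level fields.
    codec => "plain"
  }
}
filter {
  json {
    source => "message"
    # Parse into a separate field so the payload's "host"
    # does not overwrite the event's own "host".
    target => "mqdata"
  }
}
```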

medcl


Is each message in your MQ a single JSON object?
By default you shouldn't even need the json filter. Try adding this to the RabbitMQ input:
codec => "json"
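Applied to the original input block, that suggestion would look like this (sketch):

```
input {
  rabbitmq {
    host  => "10.129.41.82"
    queue => "sleuth.sleuth"
    # Decode each message body as a JSON object into event fields.
    codec => "json"
  }
}
```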
 
