
Logstash and Elasticsearch integration question

Logstash | Author: lyk | Posted on 2018-08-07 | Views: 1640

My mapping is designed like this:

"mappings": {
"doc":{
"dynamic":false,
"properties": {
"time":{
"type": "date"
},
"message_log":{
"type": "text",
"fields": {
"keyword":{
"type": "keyword"
}
}
},
"alarm_level":{
"type": "keyword"
},
"alarmlevel":{
"type": "integer"
},
"log_type":{
"type": "keyword"
},
"log_topic":{
"type": "keyword"
},
"source":{
"type": "text",
"fields": {
"keyword":{
"type": "keyword"
}
}
},
"host":{
"type": "ip"
}
}
}
}
How can I drop the extra fields when Logstash writes data into ES? The document that actually gets indexed looks like this:


"log_time": "May 21 10:57:44",
"port": 6677,
"offset": 5100,
"host": "134.96.252.89",
"@version": "1",
"log_topic": "harbor_container_test",
"@timestamp": "2018-08-07T07:15:34.552Z",
"alarm_level": "ERROR",
"message": [
"May",
"21",
"10:57:44",
"172.18.0.1",
"jobservice[1226]:",
"2018-05-21T02:57:44Z",
"[ERROR]",
"[utils.go:98]:",
"failed",
"to",
"connect",
"to",
"tcp://adminserver:8080,",
"retry",
"after",
"2",
"seconds",
":dial",
"tcp",
"172.18.0.3:8080:",
"getsockopt:",
"connection",
"refused"
],
"time": "2018-05-21T10:57:44.000Z",
"message_log": "May 21 10:57:44 172.18.0.1 jobservice[1226]: 2018-05-21T02:57:44Z [ERROR] [utils.go:98]: failed to connect to tcp://adminserver:8080, retry after 2 seconds :dial tcp 172.18.0.3:8080: getsockopt: connection refused",
"prospector": {
"type": "log"
},
"log_type": "harbor_container",
"component_name": "jobservice[1226]:",
"source": "/apps/apps/app/filebeat-6.1.3-linux-x86_64/134.108.1.101_6606",
"alarmlevel": "1",
"beat": {
"name": "ctgegh04",
"hostname": "ctgegh04",
"version": "6.1.3"
}
}

For example, how should I configure logstash-output-elasticsearch to drop the extra fields shown above? I haven't been able to find a demo. Any help would be appreciated.

zqc0512 - andy zhou


Add a filter in your Logstash pipeline and use remove_field.
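A minimal sketch of such a filter, assuming the fields to drop are the ones in the sample document that are not declared in the mapping (adjust the list to your own data; @timestamp is left alone because the elasticsearch output typically uses it for time-based index names):

filter {
  mutate {
    # Drop fields that are not declared in the index mapping.
    remove_field => [
      "log_time", "port", "offset", "@version",
      "message", "prospector", "component_name", "beat"
    ]
  }
}

If the set of unwanted fields keeps changing, the prune filter's whitelist_names option can instead keep only the fields you want (it requires the logstash-filter-prune plugin).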
 
