
Filebeat log shows a time conversion failure

Beats | Author: lucasyu | Posted on 2020-10-23 | Views: 6694

I use Filebeat to collect logs and send them straight to ES; there are no other components in between.
Both Filebeat and ES are version 7.6.1, and both run in Docker.
The problem: when a log line's JSON contains a time in the format 2020-10-22 09:18:33.940, it cannot be converted, and an error is written to the logs.
The log entry then does not show up in Kibana.

The indices are all created by Filebeat with its defaults.

How should I fix this? Should I change Filebeat or ES?

Here is my Filebeat config:
 
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /usr/share/filebeat/logs/*.log
  json.keys_under_root: true
  json.overwrite_keys: true
  json.message_key: log

setup.ilm.enabled: false
setup.template.enabled: false
setup.template.name: "indexname-*"
setup.template.pattern: "indexname-*"

output.elasticsearch: .......
This is what appears in Filebeat's log:

2020-10-22T01:18:35.257Z WARN elasticsearch/client.go:517 Cannot index event publisher.Event{Content:beat.Event{Timestamp:time.Time{wall:0xbfdc555a85734d7b, ext:860191013075732, loc:(*time.Location)(0x50026a0)}, Meta:null, Fields:{ "agent":{"ephemeral_id":"dfc370f3-ade6-4d6c-b683-44ab96619b8b","hostname":"0894df2db561","id":"c792b8c0-6f63-4437-a382-5085132cf894","type":"filebeat","version":"7.6.1"}, "ecs":{"version":"1.4.0"}, "host":{"name":"0894df2db561"}, "input":{"type":"log"}, "json":{"@timestamp":"2020-10-22 09:18:33.940","class":"************","level":"INFO","log":"","logger_name":"***********","message":"************","sessionId":"***********","stackTrace":""}, "log":{"file":{"path":"/usr/share/filebeat/logs/***********.log"},"offset":36492124} }, Private:file.State{Id:"", Finished:false, Fileinfo:(*os.fileStat)(0xc00093b1e0), Source:"/usr/share/filebeat/logs/***********..log", Offset:36492428, Timestamp:time.Time{wall:0xbfdc55320b45949b, ext:860029115262547, loc:(*time.Location)(0x50026a0)}, TTL:-1, Type:"log", Meta:map[string]string(nil), FileStateOS:file.StateOS{Inode:0x4de18c8, Device:0x57}}, TimeSeries:false}, Flags:0x1, Cache:publisher.EventCache{m:common.MapStr(nil)}} (status=400): {"type":"mapper_parsing_exception","reason":"failed to parse field [json.@timestamp] of type [date] in document with id 'LUDiTXUBsI3miD783rnv'. Preview of field's value: '2020-10-22 09:18:33.940'","caused_by":{"type":"illegal_argument_exception","reason":"failed to parse date field [2020-10-22 09:18:33.940] with format [strict_date_optional_time||epoch_millis]","caused_by":{"type":"date_time_parse_exception","reason":"Failed to parse with all enclosed parsers"}}}
 
And this is the corresponding log in ES:
 
{"type": "server", "timestamp": "2020-10-22T01:18:35,254Z", "level": "DEBUG", "component": "o.e.a.b.TransportShardBulkAction", "cluster.name": "docker-cluster", "node.name": "ee837b9fa50e", "message": "[index-2020.10.22][0] failed to execute bulk item (create) index {[index-2020.10.22][_doc][LUDiTXUBsI3miD783rnv], source[{\"@timestamp\":\"2020-10-22T01:18:34.091Z\",\"ecs\":{\"version\":\"1.4.0\"},\"host\":{\"name\":\"0894df2db561\"},\"agent\":{\"hostname\":\"0894df2db561\",\"id\":\"c792b8c0-6f63-4437-a382-5085132cf894\",\"version\":\"7.6.1\",\"type\":\"filebeat\",\"ephemeral_id\":\"dfc370f3-ade6-4d6c-b683-44ab96619b8b\"},\"log\":{\"offset\":36492124,\"file\":{\"path\":\"/usr/share/filebeat/logs/****.log\"}},\"json\":{\"sessionId\":\"**\",\"log\":\"\",\"@timestamp\":\"2020-10-22 09:18:33.940\",\"level\":\"INFO\",\"class\":\"***\",\"message\":\"*****\",\"logger_name\":\"***\",\"stackTrace\":\"\"},\"input\":{\"type\":\"log\"}}]}", "cluster.uuid": "_kKr5lLWQuWT37ezytSIAw", "node.id": "Zvlo5K8_Rc-uHNCdaHMItg" , "stacktrace": ["org.elasticsearch.index.mapper.MapperParsingException: failed to parse field [json.@timestamp] of type [date] in document with id 'LUDiTXUBsI3miD783rnv'. Preview of field's value: '2020-10-22 09:18:33.940'", "at org.elasticsearch.index.mapper.FieldMapper.parse(FieldMapper.java:306) ~[elasticsearch-7.6.1.jar:7.6.1]", "at org.elasticsearch.index.mapper.DocumentParser.parseObjectOrField(DocumentParser.java:488) ~[elasticsearch-7.6.1.jar:7.6.1]", "at org.elasticsearch.index.mapper.DocumentParser.parseValue(DocumentParser.java:614) ~[elasticsearch-7.6.1.jar:7.6.1]", "at org.elasticsearch.index.mapper.DocumentParser.innerParseObject(DocumentParser.java:427) ~[elasticsearch-7.6.1.jar:7.6.1]", "at org.elasticsearch.index.mapper.DocumentParser.parseObjectOrNested(DocumentParser.java:395) ~[elasticsearch-7.6.1.jar:7.6.1]", "at org.elasticsearch.index.mapper.DocumentParser.parseObjectOrField(DocumentParser.java:485) ~[elasticsearch-7.6.1.jar:7.6.1]", "at org.elasticsearch.index.mapper.DocumentParser.parseObject(DocumentParser.java:505) ~[elasticsearch-7.6.1.jar:7.6.1]", "at org.elasticsearch.index.mapper.DocumentParser.innerParseObject(DocumentParser.java:418) ~[elasticsearch-7.6.1.jar:7.6.1]", "at org.elasticsearch.index.mapper.DocumentParser.parseObjectOrNested(DocumentParser.java:395) ~[elasticsearch-7.6.1.jar:7.6.1]", "at org.elasticsearch.index.mapper.DocumentParser.internalParseDocument(DocumentParser.java:112) ~[elasticsearch-7.6.1.jar:7.6.1]", "at org.elasticsearch.index.mapper.DocumentParser.parseDocument(DocumentParser.java:71) ~[elasticsearch-7.6.1.jar:7.6.1]", "at org.elasticsearch.index.mapper.DocumentMapper.parse(DocumentMapper.java:267) ~[elasticsearch-7.6.1.jar:7.6.1]", "at org.elasticsearch.index.shard.IndexShard.prepareIndex(IndexShard.java:793) ~[elasticsearch-7.6.1.jar:7.6.1]", "at org.elasticsearch.index.shard.IndexShard.applyIndexOperation(IndexShard.java:770) ~[elasticsearch-7.6.1.jar:7.6.1]", "at org.elasticsearch.index.shard.IndexShard.applyIndexOperationOnPrimary(IndexShard.java:742) ~[elasticsearch-7.6.1.jar:7.6.1]", "at org.elasticsearch.action.bulk.TransportShardBulkAction.executeBulkItemRequest(TransportShardBulkAction.java:267) [elasticsearch-7.6.1.jar:7.6.1]", "at org.elasticsearch.action.bulk.TransportShardBulkAction$2.doRun(TransportShardBulkAction.java:157) [elasticsearch-7.6.1.jar:7.6.1]", "at org.elasticsearch.common.util.concurrent.AbstractRunnable.run(AbstractRunnable.java:37) [elasticsearch-7.6.1.jar:7.6.1]", "at 
org.elasticsearch.action.bulk.TransportShardBulkAction.performOnPrimary(TransportShardBulkAction.java:202) [elasticsearch-7.6.1.jar:7.6.1]", "at org.elasticsearch.action.bulk.TransportShardBulkAction.shardOperationOnPrimary(TransportShardBulkAction.java:114) [elasticsearch-7.6.1.jar:7.6.1]", "at org.elasticsearch.action.bulk.TransportShardBulkAction.shardOperationOnPrimary(TransportShardBulkAction.java:81) [elasticsearch-7.6.1.jar:7.6.1]", "at org.elasticsearch.action.support.replication.TransportReplicationAction$PrimaryShardReference.perform(TransportReplicationAction.java:895) [elasticsearch-7.6.1.jar:7.6.1]", "at org.elasticsearch.action.support.replication.ReplicationOperation.execute(ReplicationOperation.java:109) [elasticsearch-7.6.1.jar:7.6.1]", "at org.elasticsearch.action.support.replication.TransportReplicationAction$AsyncPrimaryAction.runWithPrimaryShardReference(TransportReplicationAction.java:374) [elasticsearch-7.6.1.jar:7.6.1]", "at org.elasticsearch.action.support.replication.TransportReplicationAction$AsyncPrimaryAction.lambda$doRun$0(TransportReplicationAction.java:297) [elasticsearch-7.6.1.jar:7.6.1]", "at org.elasticsearch.action.ActionListener$1.onResponse(ActionListener.java:63) [elasticsearch-7.6.1.jar:7.6.1]", "at org.elasticsearch.index.shard.IndexShard.lambda$wrapPrimaryOperationPermitListener$24(IndexShard.java:2791) [elasticsearch-7.6.1.jar:7.6.1]", "at org.elasticsearch.action.ActionListener$3.onResponse(ActionListener.java:113) [elasticsearch-7.6.1.jar:7.6.1]", "at org.elasticsearch.index.shard.IndexShardOperationPermits.acquire(IndexShardOperationPermits.java:285) [elasticsearch-7.6.1.jar:7.6.1]", "at org.elasticsearch.index.shard.IndexShardOperationPermits.acquire(IndexShardOperationPermits.java:237) [elasticsearch-7.6.1.jar:7.6.1]", "at org.elasticsearch.index.shard.IndexShard.acquirePrimaryOperationPermit(IndexShard.java:2765) [elasticsearch-7.6.1.jar:7.6.1]", "at org.elasticsearch.action.support.replication.TransportReplicationAction.acquirePrimaryOperationPermit(TransportReplicationAction.java:836) [elasticsearch-7.6.1.jar:7.6.1]", "at org.elasticsearch.action.support.replication.TransportReplicationAction$AsyncPrimaryAction.doRun(TransportReplicationAction.java:293) [elasticsearch-7.6.1.jar:7.6.1]", "at org.elasticsearch.common.util.concurrent.AbstractRunnable.run(AbstractRunnable.java:37) [elasticsearch-7.6.1.jar:7.6.1]", "at org.elasticsearch.action.support.replication.TransportReplicationAction.handlePrimaryRequest(TransportReplicationAction.java:256) [elasticsearch-7.6.1.jar:7.6.1]", "at org.elasticsearch.transport.RequestHandlerRegistry.processMessageReceived(RequestHandlerRegistry.java:63) [elasticsearch-7.6.1.jar:7.6.1]", "at org.elasticsearch.transport.TransportService$7.doRun(TransportService.java:750) [elasticsearch-7.6.1.jar:7.6.1]", "at org.elasticsearch.common.util.concurrent.ThreadContext$ContextPreservingAbstractRunnable.doRun(ThreadContext.java:692) [elasticsearch-7.6.1.jar:7.6.1]", "at org.elasticsearch.common.util.concurrent.AbstractRunnable.run(AbstractRunnable.java:37) [elasticsearch-7.6.1.jar:7.6.1]", "at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?]", "at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?]", "at java.lang.Thread.run(Thread.java:830) [?:?]", "Caused by: java.lang.IllegalArgumentException: failed to parse date field [2020-10-22 09:18:33.940] with format [strict_date_optional_time||epoch_millis]", "at 
org.elasticsearch.common.time.JavaDateFormatter.parse(JavaDateFormatter.java:169) ~[elasticsearch-7.6.1.jar:7.6.1]", "at org.elasticsearch.index.mapper.DateFieldMapper$DateFieldType.parse(DateFieldMapper.java:356) ~[elasticsearch-7.6.1.jar:7.6.1]", "at org.elasticsearch.index.mapper.DateFieldMapper.parseCreateField(DateFieldMapper.java:584) ~[elasticsearch-7.6.1.jar:7.6.1]", "at org.elasticsearch.index.mapper.FieldMapper.parse(FieldMapper.java:284) ~[elasticsearch-7.6.1.jar:7.6.1]", "... 41 more", "Caused by: java.time.format.DateTimeParseException: Failed to parse with all enclosed parsers", "at org.elasticsearch.common.time.JavaDateFormatter.doParse(JavaDateFormatter.java:196) ~[elasticsearch-7.6.1.jar:7.6.1]", "at org.elasticsearch.common.time.JavaDateFormatter.parse(JavaDateFormatter.java:167) ~[elasticsearch-7.6.1.jar:7.6.1]", "at org.elasticsearch.index.mapper.DateFieldMapper$DateFieldType.parse(DateFieldMapper.java:356) ~[elasticsearch-7.6.1.jar:7.6.1]", "at org.elasticsearch.index.mapper.DateFieldMapper.parseCreateField(DateFieldMapper.java:584) ~[elasticsearch-7.6.1.jar:7.6.1]", "at org.elasticsearch.index.mapper.FieldMapper.parse(FieldMapper.java:284) ~[elasticsearch-7.6.1.jar:7.6.1]", "... 41 more"] }
 
 
Abbreviated version of the error from the Filebeat log:
{
    "type": "mapper_parsing_exception",
    "reason": "failed to parse field [json.@timestamp] of type [date] in document with id 'LUDiTXUBsI3miD783rnv'. Preview of field's value: '2020-10-22 09:18:33.940'",
    "caused_by": {
        "type": "illegal_argument_exception",
        "reason": "failed to parse date field [2020-10-22 09:18:33.940] with format [strict_date_optional_time||epoch_millis]",
        "caused_by": {
            "type": "date_time_parse_exception",
            "reason": "Failed to parse with all enclosed parsers"
        }
    }
}
 

pineapple


The Filebeat log you pasted at the end already tells you why the parsing fails:


{
    "type": "mapper_parsing_exception",
    "reason": "failed to parse field [json.@timestamp] of type [date] in document with id 'LUDiTXUBsI3miD783rnv'. Preview of field's value: '2020-10-22 09:18:33.940'",
    "caused_by": {
        "type": "illegal_argument_exception",
        "reason": "failed to parse date field [2020-10-22 09:18:33.940] with format [strict_date_optional_time||epoch_millis]",
        "caused_by": {
            "type": "date_time_parse_exception",
            "reason": "Failed to parse with all enclosed parsers"
        }
    }
}


The official docs give the timestamp formats that strict_date_optional_time||epoch_millis can parse:


date_optional_time or strict_date_optional_time
A generic ISO datetime parser, where the date must include the year at a minimum, and the time (separated by T), is optional. Examples: yyyy-MM-dd'T'HH:mm:ss.SSSZ or yyyy-MM-dd.
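
In other words, the ISO parser expects a 'T' between the date and the time, which is exactly what the failing value lacks. For comparison (both values taken from the logs above, shown here purely as an illustration):

"json.@timestamp": "2020-10-22T09:18:33.940"    <-- parses (ISO 8601, 'T' separator)
"json.@timestamp": "2020-10-22 09:18:33.940"    <-- rejected (space instead of 'T', as the application writes it)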



Solution:
  •  In ES, find the indexname-* index and modify its mapping, changing the format of the json.@timestamp field to 'yyyy-MM-dd HH:mm:ss.SSS' (see the sketch below).
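
A minimal sketch of how that format could be put in place, assuming it is applied through a legacy index template (the template name indexname is a placeholder) so that newly created indexname-* indices map json.@timestamp with the extra pattern; an index that already exists generally has to be recreated or reindexed for the change to take effect:

PUT _template/indexname
{
    "index_patterns": ["indexname-*"],
    "mappings": {
        "properties": {
            "json": {
                "properties": {
                    "@timestamp": {
                        "type": "date",
                        "format": "yyyy-MM-dd HH:mm:ss.SSS||strict_date_optional_time||epoch_millis"
                    }
                }
            }
        }
    }
}

Keeping strict_date_optional_time||epoch_millis after the custom pattern preserves the default behaviour for any documents whose timestamp is already in ISO format.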

hz_zqc


This error is raised by ES and returned to Filebeat; the cause is exactly what the previous answer says.
