
filebeat offset lost during collection, causing it to be reset to 0

Beats | by juin | published 2020-04-24 | views: 5558

filebeat 6.5.4
 
While collecting logs from over a hundred files, we found that some data was being collected more than once. Enabling debug logging on the affected host turned up the following entries:
 
2020-04-24T06:24:23.688+0800	DEBUG	[input]	log/input.go:494	Update existing file for harvesting: /log5/funshout.txt, offset: 426533
2020-04-24T06:24:23.688+0800 DEBUG [input] log/input.go:513 Old file was truncated. Starting from the beginning: /log5/funshout.txt, offset: 426485, new size: %!d(MISSING)
2020-04-24T06:24:23.688+0800 DEBUG [harvester] log/harvester.go:489 Setting offset for file based on seek: /log5/funshout.txt
2020-04-24T06:24:23.688+0800 DEBUG [harvester] log/harvester.go:475 Setting offset for file: /log5/funshout.txt. Offset: 0
2020-04-24T06:24:23.688+0800 DEBUG [harvester] log/harvester.go:390 Update state: /log5/funshout.txt, offset: 0

Has anyone run into this before, or have any ideas you could share?

juin - Big Data Developer


The key line is:
Old file was truncated. Starting from the beginning: /log5/funshout.txt, offset: 426485, new size: %!d(MISSING)
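What that line means: filebeat's log input compares the file's current size on disk against the offset it saved in its registry. If the size is smaller than the saved offset, it assumes the file was truncated and restarts the harvester from offset 0, which re-reads (and therefore duplicates) everything still in the file. A common trigger is `copytruncate`-style log rotation, where the file is copied and then truncated in place while filebeat still holds the old offset. A minimal sketch of that decision (a simplification for illustration, not filebeat's actual code; `harvestOffset` is a made-up name):

```go
package main

import "fmt"

// harvestOffset sketches the choice filebeat's log input makes when it
// picks up a known file again: if the file on disk is now smaller than
// the offset recorded in the registry, the file is assumed to have been
// truncated and reading restarts from the beginning.
func harvestOffset(registryOffset, currentSize int64) int64 {
	if currentSize < registryOffset {
		// "Old file was truncated. Starting from the beginning"
		return 0
	}
	// Otherwise resume from the saved offset.
	return registryOffset
}

func main() {
	// Normal resume: the file has grown past the saved offset.
	fmt.Println(harvestOffset(426485, 500000)) // 426485

	// File is smaller than the saved offset (truncation, or a
	// copytruncate rotation): reading restarts at 0, so the lines
	// still in the file are shipped again as duplicates.
	fmt.Println(harvestOffset(426485, 1024)) // 0
}
```

If this matches your setup, it is worth checking whether the rotation scheme truncates files in place (e.g. logrotate's `copytruncate`); rotating by rename/create instead, or making sure the registry file is persisted and not being reset, avoids tripping this code path.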
