I'm using the gelf input to collect logs and output them to Kafka. After running for a while, Logstash dies and ERROR-level messages appear in its log. I analyzed the hprof dump and there was still plenty of free memory: of the 128M heap, only 40M was used, leaving about 80M. The errors appear in this order (a minimal pipeline sketch follows the list):
(1) First, three occurrences of: logstash.inputs.gelf JSON parse failure. Falling back to plain-text 'Exception: Failed to decode data: Java heap space'
(2) Immediately followed by: ArrayIndexOutOfBoundsException: Index -1 out of bounds for length 20
(3) Then many of: Exception in pipelineworker, the pipeline stopped processing new events, please check your filter configuration and restart Logstash. IllegalStateException: cannot perform operation after producer has been closed.
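For context, a minimal pipeline of the kind described might look like the sketch below. The port, broker address, and topic name are placeholders for illustration only, not the poster's actual configuration:

    input {
      gelf {
        port => 12201                          # default GELF UDP port; placeholder
      }
    }
    output {
      kafka {
        bootstrap_servers => "kafka1:9092"     # placeholder broker list
        topic_id => "logs"                     # placeholder topic name
      }
    }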
1 Reply
locatelli
Also, a 128M heap is too small; increasing the heap should at least alleviate the problem.
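If you want to try that, the Logstash JVM heap is set in config/jvm.options. A sketch, with example sizes only (pick values that fit your host's memory):

    # config/jvm.options — raise the heap above the current 128M
    -Xms1g
    -Xmx1g

Keeping -Xms and -Xmx at the same value is the usual recommendation, so the heap is not resized while Logstash is running.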