When Logstash syncs data into Kafka, Kafka frequently times out and then Logstash stops
Logstash | by aoliao_paopao | posted 2018-05-30 | views: 3760
Log output:
[2018-05-30T14:37:07,162][INFO ][org.apache.kafka.clients.producer.KafkaProducer] [Producer clientId=producer-1] Closing the Kafka producer with timeoutMillis = 9223372036854775807 ms.
[2018-05-30T14:37:07,189][INFO ][logstash.pipeline ] Pipeline has terminated {:pipeline_id=>"main", :thread=>"#<Thread:0x51c394c6 run>"}
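No fix was posted in the thread. Note that `timeoutMillis = 9223372036854775807` is `Long.MAX_VALUE`, i.e. an unbounded close of the producer during pipeline shutdown, so the lines above show the pipeline terminating rather than the timeout itself. A common mitigation for intermittent broker timeouts in the `kafka` output plugin is to allow retries and give the broker more time to acknowledge requests. A minimal sketch (the broker address and topic name are assumptions, not from the post; tune the values to your cluster):

```
output {
  kafka {
    bootstrap_servers  => "kafka1:9092"  # assumption: your broker list
    topic_id           => "logs"         # assumption: your topic
    retries            => 5              # retry transient send failures instead of giving up
    request_timeout_ms => 30000          # allow the broker more time to respond
    retry_backoff_ms   => 500            # pause between retry attempts
  }
}
```

With `retries` left unset, a single timed-out batch can surface as a failed send, so raising it alongside `request_timeout_ms` is usually the first thing to try before investigating broker-side load or network issues.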
0 replies