
The index field "@timestamp" stores a UNIX_MS value, and Logstash fails to parse it.

Logstash | Author: tithonus | Posted on 2020-06-21 | Views: 3256

 
Example:
Two documents are indexed:
the first with "@timestamp" : "2020-06-16T14:36:39.000Z",
the second with "@timestamp" : "1592289396859".
POST mydba_logstash_index/logs
{
  "traceId" : "9055b02343504acb96e5ef3a37cb53e2",
  "@timestamp" : "2020-06-16T14:36:39.000Z",
  "sourceIp" : "192.168.78.5",
  "timestamp" : "1592289396809"
}
 
POST mydba_logstash_index/logs
{
  "traceId" : "9055b02343504acb96e5ef3a37cb53e2",
  "@timestamp" : "1592289396859",
  "sourceIp" : "192.168.78.5",
  "timestamp" : "1592289396809"
}
 
GET mydba_logstash_index/_search

{
  "took" : 2,
  "timed_out" : false,
  "_shards" : {
    "total" : 1,
    "successful" : 1,
    "skipped" : 0,
    "failed" : 0
  },
  "hits" : {
    "total" : 2,
    "max_score" : 1.0,
    "hits" : [
      {
        "_index" : "mydba_logstash_index",
        "_type" : "logs",
        "_id" : "k_l813IBQM_nnOshyGjN",
        "_score" : 1.0,
        "_source" : {
          "traceId" : "9055b02343504acb96e5ef3a37cb53e2",
          "@timestamp" : "2020-06-16T14:36:39.000Z",
          "sourceIp" : "192.168.78.5",
          "timestamp" : "1592289396809"
        }
      },
      {
        "_index" : "mydba_logstash_index",
        "_type" : "logs",
        "_id" : "sfl813IBQM_nnOsh0miT",
        "_score" : 1.0,
        "_source" : {
          "traceId" : "9055b02343504acb96e5ef3a37cb53e2",
          "@timestamp" : "1592289396859",
          "sourceIp" : "192.168.78.5",
          "timestamp" : "1592289396809"
        }
      }
    ]
  }
}




 
Logstash config file:
cat my_dba_text.conf

input {
  elasticsearch {
    hosts => "10.nn.nn.nn:9200"
    user => "elastic"
    password => "elk123"
    index => "mydba_logstash_index"
    docinfo => true
    size => 100
    scroll => "1m"
    slices => 1
  }
}


output {
  stdout {
    codec => rubydebug
  }
}
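Since docinfo => true is set in the input, the source _index, _type and _id of every document are copied into [@metadata], which rubydebug hides by default. If you want to check those values while testing, a small optional tweak to the stdout output prints them (metadata is an option of the rubydebug codec):

output {
  stdout {
    # also print [@metadata], so the _index/_type/_id captured by docinfo can be inspected
    codec => rubydebug { metadata => true }
  }
}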




Run:
logstash -f my_dba_text.conf 
 
Sending Logstash logs to /var/log/logstash which is now configured via log4j2.properties
[2020-06-21T23:29:20,759][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2020-06-21T23:29:20,770][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.8.9"}
[2020-06-21T23:29:26,316][INFO ][logstash.pipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[2020-06-21T23:29:26,648][INFO ][logstash.pipeline ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x7f93baaf run>"}
[2020-06-21T23:29:26,686][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2020-06-21T23:29:26,975][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2020-06-21T23:29:27,406][WARN ][org.logstash.Event ] Error parsing @timestamp string value=1592289396859
/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/awesome_print-1.7.0/lib/awesome_print/formatters/base_formatter.rb:31: warning: constant ::Fixnum is deprecated
{
    "@timestamp" => 2020-06-21T15:29:27.406Z,
    "timestamp" => "1592289396809",
    "_@timestamp" => "1592289396859",
    "traceId" => "9055b02343504acb96e5ef3a37cb53e2",
    "sourceIp" => "192.168.78.5",
    "@version" => "1",
    "tags" => [
        [0] "_timestampparsefailure"
    ]
}
{
    "@timestamp" => 2020-06-16T14:36:39.000Z,
    "timestamp" => "1592289396809",
    "traceId" => "9055b02343504acb96e5ef3a37cb53e2",
    "sourceIp" => "192.168.78.5",
    "@version" => "1"
}
[2020-06-21T23:29:27,723][INFO ][logstash.pipeline ] Pipeline has terminated {:pipeline_id=>"main", :thread=>"#<Thread:0x7f93baaf run>"}
[2020-06-21T23:29:27,813][INFO ][logstash.runner ] Logstash shut down.
[root@hz-es-logstash-199-151-96 conf.d]#

 
 
Question: for "@timestamp" : "1592289396859", how should a UNIX_MS value stored like this be parsed?
The goal is to migrate an index from one ES cluster to another while keeping the "@timestamp" of the index in the target cluster equal to the source index's "@timestamp" UNIX_MS value.

This is my first time using Logstash, so an example to follow would be appreciated. Thanks.
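As for the copy itself (separate from the timestamp question), the stdout output used above for debugging would eventually be replaced by an elasticsearch output pointing at the target cluster. A minimal sketch, assuming a hypothetical target host 10.mm.mm.mm:9200 with placeholder credentials, and reusing the index/type/id that docinfo => true places under [@metadata]:

output {
  elasticsearch {
    hosts => "10.mm.mm.mm:9200"               # hypothetical target cluster, not from the original post
    user => "elastic"
    password => "changeme"                    # placeholder credentials
    index => "%{[@metadata][_index]}"         # keep the source index name
    document_type => "%{[@metadata][_type]}"  # keep the source type (ES 6.x)
    document_id => "%{[@metadata][_id]}"      # keep the source _id so re-runs overwrite instead of duplicating
  }
}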
 
 
 
 
 
tithonus replied:

I also tried a date filter, but it did not solve the parsing problem:
filter {
  date {
    locale => "en"
    match => [ "@timestamp", "UNIX_MS" ]
    target => [ "@timestamp" ]
  }
}
Sending Logstash logs to /var/log/logstash which is now configured via log4j2.properties
[2020-06-21T23:50:47,070][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2020-06-21T23:50:47,081][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.8.9"}
[2020-06-21T23:50:53,190][INFO ][logstash.pipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[2020-06-21T23:50:53,522][INFO ][logstash.pipeline ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0xeaee655 run>"}
[2020-06-21T23:50:53,591][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2020-06-21T23:50:53,980][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2020-06-21T23:50:54,342][WARN ][org.logstash.Event ] Error parsing @timestamp string value=1592289396859
/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/awesome_print-1.7.0/lib/awesome_print/formatters/base_formatter.rb:31: warning: constant ::Fixnum is deprecated
{
    "traceId" => "9055b02343504acb96e5ef3a37cb53e2",
    "@version" => "1",
    "@timestamp" => 2020-06-16T14:36:39.000Z,
    "sourceIp" => "192.168.78.5",
    "timestamp" => "1592289396809",
    "tags" => [
        [0] "_dateparsefailure"
    ]
}
{
    "_@timestamp" => "1592289396859",
    "traceId" => "9055b02343504acb96e5ef3a37cb53e2",
    "@version" => "1",
    "@timestamp" => 2020-06-21T15:50:54.343Z,
    "sourceIp" => "192.168.78.5",
    "timestamp" => "1592289396809",
    "tags" => [
        [0] "_timestampparsefailure",
        [1] "_dateparsefailure"
    ]
}
[2020-06-21T23:50:54,731][INFO ][logstash.pipeline ] Pipeline has terminated {:pipeline_id=>"main", :thread=>"#<Thread:0xeaee655 run>"}
[2020-06-21T23:50:55,221][INFO ][logstash.runner ] Logstash shut down.
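The rubydebug output explains why this date filter cannot match: the elasticsearch input already fails while constructing the event ("Error parsing @timestamp string value=1592289396859" from org.logstash.Event), moves the raw string into "_@timestamp", tags the event "_timestampparsefailure", and resets @timestamp to the processing time. By the time the filter runs, the epoch value only exists in "_@timestamp", so matching [@timestamp] against UNIX_MS just adds _dateparsefailure. A minimal sketch that reads the relocated field instead (field and tag names taken from the output above; not verified against this exact setup):

filter {
  # only touch events whose @timestamp the input could not parse
  if "_timestampparsefailure" in [tags] {
    date {
      # the raw epoch-millisecond string was moved here by the event constructor
      match => [ "_@timestamp", "UNIX_MS" ]
      target => "@timestamp"
    }
    mutate {
      # drop the helper field and the failure tag once @timestamp is restored
      remove_field => [ "_@timestamp" ]
      remove_tag => [ "_timestampparsefailure" ]
    }
  }
}

Because of the conditional, the document whose @timestamp is already ISO8601 skips this block, keeps its original value, and no longer picks up a spurious _dateparsefailure tag.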

 
 
