
elasticsearch - date imported from a CSV using Logstash is not parsed as a date type

I am trying to import a CSV into Elasticsearch using Logstash, and I have tried two methods:

  1. using the csv filter
  2. using the grok filter

1) Here is my Logstash config file for the csv filter:

input { 
    file { 
    path => "path_to_my_csv.csv" 
    start_position => "beginning" 
    sincedb_path => "/dev/null" 
    } 
} 
filter { 
    csv { 
     separator => "," 
     columns => ["col1","col2_datetime"] 
    } 
    mutate {convert => [ "col1", "float" ]} 
    date { 
     locale => "en" 
     match => ["col2_datetime", "ISO8601"] # also tried: match => ["col2_datetime", "yyyy-MM-dd HH:mm:ss"]
     timezone => "Asia/Kolkata" 
     target => "@timestamp" # also tried: target => "col2_datetime"
    } 
} 
output { 
    elasticsearch { 
    hosts => "http://localhost:9200" 
    index => "my_collection" 

    } 
    stdout {} 
} 

2) Using the grok filter:

Here is my Logstash config file for the grok filter:

input { 
    file { 
    path => "path_to_my_csv.csv" 
    start_position => "beginning" 
    sincedb_path => "/dev/null" 
    } 
} 
filter { 
    grok { 
    match => { "message" => "(?<col1>(?:%{BASE10NUM})),(%{TIMESTAMP_ISO8601:col2_datetime})"} 
    remove_field => [ "message" ] 
    } 
    date { 
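     # no target is set here, so the parsed date is written to @timestamp by default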
     match => ["col2_datetime", "yyyy-MM-dd HH:mm:ss"] 
    } 
} 
output { 
    elasticsearch { 
    hosts => "http://localhost:9200" 
    index => "my_collection_grok" 

    } 
    stdout {} 
} 
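
To check what actually got indexed (and whether the date filter ran, or instead tagged the event with _dateparsefailure), one quick check is to pull a document back out of Elasticsearch. A minimal sketch, assuming the host and index name from the config above:

curl -s 'http://localhost:9200/my_collection_grok/_search?pretty&size=1'

If date parsing failed, the returned document will carry "_dateparsefailure" in its tags field, and @timestamp will be the ingest time rather than the CSV value.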

Problem:

So when I run the two files individually, I am able to import the data into Elasticsearch. But the date field is not parsed into a datetime type; instead it is saved as a string, and as a result I cannot run date filters against it.

Can someone help me figure out why this is happening? My Elasticsearch version is 5.4.1.
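
To see how Elasticsearch actually mapped the field, inspecting the index mapping helps. A minimal sketch, again assuming the host and index name from the configs above:

curl -s 'http://localhost:9200/my_collection/_mapping?pretty'

If col2_datetime appears there with "type": "text" instead of "date", the string mapping was fixed the first time a document was indexed.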

Thanks in advance.


Can you share a few rows from your CSV file? – shantanuo


Please check these:
1234365,2016-12-02 19:00:52
1234368,2016-12-02 15:02:02
1234369,2016-12-02 15:02:07 – pravindot17

Answer


I made 2 changes to your config file:

1) Removed the underscored column name col2_datetime (renamed it to col2)

2) Added a target to the date filter

Here is what my config file looks like:

vi logstash.conf 

input { 
    file { 
    path => "/config-dir/path_to_my_csv.csv" 
    start_position => "beginning" 
    sincedb_path => "/dev/null" 
    } 
} 
filter { 
    csv { 
     separator => "," 
     columns => ["col1","col2"] 
    } 
    mutate {convert => [ "col1", "float" ]} 
    date { 
     locale => "en" 
     match => ["col2", "yyyy-MM-dd HH:mm:ss"] 
     target => "col2" 
    } 
} 
output { 
    elasticsearch { 
    hosts => "http://172.17.0.1:9200" 
    index => "my_collection" 

    } 
    stdout {} 
} 

And here is the data file:

vi path_to_my_csv.csv 

1234365,2016-12-02 19:00:52 
1234368,2016-12-02 15:02:02 
1234369,2016-12-02 15:02:07 
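
One caveat worth adding here (an assumption on my part, not something stated in the answer): if the index already exists with the field mapped as a string, re-running Logstash with a corrected config will not change it, since Elasticsearch cannot change the type of an already-mapped field. A minimal sketch of a reset, assuming the index name above; note this deletes the index and all of its data:

curl -XDELETE 'http://localhost:9200/my_collection'

After deleting the index and re-running Logstash, col2 should be mapped as a date when the first parsed document is indexed.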

It is still being inserted as a string – pravindot17


Which Logstash version? I tested it with version 5.4. – shantanuo


Mine is the same. – pravindot17