2016-09-27

Imagine I have the following CSV file, and I want to insert its data into Elasticsearch using Logstash and Kibana:

tstp,voltage_A_real,voltage_B_real,voltage_C_real #header not present in actual file 
2000-01-01 00:00:00,2535.53,-1065.7,-575.754 
2000-01-01 01:00:00,2528.31,-1068.67,-576.866 
2000-01-01 02:00:00,2528.76,-1068.49,-576.796 
2000-01-01 03:00:00,2530.12,-1067.93,-576.586 
2000-01-01 04:00:00,2531.02,-1067.56,-576.446 
2000-01-01 05:00:00,2533.28,-1066.63,-576.099 
2000-01-01 06:00:00,2535.53,-1065.7,-575.754 
2000-01-01 07:00:00,2535.53,-1065.7,-575.754 
.... 

I am trying to insert the data into Elasticsearch through Logstash with the following Logstash configuration:

input { 
    file { 
     path => "path_to_csv_file" 
     sincedb_path => "/dev/null" 
     start_position => "beginning" 
    } 
} 
filter { 
    csv { 
     columns => [ 
      "tstp", 
      "Voltage_A_real", 
      "Voltage_B_real", 
      "Voltage_C_real" 
     ] 
     separator => "," 
     } 
    date { 
     match => [ "tstp", "yyyy-MM-dd HH:mm:ss"] 
    } 
    mutate { 
     convert => ["Voltage_A_real", "float"] 
     convert => ["Voltage_B_real", "float"] 
     convert => ["Voltage_C_real", "float"] 
    } 
} 
output { 
    stdout { codec => rubydebug } 
    elasticsearch { 
     hosts => ["localhost:9200"] 
     action => "index" 
     index => "temp_load_index" 
    } 
} 
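To see what the csv, date, and mutate filters in the configuration above are supposed to do to each input line, here is an illustrative Python sketch (not Logstash itself; the field names are taken from the config, and the strptime pattern is the usual equivalent of the Joda-style `yyyy-MM-dd HH:mm:ss`):

```python
# Sketch of the filter chain applied to one CSV line from the question.
from datetime import datetime

line = "2000-01-01 00:00:00,2535.53,-1065.7,-575.754"
columns = ["tstp", "Voltage_A_real", "Voltage_B_real", "Voltage_C_real"]

# csv filter: split on the separator and name the columns
event = dict(zip(columns, line.split(",")))

# mutate filter: convert the voltage fields to float
for key in columns[1:]:
    event[key] = float(event[key])

# date filter: "yyyy-MM-dd HH:mm:ss" (Joda) ~ "%Y-%m-%d %H:%M:%S" (strptime)
timestamp = datetime.strptime(event["tstp"], "%Y-%m-%d %H:%M:%S")

print(timestamp.year)            # 2000
print(event["Voltage_A_real"])   # 2535.53
```

When all three filters succeed, the parsed timestamp replaces `@timestamp` on the event, which is exactly what the first rubydebug sample below shows.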

What I get from the rubydebug output when running logstash -f conf_file -v is:

{ 
      "message" => "2000-02-18 16:00:00,2532.38,-1067,-576.238", 
      "@version" => "1", 
     "@timestamp" => "2000-02-18T21:00:00.000Z", 
       "path" => "path_to_csv", 
       "host" => "myhost", 
       "tstp" => "2000-02-18 16:00:00", 
    "Voltage_A_real" => 2532.38, 
    "Voltage_B_real" => -1067.0, 
    "Voltage_C_real" => -576.238 
} 

However, when I look at the dashboard I only see 2 events in Kibana, and both carry the current date as their timestamp instead of falling in the year-2000 range of the data. Can someone help me figure out what is going on?

A sample Kibana object looks like this:

{ 
    "_index": "temp_load_index", 
    "_type": "logs", 
    "_id": "myid", 
    "_score": null, 
    "_source": { 
    "message": "2000-04-02 02:00:00,2528.76,-1068.49,-576.796", 
    "@version": "1", 
    "@timestamp": "2016-09-27T05:15:29.753Z", 
    "path": "path_to_csv", 
    "host": "myhost", 
    "tstp": "2000-04-02 02:00:00", 
    "Voltage_A_real": 2528.76, 
    "Voltage_B_real": -1068.49, 
    "Voltage_C_real": -576.796, 
    "tags": [ 
     "_dateparsefailure" 
    ] 
    }, 
    "fields": { 
    "@timestamp": [ 
     1474953329753 
    ] 
    }, 
    "sort": [ 
    1474953329753 
    ] 
} 

The two events you see in Kibana carry the `_dateparsefailure` tag, which means the `tstp` field could not be parsed. As a result, `@timestamp` was not replaced with the value from the log and was left at the time Logstash received the log. – baudsp
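The comment above can be illustrated with a small Python sketch (illustrative only, not Logstash): any line whose first field does not match the `yyyy-MM-dd HH:mm:ss` pattern fails the date filter, gets tagged `_dateparsefailure`, and keeps the ingestion time as `@timestamp`:

```python
# Sketch: which tstp values would pass or fail the date filter's pattern.
from datetime import datetime

def parses(value: str) -> bool:
    """Return True if value matches the expected timestamp format."""
    try:
        datetime.strptime(value, "%Y-%m-%d %H:%M:%S")
        return True
    except ValueError:
        return False

print(parses("2000-04-02 02:00:00"))  # True: a well-formed data row
print(parses("tstp"))                 # False: e.g. a stray header line would fail
```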

Answer


When you open Kibana, it usually shows you only the events from the last 15 minutes, based on the @timestamp field. So you need to set the time filter to an appropriate range, using the absolute option and starting at 2000-01-01 (see the documentation).

Alternatively, you can put the parsed timestamp into another field (for example original_tst), so that the @timestamp added by Logstash is preserved:

date { 
    match => [ "tstp", "yyyy-MM-dd HH:mm:ss"] 
    target => "original_tst" 
} 
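The effect of `target => "original_tst"` can be sketched in Python (illustrative only; `original_tst` is the example field name from the answer above): the parsed date lands in the target field, while `@timestamp` keeps the ingestion time that Logstash assigned.

```python
# Sketch: with target => "original_tst", @timestamp is left untouched.
from datetime import datetime

event = {
    "tstp": "2000-04-02 02:00:00",
    "@timestamp": "2016-09-27T05:15:29.753Z",  # set by Logstash at ingest time
}

# date filter with a target: parsed value goes into original_tst
event["original_tst"] = datetime.strptime(event["tstp"], "%Y-%m-%d %H:%M:%S")

print(event["@timestamp"])          # unchanged: 2016-09-27T05:15:29.753Z
print(event["original_tst"].year)   # 2000
```

With this approach Kibana's default last-15-minutes view will still show the freshly ingested events, while the original measurement time remains queryable in `original_tst`.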