
A lot of data is being missed in Logstash version 5.0. It looks like a serious bug: I have reworked the configuration file many times and it does not help, the data loss keeps happening again and again. How should I use Logstash to collect log event attributes reliably? Why is data missing in Logstash?

Any reply would be appreciated.


Please add more details so that the community can help you. –

Answer


Logstash is all about reading logs from a specific location and, based on the information you are interested in, creating an index in Elasticsearch or sending the events to another output. A sample Logstash conf:

input {
  file {
    # PLEASE SET APPROPRIATE PATH WHERE LOG FILE AVAILABLE
    #type => "java"
    type => "json-log"
    path => "d:/vox/logs/logs/vox.json"
    start_position => "beginning"
    codec => json
  }
}

filter {
  if [type] == "json-log" {
    grok {
      match => { "message" => "UserName:%{JAVALOGMESSAGE:UserName} -DL_JobID:%{JAVALOGMESSAGE:DL_JobID} -DL_EntityID:%{JAVALOGMESSAGE:DL_EntityID} -BatchesPerJob:%{JAVALOGMESSAGE:BatchesPerJob} -RecordsInInputFile:%{JAVALOGMESSAGE:RecordsInInputFile} -TimeTakenToProcess:%{JAVALOGMESSAGE:TimeTakenToProcess} -DocsUpdatedInSOLR:%{JAVALOGMESSAGE:DocsUpdatedInSOLR} -Failed:%{JAVALOGMESSAGE:Failed} -RecordsSavedInDSE:%{JAVALOGMESSAGE:RecordsSavedInDSE} -FileLoadStartTime:%{JAVALOGMESSAGE:FileLoadStartTime} -FileLoadEndTime:%{JAVALOGMESSAGE:FileLoadEndTime}" }
      add_field => ["STATS_TYPE", "FILE_LOADED"]
    }
  }
}

filter {
  mutate {
    # here converting data type
    convert => { "FileLoadStartTime" => "integer" }
    convert => { "RecordsInInputFile" => "integer" }
  }
}

output {
  elasticsearch {
    # PLEASE CONFIGURE ES IP AND PORT WHERE LOG DOCs HAS TO PUSH
    document_type => "json-log"
    hosts => ["localhost:9200"]
    # action => "index"
    # host => "localhost"
    index => "locallogstashdx_new"
    # workers => 1
  }
  stdout { codec => rubydebug }
  #stdout { debug => true }
}
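As a quick sanity check, here is a minimal sketch (not from the original answer): assuming vox.json contains one JSON object per line with a message field in the format the grok pattern expects, an input line and the command to run the pipeline could look like the following. All field values and the config file name vox-json.conf are placeholders for illustration.

# hypothetical line in d:/vox/logs/logs/vox.json (values are examples only)
{"message": "UserName:admin -DL_JobID:42 -DL_EntityID:1001 -BatchesPerJob:5 -RecordsInInputFile:2500 -TimeTakenToProcess:120 -DocsUpdatedInSOLR:2450 -Failed:50 -RecordsSavedInDSE:2450 -FileLoadStartTime:1491888000 -FileLoadEndTime:1491888120"}

# run Logstash with the config file (use bin\logstash.bat on Windows)
bin/logstash -f vox-json.conf

The stdout { codec => rubydebug } output then shows whether UserName, RecordsInInputFile and the other fields were extracted and converted before you look for the documents in the locallogstashdx_new index.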

To learn more, you can go through the many resources available, such as https://www.elastic.co/guide/en/logstash/current/first-event.html