2017-04-11

Incorrectly encoded log messages with log4j and the ELK stack

I am currently running the ELK stack in Docker (https://github.com/deviantony/docker-elk) and have a standalone Java application from which I am trying to send logs to Logstash using the log4j SocketAppender. When I view my logs in Kibana, the messages appear incorrectly encoded. I am very new to the ELK stack and have tried many of the solutions I found here, but nothing I try seems to work. Thanks in advance for any help.

logstash.conf:

input { 
    log4j { 
     mode => "server" 
     host => "0.0.0.0" 
     port => 5000 
     type => "log4j" 
     } 
} 

## Add your filters/logstash plugins configuration here 

filter { 
    # All lines that do not start with %{TIMESTAMP} or ' ' + %{TIMESTAMP} belong to the previous event
    multiline { 
     pattern => "(([\s]+)20[0-9]{2}-)|20[0-9]{2}-" 
     negate => true 
     what => "previous" 
    } 
} 

output { 
    elasticsearch { 
     hosts => "elasticsearch:9200" 
    } 
} 

log4j.properties:

log4j.rootLogger=info,tcp 

log4j.appender.tcp=org.apache.log4j.net.SocketAppender 
log4j.appender.tcp.Port=5000 
log4j.appender.tcp.RemoteHost=localhost 
log4j.appender.tcp.ReconnectionDelay=10000 
log4j.appender.tcp.Application=hello-world 
log4j.appender.tcp.encoding=UTF-8 

Kibana log: [screenshot of the garbled log messages in Kibana]

Answer

It turned out this was related to running in a Windows environment. Running from a Linux environment resolved the encoding problem. I don't know whether there is a way to fix the encoding issue on Windows...
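One plausible culprit (an assumption on my part, not verified in the original answer) is the JVM default charset, which on Windows is usually a legacy code page rather than UTF-8. A quick stdlib-only check, with an illustrative class name:

```java
import java.nio.charset.Charset;

public class CharsetCheck {
    public static void main(String[] args) {
        // On Windows this typically reports a legacy code page (e.g. windows-1252,
        // or GBK/Big5 on CJK locales); on most Linux distributions it reports UTF-8.
        System.out.println("file.encoding: " + System.getProperty("file.encoding"));
        System.out.println("default charset: " + Charset.defaultCharset().name());
    }
}
```

If the default charset is the culprit, launching the application with `java -Dfile.encoding=UTF-8 ...` might work around it on Windows (this became the JVM default only in JDK 18), though that is untested here.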

The correct Logstash configuration with multiline support over TCP that worked for me:

input { 
    log4j { 
     mode => "server" 
     host => "0.0.0.0" 
     port => 5000 
     type => "log4j" 
     codec => multiline { 
      pattern => "^\s" 
      what => "previous" 
     } 
     } 
} 

## Add your filters/logstash plugins configuration here 

output { 
    elasticsearch { 
     hosts => "elasticsearch:9200" 
    } 
}
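The codec's `pattern => "^\s"` with `what => "previous"` folds lines that begin with whitespace (typically stack-trace frames) into the preceding log event. A small sketch of that matching logic using the JDK regex engine (Logstash uses a different regex library, and the class and sample strings below are illustrative, not from the original post):

```java
import java.util.regex.Pattern;

public class MultilinePatternDemo {
    // Same pattern as the logstash multiline codec above: a line starting with
    // whitespace is treated as a continuation of the previous event.
    private static final Pattern CONTINUATION = Pattern.compile("^\\s");

    public static void main(String[] args) {
        String firstLine = "2017-04-11 10:00:00 ERROR something failed";
        String stackLine = "\tat com.example.Main.run(Main.java:42)";
        System.out.println(CONTINUATION.matcher(firstLine).find()); // prints false -> new event
        System.out.println(CONTINUATION.matcher(stackLine).find()); // prints true  -> joined to previous
    }
}
```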