2016-08-18

Docker: cannot send data from the Logstash container to the Kafka container

I have two Docker containers: one running Logstash, the other running Zookeeper and Kafka. I am trying to send data from Logstash to Kafka, but the data never seems to reach my topic in Kafka.

我可以登錄到Docker Kafka容器,並從終端生成一條消息給我的主題,然後也將其消耗。

The kafka output plugin configuration I am using:

output { 
    kafka { 
     topic_id => "MyTopicName" 
     broker_list => "kafkaIPAddress:9092" 
    } 
} 

I got the IP address by running docker inspect kafka2.

When I run ./bin/logstash agent --config /etc/logstash/conf.d/01-input.conf I get this error:

Settings: Default pipeline workers: 4 
Unknown setting 'broker_list' for kafka {:level=>:error} 
Pipeline aborted due to error {:exception=>#<LogStash::ConfigurationError: Something is wrong with your configuration.>, :backtrace=>["/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.3-java/lib/logstash/config/mixin.rb:134:in `config_init'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.3-java/lib/logstash/outputs/base.rb:63:in `initialize'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.3-java/lib/logstash/output_delegator.rb:74:in `register'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.3-java/lib/logstash/pipeline.rb:181:in `start_workers'", "org/jruby/RubyArray.java:1613:in `each'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.3-java/lib/logstash/pipeline.rb:181:in `start_workers'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.3-java/lib/logstash/pipeline.rb:136:in `run'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.3-java/lib/logstash/agent.rb:473:in `start_pipeline'"], :level=>:error} 
stopping pipeline {:id=>"main"} 

I checked the configuration file by running the command below, which returns OK:

./bin/logstash agent --configtest --config /etc/logstash/conf.d/01-input.conf 
Configuration OK 

Has anyone run into this before? Could it be that I have not opened the port on the Kafka container? If so, how can I do that while keeping Kafka running?

Answer


The error is here: broker_list => "kafkaIPAddress:9092"

Try bootstrap_servers => "KafkaIPAddress:9092" instead. If the containers are on separate machines, map Kafka to port 9092 on its host and use that host's address:port; if they are on the same host, use the internal Docker IP:port.
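With that change, the output section from the question would look like this (kafkaIPAddress remains a placeholder for the address returned by docker inspect):

output { 
    kafka { 
     topic_id => "MyTopicName" 
     bootstrap_servers => "kafkaIPAddress:9092" 
    } 
} 

broker_list was the setting name used by an older version of the plugin; the Logstash 2.x kafka output expects bootstrap_servers, which is why the configtest alone did not catch it until the plugin was initialized.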


Works 100%, thanks – Gman
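For the separate-machine case the answer mentions: published ports cannot be added to an already-running container, so the Kafka container would have to be recreated with the port mapped. A rough sketch (the container name kafka2 matches the question; the image name is an assumption):

docker rm -f kafka2 
docker run -d --name kafka2 -p 9092:9092 your-kafka-image 

After that, Logstash on another host can reach the broker via the Docker host's address, e.g. bootstrap_servers => "dockerHostAddress:9092". Note that Kafka may also need its advertised listener set to the host address so clients are redirected correctly.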
