2017-08-30 92 views

sbt unresolved dependency: Spark Streaming Kafka integration

I want to use Spark Streaming together with the Kafka integration. I am using Spark version 2.0.0.

However, I get an unresolved dependency error: "unresolved dependency: org.apache.spark#spark-sql-kafka-0-10_2.11;2.0.0: not found".

How can I get access to this package? Or am I doing something wrong or missing something?

My build.sbt file:

name := "Spark Streaming" 
version := "0.1" 
scalaVersion := "2.11.11" 
val sparkVersion = "2.0.0" 

libraryDependencies ++= Seq(
    "org.apache.spark" %% "spark-core" % sparkVersion, 
    "org.apache.spark" %% "spark-sql" % sparkVersion, 
    "org.apache.spark" %% "spark-streaming" % sparkVersion, 
    "org.apache.spark" %% "spark-sql-kafka-0-10" % sparkVersion 
) 
libraryDependencies += "org.apache.spark" % "spark-streaming_2.11" % "2.0.0-preview" 
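
For reference, a likely cause: as far as I know, the spark-sql-kafka-0-10 artifact was first published for Spark 2.0.2, so no 2.0.0 version exists to resolve. A minimal sketch of a build.sbt that should resolve, assuming upgrading to Spark 2.0.2 is an option:

```scala
// build.sbt — hedged sketch: assumes Spark 2.0.2, which (to my knowledge)
// is the first release that published spark-sql-kafka-0-10 to Maven Central.
name := "Spark Streaming"

version := "0.1"

scalaVersion := "2.11.11"

val sparkVersion = "2.0.2"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion,
  "org.apache.spark" %% "spark-sql" % sparkVersion,
  "org.apache.spark" %% "spark-streaming" % sparkVersion,
  "org.apache.spark" %% "spark-sql-kafka-0-10" % sparkVersion
)
// The extra `"spark-streaming_2.11" % "2.0.0-preview"` line from the original
// file is dropped here: it duplicates the %% spark-streaming dependency above
// at a conflicting version, which can cause eviction or classpath conflicts.
```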

Thanks for your help.

Answers