2016-04-29 61 views

I am trying to run unit tests with scalatest, using the HBase testing utility to test development code locally. The current struggle is setting up the HBase testing utility in sbt: when I compile, I get the following "sbt dependency management issue with hbase-testing-utility" error:

[warn] module not found: org.apache.hbase#${compat.module};1.2.1 
[warn] ==== local: tried 
[warn] /root/.ivy2/local/org.apache.hbase/${compat.module}/1.2.1/ivys/ivy.xml 
[warn] ==== public: tried 
[warn] https://repo1.maven.org/maven2/org/apache/hbase/${compat.module}/1.2.1/${compat.module}-1.2.1.pom 
[warn] ==== Akka Repository: tried 
[warn] http://repo.akka.io/releases/org/apache/hbase/${compat.module}/1.2.1/${compat.module}-1.2.1.pom 
[warn] ==== scala-tools: tried 
[warn] https://oss.sonatype.org/content/groups/scala-tools/org/apache/hbase/${compat.module}/1.2.1/${compat.module}-1.2.1.pom 
[warn] ==== cloudera-repos: tried 
[warn] https://repository.cloudera.com/artifactory/cloudera-repos/org/apache/hbase/${compat.module}/1.2.1/${compat.module}-1.2.1.pom 
[warn] ==== Sonatype OSS Snapshots: tried 
[warn] https://oss.sonatype.org/content/repositories/snapshots/org/apache/hbase/${compat.module}/1.2.1/${compat.module}-1.2.1.pom 
[warn] :::::::::::::::::::::::::::::::::::::::::::::: 
[warn] ::   UNRESOLVED DEPENDENCIES   :: 
[warn] :::::::::::::::::::::::::::::::::::::::::::::: 
[warn] :: org.apache.hbase#${compat.module};1.2.1: not found 
[warn] :::::::::::::::::::::::::::::::::::::::::::::: 
[warn] 
[warn] Note: Unresolved dependencies path: 
[warn]  org.apache.hbase:${compat.module}:1.2.1 
[warn]  +- org.apache.hbase:hbase-testing-util:1.2.1 (/workspace/spark/etl/built.sbt#L30-62) 

[trace] Stack trace suppressed: run last *:update for the full output. 
[error] (*:update) sbt.ResolveException: unresolved dependency: org.apache.hbase#${compat.module};1.2.1: not found 
[error] Total time: 32 s, completed Apr 29, 2016 9:25:27 AM 

My build.sbt file is as follows:

val hbaseVersion = "1.2.1" 
val sparkVersion = "1.6.1" 
val hadoopVersion = "2.7.1" 

libraryDependencies ++= Seq(
    "org.apache.spark" %% "spark-core" % sparkVersion % "provided", 
    "org.apache.spark" %% "spark-streaming" % sparkVersion % "provided", 
    "org.apache.spark" %% "spark-streaming-kafka" % sparkVersion, 
    "org.apache.spark" %% "spark-sql" % sparkVersion % "provided", 
    "org.apache.spark" %% "spark-mllib" % sparkVersion , 
    "org.apache.hbase" % "hbase" % hbaseVersion, 
    "org.apache.hbase" % "hbase-server" % hbaseVersion, 
    "org.apache.hbase" % "hbase-server" % hbaseVersion classifier "tests", 
    "org.apache.hbase" % "hbase-client" % hbaseVersion, 
    "org.apache.hbase" % "hbase-common" % hbaseVersion, 
    "org.apache.hbase" % "hbase-common" % hbaseVersion classifier "tests", 
    "org.apache.hbase" % "hbase-annotations" % hbaseVersion, 
    "org.apache.hbase" % "hbase-testing-util" % hbaseVersion % "test", 
    "org.apache.hadoop" % "hadoop-minicluster" % hadoopVersion, 
    "org.apache.hadoop" % "hadoop-mapreduce-client-jobclient" % hadoopVersion classifier "tests", 
    "org.apache.hadoop" % "hadoop-hdfs" % hadoopVersion, 
    "org.apache.hadoop" % "hadoop-hdfs" % hadoopVersion classifier "tests", 
    "org.apache.hbase" % "hbase-hadoop-compat" % hbaseVersion, 
    "org.apache.hbase" % "hbase-hadoop-compat" % hbaseVersion classifier "tests", 
    "org.apache.hbase" % "hbase-hadoop2-compat" % hbaseVersion, 
    "org.apache.hbase" % "hbase-hadoop2-compat" % hbaseVersion classifier "tests", 
    "org.apache.hadoop" % "hadoop-common" % hadoopVersion, 
    "org.apache.hadoop" % "hadoop-common" % hadoopVersion classifier "tests", 
    "org.apache.hadoop" % "hadoop-annotations" % hadoopVersion, 
    "org.scalatest" %% "scalatest" % "2.2.6" % "test" , 
    //"org.scalacheck" %% "scalacheck" % "1.12.5" % "test", 
    "com.cloudera.sparkts" % "sparkts" % "0.3.0", 
    "com.ecwid.consul" % "consul-api" % "1.1.9", 
    "joda-time" % "joda-time" % "2.7" 
) 

resolvers ++= Seq(
    "Akka Repository" at "http://repo.akka.io/releases/", 
    "scala-tools" at "https://oss.sonatype.org/content/groups/scala-tools", 
    "cloudera-repos" at "https://repository.cloudera.com/artifactory/cloudera-repos/", 
    "Sonatype OSS Snapshots" at "https://oss.sonatype.org/content/repositories/snapshots" 
) 

Does anyone understand why this failure occurs?

Looks like someone made a mistake while publishing the HBase 1.2.1 artifacts to Maven! You could try changing `hbaseVersion` to `1.2.0-cdh5.7.0` and test whether the build passes. – Shyam

Did you solve this problem? It works for me with hbase version "0.98.7-hadoop2", but it fails with all other versions. My repo: https://github.com/dportabella/spark-examples –

It seems hbase has a bug. See the list of working versions here: https://issues.apache.org/jira/browse/HBASE-8488?focusedCommentId=15307889 –

Answer


Sorry for the delayed response. I couldn't get it to work as-is, so I changed the versions like this:

val sparkVersion = "1.6.1" 
val hbaseVersion = "1.2.0-cdh5.7.0" 
val hadoopVersion = "2.6.0-cdh5.7.0" 

This led to more headaches. I also had to change the Guava version, because the earlier version had references to methods from older libraries, so this was required:

"com.google.guava" % "guava" % "14.0" force() 

(I believe anything up to version 16.0 is fine)

I also had to comment out the following:

// "com.cloudera" % "spark-hbase" % "0.0.2-clabs", 

(which wasn't in the original question)
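As an aside, a workaround sometimes reported for staying on the plain 1.2.1 release (an untested sketch on my part, not something I verified) is to exclude the unresolvable `${compat.module}` placeholder that hbase-testing-util's 1.2.1 POM declares, and rely on the explicit hbase-hadoop2-compat dependency that is already in this build:

```scala
// Untested sketch: tell sbt/Ivy not to try resolving the literal
// "${compat.module}" artifact name from hbase-testing-util's POM.
// The real compat implementation, hbase-hadoop2-compat, is already
// declared explicitly in the libraryDependencies above.
"org.apache.hbase" % "hbase-testing-util" % "1.2.1" % "test" excludeAll (
    ExclusionRule(organization = "org.apache.hbase", name = "${compat.module}")
)
```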

Finally, it looks like the original problem was a bug that needed to be fixed; see here, with thanks to David Portabella for the reference:

https://issues.apache.org/jira/browse/HBASE-15925

And with version 1.2.2 it works.
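For reference, a minimal sketch of the version block once the HBASE-15925 fix is available (my assumption: 1.2.2 is the first release with the corrected parent POM, per the fix version listed on the JIRA ticket):

```scala
// Sketch: with HBASE-15925 fixed in 1.2.2, the original non-CDH versions
// should resolve without the "${compat.module}" placeholder error, and no
// Cloudera resolver or CDH-specific versions are needed.
val sparkVersion  = "1.6.1"
val hbaseVersion  = "1.2.2"
val hadoopVersion = "2.7.1"

libraryDependencies += "org.apache.hbase" % "hbase-testing-util" % hbaseVersion % "test"
```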