I am trying to query a Hive table from a simple Spark job written in Java, but the Spark SQLContext does not find the Hive table:
SparkConf conf = new SparkConf().setMaster("local[*]").setAppName("MyJob");
JavaSparkContext sc = new JavaSparkContext(conf);
SQLContext sqlContext = new org.apache.spark.sql.SQLContext(sc);
DataFrame df = sqlContext.table("scf");
However, when I submit the jar via spark-submit, I get the following error:
Exception in thread "main" org.apache.spark.sql.catalyst.analysis.NoSuchTableException
at org.apache.spark.sql.catalyst.analysis.SimpleCatalog.lookupRelation(Catalog.scala:108)
at org.apache.spark.sql.SQLContext.table(SQLContext.scala:831)
at org.apache.spark.sql.SQLContext.table(SQLContext.scala:827)
at MyJob.myJob(MyJob.java:30)
at MyJob.main(MyJob.java:65)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:729)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
I am sure the table exists: if I run sqlContext.table("scf").count in spark-shell, it returns a result.
What could be the problem?
Thanks!
You need to register the table name "DCF", I guess. –
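A likely explanation (a sketch, assuming Spark 1.x with Hive support on the classpath): the plain SQLContext only consults its own in-memory catalog, so tables defined in the Hive metastore are invisible to it, while spark-shell works because it creates a Hive-aware context by default. Switching the job to HiveContext might look like this:

```java
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.DataFrame;
import org.apache.spark.sql.hive.HiveContext;

public class MyJob {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setMaster("local[*]").setAppName("MyJob");
        JavaSparkContext sc = new JavaSparkContext(conf);

        // HiveContext reads the Hive metastore (located via hive-site.xml
        // on the classpath), so tables created in Hive are visible to it.
        // Note: HiveContext takes the underlying SparkContext, hence sc.sc().
        HiveContext sqlContext = new HiveContext(sc.sc());

        DataFrame df = sqlContext.table("scf");
        System.out.println(df.count());
    }
}
```

This assumes the spark-hive artifact is on the build path and that hive-site.xml is available to spark-submit (e.g. in Spark's conf/ directory); without that, HiveContext falls back to a local metastore and still won't see the table.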