Hive on Spark: java.lang.NoClassDefFoundError: org/apache/hive/spark/client/Job

2015-10-20

When I run a query from the Hive console in debug mode, it fails with the error below. I am using hive-1.2.1 and spark-1.5.1, and I have checked that the hive-exec jar does contain the class definition org/apache/hive/spark/client/Job.

Caused by: java.lang.NoClassDefFoundError: org/apache/hive/spark/client/Job 
    at java.lang.ClassLoader.defineClass1(Native Method) 
    at java.lang.ClassLoader.defineClass(ClassLoader.java:792) 
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142) 
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:449) 
    at java.net.URLClassLoader.access$100(URLClassLoader.java:71) 
    at java.net.URLClassLoader$1.run(URLClassLoader.java:361) 
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355) 
    at java.security.AccessController.doPrivileged(Native Method) 
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354) 
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424) 
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308) 
    at java.lang.ClassLoader.loadClass(ClassLoader.java:411) 
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357) 
    at java.lang.Class.forName0(Native Method) 
    at java.lang.Class.forName(Class.java:270) 
    at org.apache.hive.com.esotericsoftware.kryo.util.DefaultClassResolver.readName(DefaultClassResolver.java:136) 
    at org.apache.hive.com.esotericsoftware.kryo.util.DefaultClassResolver.readClass(DefaultClassResolver.java:115) 
    at org.apache.hive.com.esotericsoftware.kryo.Kryo.readClass(Kryo.java:656) 
    at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:99) 
    at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:507) 
    at org.apache.hive.com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:776) 
    at org.apache.hive.spark.client.rpc.KryoMessageCodec.decode(KryoMessageCodec.java:96) 
    at io.netty.handler.codec.ByteToMessageCodec$1.decode(ByteToMessageCodec.java:42) 
    at io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:327) 
    ... 15 more

Finally, the query fails with:

ERROR spark.SparkTask: Failed to execute spark task, with exception 'java.lang.IllegalStateException(RPC channel is closed.)'

How can I fix this?

Do you have a HiveContext on your Spark cluster? – eliasah

@eliasah, I do have a HiveContext on Spark. The query works fine through spark-sql, but not with Hive configured to run on Spark (Hive-on-Spark). – Arvindkumar

This problem was resolved by moving to Spark 1.3.0 and rebuilding it without Hive. – Arvindkumar

Answer

In Hive 1.2.1's pom.xml, spark.version is set to 1.3.1.
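For reference, this is roughly how that Maven property appears in Hive's pom.xml (an illustrative excerpt; the surrounding properties are omitted):

<properties>
    <!-- The Spark release Hive 1.2.1 was built and tested against -->
    <spark.version>1.3.1</spark.version>
</properties>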

So the simplest fix is to download the prebuilt spark-1.3.1-bin-hadoop package from spark.apache.org.

Then add its path to hive-site.xml, like:

<property> 
    <name>spark.home</name> 
    <value>/path/spark-1.3.1-bin-hadoop2.4</value> 
</property>
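The spark.home property is what Hive reads to locate the Spark installation, so point the value at wherever you unpacked the download. With spark.home in place, Hive still needs to be told to use Spark as its execution engine. A minimal sketch of the additional hive-site.xml entry (assuming you want Spark as the default engine rather than enabling it per session):

<property>
    <!-- Switch Hive's execution engine from MapReduce to Spark -->
    <name>hive.execution.engine</name>
    <value>spark</value>
</property>

Alternatively, run set hive.execution.engine=spark; in the Hive shell to enable it for a single session.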