2013-10-24

Java compilation error when installing Spark on CentOS

I am trying to install Spark on CentOS. Building Spark with the `sbt/sbt assembly` command produces the errors below.

[warn] /root/spark-0.8.0-incubating/core/src/main/scala/org/apache/spark/SparkHadoopWriter.scala:129: method cleanupJob in class OutputCommitter is deprecated: see corresponding Javadoc for more information. 
[warn]  getOutputCommitter().cleanupJob(getJobContext()) 
[warn]      ^
[warn] /root/spark-0.8.0-incubating/core/src/main/scala/org/apache/spark/rdd/PairRDDFunctions.scala:592: method cleanupJob in class OutputCommitter is deprecated: see corresponding Javadoc for more information. 
[warn]  jobCommitter.cleanupJob(jobTaskContext) 
[warn]    ^
[warn] two warnings found 
[error] ---------- 
[error] 1. WARNING in /root/spark-0.8.0-incubating/core/src/main/java/org/apache/spark/network/netty/FileClient.java (at line 22) 
[error]   import io.netty.channel.ChannelFuture; 
[error]    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 
[error] The import io.netty.channel.ChannelFuture is never used 
[error] ---------- 
[error] 2. WARNING in /root/spark-0.8.0-incubating/core/src/main/java/org/apache/spark/network/netty/FileClient.java (at line 23) 
[error]   import io.netty.channel.ChannelFutureListener; 
[error]    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 
[error] The import io.netty.channel.ChannelFutureListener is never used 
[error] ---------- 
[error] ---------- 
[error] 3. WARNING in /root/spark-0.8.0-incubating/core/src/main/java/org/apache/spark/network/netty/FileServer.java (at line 23) 
[error]   import io.netty.channel.Channel; 
[error]    ^^^^^^^^^^^^^^^^^^^^^^^^ 
[error] The import io.netty.channel.Channel is never used 
[error] ---------- 
[error] ---------- 
[error] 4. WARNING in /root/spark-0.8.0-incubating/core/src/main/scala/org/apache/spark/api/java/JavaSparkContextVarargsWorkaround.java (at line 20) 
[error]   import java.util.Arrays; 
[error]    ^^^^^^^^^^^^^^^^ 
[error] The import java.util.Arrays is never used 
[error] ---------- 
[error] ---------- 
[error] 5. ERROR in /root/spark-0.8.0-incubating/core/src/main/scala/org/apache/spark/api/java/function/DoubleFlatMapFunction.java (at line 36) 
[error]   public final Iterable<Double> apply(T t) { return call(t); } 
[error]          ^^^^^^^^^^ 
[error] The method apply(T) of type DoubleFlatMapFunction<T> must override a superclass method 
[error] ---------- 
[error] 5 problems (1 error, 4 warnings) 
[error] (core/compile:compile) javac returned nonzero exit code 
[error] Total time: 431 s, completed Oct 24, 2013 7:42:21 AM 

The Java version installed on my machine is 1.7.0_45.
Earlier I used JDK 1.6.0_35, and it gave the same set of errors. I also tried Java 1.4, which gave a different kind of error. Which version of Java should I use? Or is the problem something else?
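For context on the one hard ERROR above: "must override a superclass method" is the message the Eclipse (ECJ) compiler emits when `@Override` annotates a method that only implements an interface method while the source level is 1.5; source level 1.6 and later also accept `@Override` on interface implementations. A minimal sketch that reproduces the same kind of message (the `Worker`/`Task` names are made up for illustration, not Spark code):

    // Hypothetical example, not part of Spark.
    interface Worker {
        void run();
    }

    class Task implements Worker {
        // Rejected at source level 1.5 with
        // "The method run() of type Task must override a superclass method";
        // accepted at source level 1.6 and later.
        @Override
        public void run() {
            System.out.println("running");
        }
    }

If that is what is happening here, the source level the build passes to the compiler matters as much as which JDK is installed.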


For reference: this question was also [cross-posted to the spark-users mailing list](https://groups.google.com/d/msg/spark-users/ti5UF15YBq4/du_Wzhr3uCEJ).

Answers