2015-07-21

I am trying to build the Apache Spark SQL core module (1.4.1), and I get the stack trace below. However, if I build the entire Spark project instead, everything goes fine and the build completes successfully. Any ideas?

Stack trace

[error] /home/ubuntu/Dev/spark/sql/core/src/main/scala/org/apache/spark/sql/DataFrameReader.scala:258: value globPathIfNecessary is not a member of org.apache.spark.deploy.SparkHadoopUtil 
[error]   SparkHadoopUtil.get.globPathIfNecessary(qualified) 
[error]       ^
[error] /home/ubuntu/Dev/spark/sql/core/src/main/scala/org/apache/spark/sql/DataFrameReader.scala:263: value map is not a member of Array[Nothing] 
[error]   globbedPaths.map(_.toString), None, None, extraOptions.toMap)(sqlContext)) 
[error]      ^
[error] /home/ubuntu/Dev/spark/sql/core/src/main/scala/org/apache/spark/sql/execution/expressions/MonotonicallyIncreasingID.scala:22: object Nondeterministic is not a member of package org.apache.spark.sql.catalyst.expressions 
[error] import org.apache.spark.sql.catalyst.expressions.{Nondeterministic, LeafExpression} 
[error]  ^
[error] /home/ubuntu/Dev/spark/sql/core/src/main/scala/org/apache/spark/sql/execution/expressions/MonotonicallyIncreasingID.scala:36: not found: type Nondeterministic 
[error] private[sql] case class MonotonicallyIncreasingID() extends LeafExpression with Nondeterministic { 
[error]                    ^
[error] /home/ubuntu/Dev/spark/sql/core/src/main/scala/org/apache/spark/sql/execution/expressions/SparkPartitionID.scala:22: object Nondeterministic is not a member of package org.apache.spark.sql.catalyst.expressions 
[error] import org.apache.spark.sql.catalyst.expressions.{Nondeterministic, LeafExpression} 
[error]  ^
[error] /home/ubuntu/Dev/spark/sql/core/src/main/scala/org/apache/spark/sql/execution/expressions/SparkPartitionID.scala:30: not found: type Nondeterministic 
[error] private[sql] case object SparkPartitionID extends LeafExpression with Nondeterministic { 
[error]                  ^
[error] /home/ubuntu/Dev/spark/sql/core/src/main/scala/org/apache/spark/sql/sources/ddl.scala:252: value globPathIfNecessary is not a member of org.apache.spark.deploy.SparkHadoopUtil 
[error]    SparkHadoopUtil.get.globPathIfNecessary(qualifiedPattern).map(_.toString).toArray 
[error]        ^
[error] /home/ubuntu/Dev/spark/sql/core/src/main/scala/org/apache/spark/sql/sources/ddl.scala:279: value globPathIfNecessary is not a member of org.apache.spark.deploy.SparkHadoopUtil 
[error]    SparkHadoopUtil.get.globPathIfNecessary(qualifiedPattern).map(_.toString).toArray 
[error]        ^
[error] 8 errors found 
[debug] Compilation failed (CompilerInterface) 
[error] Compile failed at Jul 21, 2015 5:57:38 AM [29.605s] 
[INFO] ------------------------------------------------------------------------ 
[INFO] BUILD FAILURE 
[INFO] ------------------------------------------------------------------------ 
[INFO] Total time: 38.435s 
[INFO] Finished at: Tue Jul 21 05:57:38 UTC 2015 
[INFO] Final Memory: 37M/609M 
[INFO] ------------------------------------------------------------------------ 
[ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.0:compile (scala-compile-first) on project spark-sql_2.10: Execution scala-compile-first of goal net.alchim31.maven:scala-maven-plugin:3.2.0:compile failed. CompileFailed -> [Help 1] 
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.0:compile (scala-compile-first) on project spark-sql_2.10: Execution scala-compile-first of goal net.alchim31.maven:scala-maven-plugin:3.2.0:compile failed. 
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:225) 
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153) 
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145) 
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:84) 
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:59) 
    at org.apache.maven.lifecycle.internal.LifecycleStarter.singleThreadedBuild(LifecycleStarter.java:183) 
    at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:161) 
    at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:320) 
    at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:156) 
    at org.apache.maven.cli.MavenCli.execute(MavenCli.java:537) 
    at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:196) 
    at org.apache.maven.cli.MavenCli.main(MavenCli.java:141) 
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) 
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
    at java.lang.reflect.Method.invoke(Method.java:606) 
    at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:289) 
    at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:229) 
    at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:415) 
    at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:356) 
Caused by: org.apache.maven.plugin.PluginExecutionException: Execution scala-compile-first of goal net.alchim31.maven:scala-maven-plugin:3.2.0:compile failed. 
    at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:110) 
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:209) 
    ... 19 more 
Caused by: Compile failed via zinc server 
    at sbt_inc.SbtIncrementalCompiler.zincCompile(SbtIncrementalCompiler.java:136) 
    at sbt_inc.SbtIncrementalCompiler.compile(SbtIncrementalCompiler.java:86) 
    at scala_maven.ScalaCompilerSupport.incrementalCompile(ScalaCompilerSupport.java:303) 
    at scala_maven.ScalaCompilerSupport.compile(ScalaCompilerSupport.java:119) 
    at scala_maven.ScalaCompilerSupport.doExecute(ScalaCompilerSupport.java:99) 
    at scala_maven.ScalaMojoSupport.execute(ScalaMojoSupport.java:482) 
    at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:101) 
    ... 20 more 
[ERROR] 
[ERROR] 

Answer


Well, there is no globPathIfNecessary in SparkHadoopUtil, so it must come from a modification of your own. When you run the build from the top level, the Maven reactor sees the whole project and therefore sees your changes. When you run the build from a subproject, Maven resolves everything outside that subproject from the local repository, so it cannot see any modifications unless you have installed them. So run the build from the top level again, but execute install instead of package so that the modified artifacts are installed into your local repository. Once you do that, running the build from sql/core should resolve your changes successfully.
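A minimal sketch of that workflow, assuming a standard Spark 1.4.x source checkout (the -DskipTests flag is optional and only there to speed up the install; substitute your usual build options):

```shell
# From the top of the Spark source tree: build all modules and install
# them (including your modified spark-core) into the local Maven repo.
mvn -DskipTests install

# Now the sql/core subproject can resolve its dependencies (with your
# changes) from the local repo, so a module-level build should succeed.
cd sql/core
mvn -DskipTests compile
```

The key point is install rather than package: package only produces the jars inside each module's target/ directory, while install additionally copies them into ~/.m2/repository, which is where a standalone subproject build looks for them.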