I am trying to run working code outside my Eclipse IDE and I am facing a strange error that I cannot figure out. To summarize my problem: a Maven Shade Plugin and DataNucleus issue.
- Running my code from Eclipse: everything works fine.
- Capturing the command line Eclipse uses to launch my application and pasting it into a shell: everything works fine.
The command line Eclipse generates to run my application looks like `java -cp lots-of-jars -Dvm.params myPackage.MyMainClass app-params`.
My goal is to execute my application as an Oozie Java action, so I need to build an uber-jar that collapses lots-of-jars into myapp.jar.
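For context, a minimal Oozie workflow invoking such an uber-jar as a Java action might look like the sketch below (the workflow name, parameters, and transition names are illustrative, not from my actual setup):

```xml
<workflow-app name="myapp-wf" xmlns="uri:oozie:workflow:0.5">
  <start to="java-action"/>
  <action name="java-action">
    <java>
      <job-tracker>${jobTracker}</job-tracker>
      <name-node>${nameNode}</name-node>
      <!-- the uber-jar (myapp.jar) is expected in the workflow's lib/ directory -->
      <main-class>myPackage.MyMainClass</main-class>
      <arg>app-params</arg>
    </java>
    <ok to="end"/>
    <error to="fail"/>
  </action>
  <kill name="fail">
    <message>Java action failed</message>
  </kill>
  <end name="end"/>
</workflow-app>
```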
To build the uber-jar, I configured the Maven Shade Plugin like this:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>2.4.2</version>
  <configuration>
  </configuration>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <filters>
          <filter>
            <artifact>*:*</artifact>
            <excludes>
              <exclude>META-INF/*.SF</exclude>
              <exclude>META-INF/*.DSA</exclude>
              <exclude>META-INF/*.RSA</exclude>
            </excludes>
          </filter>
        </filters>
        <transformers>
          <transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
            <resource>reference.conf</resource>
          </transformer>
          <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
            <mainClass>es.mycompany.bigdata.OozieAction</mainClass>
          </transformer>
          <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
          <transformer implementation="org.apache.maven.plugins.shade.resource.PluginXmlResourceTransformer"/>
        </transformers>
      </configuration>
    </execution>
  </executions>
</plugin>
I added some of those transformers because of errors I started to face in my application (Spring could not create an FsShell object, the SparkContext could not start, ...). By the way, the purpose of my application is to download some Azure blobs, put them into HDFS, transform them with Spark, and finally load them into a Hive table. I developed the application in Java (including the Spark part) using Spring.
Now my latest problem appears when I try to create a HiveContext (my Spark context is fine, since the application works if I omit the Hive part):
@Bean
@Lazy
@Scope("singleton")
public SQLContext getSQLContext(@Autowired JavaSparkContext sparkContext) {
    return new HiveContext(sparkContext);
}
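One way to compare the working classpath with the uber-jar is to list every `plugin.xml` the class loader can see. Each DataNucleus jar (core, api-jdo, rdbms) ships its own `plugin.xml` at the jar root, so a naive shade merge keeps only one of them, which can break the DataNucleus plugin registry. This is a diagnostic sketch; the class name `PluginXmlCheck` is made up:

```java
import java.io.IOException;
import java.net.URL;
import java.util.Collections;
import java.util.Enumeration;
import java.util.List;

public class PluginXmlCheck {
    // Return every plugin.xml visible to the class loader. On the original
    // classpath this finds one entry per DataNucleus jar; inside a naively
    // shaded uber-jar only a single copy survives the merge.
    static List<URL> findPluginXmls() throws IOException {
        Enumeration<URL> resources =
                PluginXmlCheck.class.getClassLoader().getResources("plugin.xml");
        return Collections.list(resources);
    }

    public static void main(String[] args) throws IOException {
        List<URL> found = findPluginXmls();
        System.out.println("plugin.xml count: " + found.size());
        for (URL url : found) {
            System.out.println(url);
        }
    }
}
```

Running it once with the Eclipse-generated classpath and once with `java -cp myapp.jar` should show whether the uber-jar lost the per-jar copies.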
The error thrown:
2017-04-02 20:20:18 WARN Persistence:106 - Error creating validator of type org.datanucleus.properties.CorePropertyValidator
ClassLoaderResolver for class "" gave error on creation : {1}
org.datanucleus.exceptions.NucleusUserException: ClassLoaderResolver for class "" gave error on creation : {1}
...
Caused by: org.datanucleus.exceptions.NucleusUserException: Persistence process has been specified to use a ClassLoaderResolver of name "datanucleus" yet this has not been found by the DataNucleus plugin mechanism. Please check your CLASSPATH and plugin specification.
at org.datanucleus.NucleusContext.<init>(NucleusContext.java:283)
at org.datanucleus.NucleusContext.<init>(NucleusContext.java:247)
at org.datanucleus.NucleusContext.<init>(NucleusContext.java:225)
at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.<init>(JDOPersistenceManagerFactory.java:416)
at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:301)
at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)
... 93 more
2017-04-02 20:20:18 WARN ExtendedAnnotationApplicationContext:550 - Exception encountered during context initialization - cancelling refresh attempt: org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'getOozieJavaAction': Unsatisfied dependency expressed through field 'sqlContext'; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'getSQLContext' defined in es.mediaset.technology.bigdata.config.FlatJsonToCsvAppConfig: Bean instantiation via factory method failed; nested exception is org.springframework.beans.BeanInstantiationException: Failed to instantiate [org.apache.spark.sql.SQLContext]: Factory method 'getSQLContext' threw exception; nested exception is java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
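From what I understand, DataNucleus locates its ClassLoaderResolver through the `plugin.xml` and MANIFEST metadata that every datanucleus-* jar carries at the same path, so flattening them into a single jar loses all but one copy. One workaround sketch (untested; the group pattern is an assumption that may need adjusting per version) would be to keep the DataNucleus jars out of the uber-jar entirely and ship them next to it, for example in the Oozie workflow's lib/ directory:

```xml
<configuration>
  <artifactSet>
    <excludes>
      <!-- keep the DataNucleus jars intact; deploy them alongside the uber-jar instead -->
      <exclude>org.datanucleus:*</exclude>
    </excludes>
  </artifactSet>
</configuration>
```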
As I said, my code runs correctly inside Eclipse, and outside Eclipse with a command like
/usr/java/jdk1.8.0_121/bin/java -Dmode=responsive -Dspark.master=local[*] -Dfile.encoding=UTF-8 -classpath /home/cloudera/workspace-sts/oozie-eventhub-retriever/target/classes:/home/cloudera/workspace-sts/java-framework/target/classes:/home/cloudera/.m2/repository/com/microsoft/azure/azure-storage/5.0.0/azure-storage-5.0.0.jar:<...>:/etc/hive/conf.dist es.mycompany.technology.bigdata.OozieAction json2hive
I guess my shade configuration is wrong, but I don't understand why; I can't see what I'm doing wrong...
Thanks
And do those "lots of jars" you are merging include the datanucleus-XXX jars? If so, did you merge the plugin.xml files from those jars? And the Manifest files? –
When I execute my application from a terminal using the Eclipse command, my classpath contains the datanucleus-XXX jars. If I open my uber-jar, the classes from the datanucleus-XXX jars are inside it. I added the PluginXmlResourceTransformer as you suggested, but got the same result. My resulting Manifest file is: Manifest-Version: 1.0, Build-Jdk: 1.7.0_67, Built-By: cloudera, Created-By: Apache Maven 3.3.9, Main-Class: es.mycompany.technology.bigdata.OozieAction, Archiver-Version: Plexus Archiver – Cheloute
I have no idea what "PluginXmlResourceTransformer" is or what it does, so who knows whether it does the job correctly. If you really want to merge jars together, I suggest you look at http://www.datanucleus.org/servlet/forum/viewthread_thread,8020_lastpage,yes#lastpost –