
I am implementing an online classification service with Apache Spark. I ran into a problem integrating Dropwizard and Apache Spark when submitting the application to a standalone cluster with this script:

$SPARK_HOME/bin/spark-submit \ 
    --class com.example.msclassification.MscApplication \
    --master spark://192.168.55.165:7077 \ 
    --deploy-mode cluster \ 
    --executor-memory 2G \ 
    --total-executor-cores 4 \ 
    ./build/libs/msclassification-0.0.1-all.jar -server configuration.yml 

It throws an exception like this:

16/03/01 11:25:45 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable 
Exception in thread "main" java.lang.reflect.InvocationTargetException 
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) 
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
    at java.lang.reflect.Method.invoke(Method.java:606) 
    at org.apache.spark.deploy.worker.DriverWrapper$.main(DriverWrapper.scala:58) 
    at org.apache.spark.deploy.worker.DriverWrapper.main(DriverWrapper.scala) 
Caused by: java.lang.IllegalStateException: Unable to acquire the logger context 
    at io.dropwizard.logging.LoggingUtil.getLoggerContext(LoggingUtil.java:46) 
    at io.dropwizard.logging.BootstrapLogging.bootstrap(BootstrapLogging.java:45) 
    at io.dropwizard.logging.BootstrapLogging.bootstrap(BootstrapLogging.java:34) 
    at io.dropwizard.Application.<init>(Application.java:24) 
    at com.example.msclassification.MscApplication.<init>(MscApplication.groovy) 
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) 
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57) 
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) 
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526) 
    at org.codehaus.groovy.reflection.CachedConstructor.invoke(CachedConstructor.java:77) 
    at org.codehaus.groovy.runtime.callsite.ConstructorSite$ConstructorSiteNoUnwrapNoCoerce.callConstructor(ConstructorSite.java:102) 
    at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCallConstructor(CallSiteArray.java:57) 
    at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callConstructor(AbstractCallSite.java:182) 
    at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callConstructor(AbstractCallSite.java:186) 
    at com.example.msclassification.MscApplication.main(MscApplication.groovy:23) 
    ... 6 more 

While testing I ran into a logging conflict, which I resolved with the following Gradle build script:

group 'com.example' 
version '0.0.1' 

apply plugin: 'groovy' 
apply plugin: 'com.github.johnrengelman.shadow' 
apply plugin: 'application' 

sourceCompatibility = 1.7 
mainClassName = "com.example.msclassification.MscApplication" 
repositories { 
    mavenCentral() 
} 
configurations.all {
    resolutionStrategy {
        force 'com.fasterxml.jackson.core:jackson-databind:2.4.4'
    }
}
dependencies { 
    compile ('org.codehaus.groovy:groovy-all:2.3.11') 
    compile ("io.dropwizard:dropwizard-core:${project.properties.dropwizardVersion}") 
    compile ("io.dropwizard:dropwizard-jdbi:${project.properties.dropwizardVersion}") 
    compile ("org.elasticsearch:elasticsearch:${project.properties.elasticsearchVersion}") 
    compile 'org.slf4j:log4j-over-slf4j:1.7.18' 
    compile 'nz.ac.waikato.cms.weka:weka-dev:3.7.11' 
    compile 'org.codehaus.gpars:gpars:1.2.1' 
    compile ('commons-io:commons-io:2.4') 
    compile 'mysql:mysql-connector-java:5.1.38' 
    compile 'com.fasterxml.jackson.core:jackson-databind:2.4.4' 
    compile('org.apache.spark:spark-core_2.10:1.6.0') {
        exclude group: 'org.slf4j'
    }
    compile('org.apache.spark:spark-mllib_2.10:1.6.0') {
        exclude group: 'org.slf4j'
    }

// testCompile group: 'junit', name: 'junit', version: '4.11' 
    testCompile 'org.spockframework:spock-core:0.7-groovy-2.0' 
} 

buildscript {
    repositories { jcenter() }
    dependencies {
        classpath 'com.github.jengelman.gradle.plugins:shadow:1.2.2'
    }
}


//State the main entry and merge service files 
shadowJar {
    exclude 'META-INF/*.DSA' 
    exclude 'META-INF/*.RSA' 
    mergeServiceFiles() 
    zip64 true 
} 

runShadow {
    args 'server', "${project.properties.dropwizardConfig}" 
} 

In this build script I use jackson-databind 2.4.4 and force that version to resolve the package conflict with Spark, which pulls in an older Jackson. Spark uses the slf4j-log4j12 binding for its logging, whereas Dropwizard uses Logback, and I have found no way to exclude one or force the other without errors. Can you suggest anything? I would greatly appreciate it, thanks!

Answer


You probably have log4j on your classpath, which causes getILoggerFactory to return something other than the LoggerContext expected by this code.
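As a quick diagnostic, here is a minimal sketch (not part of Dropwizard itself) that prints which SLF4J binding actually won on the classpath; if it reports anything other than Logback's ch.qos.logback.classic.LoggerContext, Dropwizard's bootstrap will fail exactly as above:

import org.slf4j.ILoggerFactory
import org.slf4j.LoggerFactory

// Print the concrete ILoggerFactory implementation SLF4J selected at runtime.
// Dropwizard expects ch.qos.logback.classic.LoggerContext; a log4j binding
// pulled in by Spark will make this print a different class.
ILoggerFactory factory = LoggerFactory.getILoggerFactory()
println "Active SLF4J binding: ${factory.getClass().name}"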

To find out which dependency is pulling in log4j, consider running:

gradle -q dependencyInsight --configuration compile --dependency log4j 

Spark actually pulls in an explicit dependency on log4j (not just through slf4j), so that is most likely the source of your problem. Explicitly exclude log4j from the Spark dependencies and you should be good.
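For example, a minimal sketch of that exclusion in the Gradle script above (assuming the same spark-core and spark-mllib coordinates) could look like this; the extra log4j exclusion is the part the original script is missing:

// Exclude Spark's explicit log4j dependency as well as the slf4j-log4j12
// binding, so Logback stays the only SLF4J backend on the classpath.
compile('org.apache.spark:spark-core_2.10:1.6.0') {
    exclude group: 'org.slf4j', module: 'slf4j-log4j12'
    exclude group: 'log4j', module: 'log4j'
}
compile('org.apache.spark:spark-mllib_2.10:1.6.0') {
    exclude group: 'org.slf4j', module: 'slf4j-log4j12'
    exclude group: 'log4j', module: 'log4j'
}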