Spark streaming driver and application work file cleanup
2017-02-09

I am running Spark 2.0.2 on a Spark standalone cluster, with a streaming job deployed in cluster deploy mode. The streaming job runs fine, but there is a problem with the application and driver stderr files created under the work directory in SPARK_HOME. Since the streaming job runs continuously, these files only keep growing in size, and I do not know how to control them.

I have tried the following solutions; even though they are not exactly about the problem at hand, I tried them anyway and they did not work:

  1. Apache Spark does not delete temporary directories
  2. How to log using log4j to local file system inside a Spark application that runs on YARN?

Can anyone please help me limit the size of these files as they are being created?

P.S.: I tried the solution of adding the following lines to conf/spark-env.sh and restarting the cluster, but it does not work while the streaming application keeps running (as far as I can tell from the Spark docs, these worker-cleanup settings only purge the directories of stopped applications, which would explain why they have no effect here).

export SPARK_WORKER_OPTS="-Dspark.worker.cleanup.enabled=true -Dspark.worker.cleanup.interval=60 -Dspark.worker.cleanup.appDataTtl=60" 
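For reference, the files in question are the per-application and per-driver logs under the standalone worker's work directory. A quick way to watch their growth (a sketch, assuming the default SPARK_HOME/work layout used in this post):

# stderr of every executor (work/app-*/<executor-id>/) and of every driver (work/driver-*/)
du -sh $SPARK_HOME/work/app-*/*/stderr $SPARK_HOME/work/driver-*/stderr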

EDIT

@YuvalItzchakov I tried your suggestion as well, but it did not work. The driver's stderr log is as follows:

Launch Command: "/usr/lib/jvm/java-8-openjdk-amd64/jre/bin/java" "-cp" "/mnt/spark2.0.2/conf/:/mnt/spark2.0.2/jars/*" "-Xmx2048M" "-Dspark.eventLog.enabled=true" "-Dspark.eventLog.dir=/mnt/spark2.0.2/JobsLogs" "-Dspark.executor.memory=2g" "-Dspark.deploy.defaultCores=2" "-Dspark.io.compression.codec=snappy" "-Dspark.submit.deployMode=cluster" "-Dspark.shuffle.consolidateFiles=true" "-Dspark.shuffle.compress=true" "-Dspark.app.name=Streamingjob" "-Dspark.kryoserializer.buffer.max=128M" "-Dspark.master=spark://172.16.0.27:7077" "-Dspark.shuffle.spill.compress=true" "-Dspark.serializer=org.apache.spark.serializer.KryoSerializer" "-Dspark.cassandra.input.fetch.size_in_rows=20000" "-Dspark.executor.extraJavaOptions=-Dlog4j.configuration=file:///mnt/spark2.0.2/sparkjars/log4j.xml" "-Dspark.jars=file:/mnt/spark2.0.2/sparkjars/StreamingJob-assembly-0.1.0.jar" "-Dspark.executor.instances=10" "-Dspark.driver.extraJavaOptions=-Dlog4j.configuration=file:///mnt/spark2.0.2/sparkjars/log4j.xml" "-Dspark.driver.memory=2g" "-Dspark.rpc.askTimeout=10" "-Dspark.eventLog.compress=true" "-Dspark.executor.cores=1" "-Dspark.driver.supervise=true" "-Dspark.history.fs.logDirectory=/mnt/spark2.0.2/JobsLogs" "-Dlog4j.configuration=file:///mnt/spark2.0.2/sparkjars/log4j.xml" "org.apache.spark.deploy.worker.DriverWrapper" "spark://[email protected]:34475" "/mnt/spark2.0.2/work/driver-20170210124424-0001/StreamingJob-assembly-0.1.0.jar" "Streamingjob" 
======================================== 

log4j:WARN No appenders could be found for logger (org.apache.hadoop.metrics2.lib.MutableMetricsFactory). 
log4j:WARN Please initialize the log4j system properly. 
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info. 
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties 
17/02/10 12:44:26 INFO SecurityManager: Changing view acls to: cassuser 
17/02/10 12:44:26 INFO SecurityManager: Changing modify acls to: cassuser 
17/02/10 12:44:26 INFO SecurityManager: Changing view acls groups to: 
17/02/10 12:44:26 INFO SecurityManager: Changing modify acls groups to: 
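(The log above shows log4j complaining about missing appenders and then falling back to Spark's default profile, which suggests the custom -Dlog4j.configuration value was never picked up by the driver JVM. A standard log4j 1.x way to trace configuration loading, not something tried in the original post, is to enable log4j's internal debugging:)

# log4j 1.x prints its own configuration-loading steps to stdout when log4j.debug=true
--conf "spark.driver.extraJavaOptions=-Dlog4j.debug=true -Dlog4j.configuration=file:///mnt/spark2.0.2/sparkjars/log4j.xml"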

And my log4j.xml file looks like this:

<?xml version="1.0" encoding="UTF-8"?> 
<!DOCTYPE log4j:configuration SYSTEM "log4j.dtd" > 
<log4j:configuration> 
    <appender name="stdout" class="org.apache.log4j.RollingFileAppender"> 
     <param name="threshold" value="TRACE"/> 
     <param name="File" value="stdout"/> 
     <param name="maxFileSize" value="1MB"/> 
     <param name="maxBackupIndex" value="10"/> 
     <layout class="org.apache.log4j.PatternLayout"> 
      <param name="ConversionPattern" value="%d{yyyy-MM-dd HH:mm:ss} %-5p %c{1}:%L - %m%n"/> 
     </layout> 
     <filter class="org.apache.log4j.varia.LevelRangeFilter"> 
      <param name="levelMin" value="ALL" /> 
      <param name="levelMax" value="OFF" /> 
     </filter> 
    </appender> 

    <appender name="stderr" class="org.apache.log4j.RollingFileAppender"> 
     <param name="threshold" value="WARN"/> 
     <param name="File" value="stderr"/> 
     <param name="maxFileSize" value="1MB"/> 
     <param name="maxBackupIndex" value="10"/> 
     <layout class="org.apache.log4j.PatternLayout"> 
      <param name="ConversionPattern" value="%d{yyyy-MM-dd HH:mm:ss} %-5p %c{1}:%L - %m%n"/> 
     </layout> 
    </appender> 
</log4j:configuration> 

Note that I removed the following root tag from the XML in your answer, because it was giving some errors (it references an appender named console, which is not defined anywhere in the file):

<root> 
    <appender-ref ref="console"/> 
</root> 
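(For reference, a root element that points at the appenders actually defined above, instead of the undefined console one, would look like this; a sketch, not something from the original post:)

<root> 
    <priority value="INFO"/> 
    <appender-ref ref="stdout"/> 
    <appender-ref ref="stderr"/> 
</root> 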

You can supply a custom log4j.xml that uses a rolling file appender to the workers and the master.


@YuvalItzchakov Can you tell me how to do that?

Answer


You can use a custom log4j XML file.

First, declare the XML file:

<?xml version="1.0" encoding="UTF-8"?> 
<!DOCTYPE log4j:configuration SYSTEM "log4j.dtd" > 
<log4j:configuration> 
    <appender name="stdout" class="org.apache.log4j.RollingFileAppender"> 
     <param name="threshold" value="TRACE"/> 
     <param name="File" value="stdout"/> 
     <param name="maxFileSize" value="50MB"/> 
     <param name="maxBackupIndex" value="100"/> 
     <layout class="org.apache.log4j.PatternLayout"> 
      <param name="ConversionPattern" value="%d{yyyy-MM-dd HH:mm:ss} %-5p %c{1}:%L - %m%n"/> 
     </layout> 
     <filter class="org.apache.log4j.varia.LevelRangeFilter"> 
      <param name="levelMin" value="ALL" /> 
      <param name="levelMax" value="OFF" /> 
     </filter> 
    </appender> 

    <appender name="stderr" class="org.apache.log4j.RollingFileAppender"> 
     <param name="threshold" value="WARN"/> 
     <param name="File" value="stderr"/> 
     <param name="maxFileSize" value="50MB"/> 
     <param name="maxBackupIndex" value="100"/> 
     <layout class="org.apache.log4j.PatternLayout"> 
      <param name="ConversionPattern" value="%d{yyyy-MM-dd HH:mm:ss} %-5p %c{1}:%L - %m%n"/> 
     </layout> 
    </appender> 
    <root> 
     <appender-ref ref="console"/> 
    </root> 
</log4j:configuration> 
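As a sizing note (my arithmetic, not part of the original answer): RollingFileAppender rolls a file once it reaches maxFileSize and keeps at most maxBackupIndex rotated copies, so the configuration above bounds each of stdout and stderr at roughly 50 MB × (100 + 1) ≈ 5 GB. Tighter limits are a one-line change per appender:

<param name="maxFileSize" value="10MB"/> 
<param name="maxBackupIndex" value="5"/> 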

Then, when you run your streaming job, you need to pass the log4j.xml file to the Spark driver and executors via extraJavaOptions:

spark-submit \ 
  --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=file:///path/to/log4j.xml" \ 
  --conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=file:///path/to/log4j.xml" 

Note that the path may differ on the master and worker nodes, depending on how you deploy the JAR and files to Spark. You said you are using cluster mode, so I assume you ship the JAR and the extra files manually; for anyone running this in client mode, you also need to add the XML file via the --files flag, as sketched below.
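(A client-mode variant might look like the following; a sketch with placeholder paths. --files ships the XML toward the executors, while the driver, which runs locally in client mode, reads its copy directly. file:log4j.xml is a relative file URL resolved against the executor's working directory; whether the shipped file is present there before log4j initializes depends on the cluster manager, so treat this as a starting point only.)

spark-submit \ 
  --deploy-mode client \ 
  --files /local/path/log4j.xml \ 
  --conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=file:///local/path/log4j.xml" \ 
  --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=file:log4j.xml" \ 
  --class Streamingjob /local/path/StreamingJob-assembly-0.1.0.jar 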


Will try this..!!


See the edited question; I have added the driver stderr log. By the way, this did not work. Please tell me if I am doing something wrong. I am submitting the job with: spark-submit --deploy-mode cluster --supervise --conf "spark.eventLog.enabled=true" --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=file:///mnt/spark2.0.2/sparkjars/log4j.xml" --conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=file:///mnt/spark2.0.2/sparkjars/log4j.xml" --master spark://172.16.0.27:7077 --class Streamingjob /mnt/spark2.0.2/sparkjars/StreamingJob-assembly-0.1.0.jar


Any comments..?
