When I try the latest Spark (2.0.1) with Spark Streaming, I set up the context like this:
from pyspark import SparkConf, SparkContext
from pyspark.streaming import StreamingContext

cfg = SparkConf().setAppName('MyApp').setMaster('local[3]')
sc = SparkContext(conf=cfg)
ssc = StreamingContext(sparkContext=sc, batchDuration=1)
ssc.checkpoint('checkpoint')
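The streaming job itself is not shown; judging from the ('hello', 1) / ('world', 1) output below, it is a standard word count. Its per-batch logic, sketched in plain Python without Spark (the function name is illustrative, not part of the original code):

```python
from collections import Counter

def count_batch(lines):
    # Equivalent of flatMap(str.split) -> map(lambda w: (w, 1)) -> reduceByKey(add)
    words = [w for line in lines for w in line.split()]
    return sorted(Counter(words).items())

print(count_batch(["hello world"]))  # [('hello', 1), ('world', 1)]
```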
Then I get this warning over and over:
-------------------------------------------
Time: 2016-10-11 10:08:02
-------------------------------------------
('world', 1)
('hello', 1)
16/10/11 10:08:06 WARN DFSClient: Caught exception
java.lang.InterruptedException
at java.lang.Object.wait(Native Method)
at java.lang.Thread.join(Thread.java:1281)
at java.lang.Thread.join(Thread.java:1355)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeResponder(DFSOutputStream.java:609)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.endBlock(DFSOutputStream.java:370)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:546)
-------------------------------------------
Time: 2016-10-11 10:08:03
-------------------------------------------
('world', 1)
('hello', 1)
What is that? It looks like an HDFS warning. Is it anything important?
I am quite sure this WARN did not appear with Spark 2.0.0.
I think the problem is that hadoop-hdfs.jar was upgraded from v2.7.2 to v2.7.3. Spark 2.0.0 uses 2.7.2, while Spark 2.0.1 uses 2.7.3.
@KenjiNoguchi Yes! That is the reason. After copying hadoop-hdfs-2.7.2.jar from 2.0.0 into 2.0.1, the WARN is gone!
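If pinning the older jar is not an option, the message can also be suppressed through Spark's conf/log4j.properties by lowering that logger's level (a workaround sketch; it hides the WARN rather than removing its cause):

```
log4j.logger.org.apache.hadoop.hdfs.DFSClient=ERROR
```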