
Scala Spark: ListBuffer is empty

In the code below, the length of the ListBuffer items is printed correctly at comment 1, but the code at the second comment is never executed. Why does this happen?

import org.apache.spark.{SparkConf, SparkContext}
import scala.collection.mutable.ListBuffer
import com.vividsolutions.jts.io.WKTReader

val conf = new SparkConf().setAppName("app").setMaster("local")
val sc = new SparkContext(conf)

val wktReader: WKTReader = new WKTReader()
val dataSet = sc.textFile("dataSet.txt")

val items = new ListBuffer[String]()
dataSet.foreach { e =>
    items += e
    println("len = " + items.length) // 1. here the length is printed correctly
}

println("!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!")
items.foreach { x => print(x) } // 2. this never prints anything

Here is the log:

    16/11/20 01:16:52 INFO Utils: Successfully started service 'SparkUI' on port 4040.
    16/11/20 01:16:52 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://192.168.56.1:4040
    16/11/20 01:16:53 INFO Executor: Starting executor ID driver on host localhost
    16/11/20 01:16:53 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 58608.
    16/11/20 01:16:53 INFO NettyBlockTransferService: Server created on 192.168.56.1:58608
    16/11/20 01:16:53 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 192.168.56.1, 58608)
    16/11/20 01:16:53 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.56.1:58608 with 347.1 MB RAM, BlockManagerId(driver, 192.168.56.1, 58608)
    16/11/20 01:16:53 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 192.168.56.1, 58608)
    Starting app
    16/11/20 01:16:57 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 139.6 KB, free 347.0 MB)
    16/11/20 01:16:58 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 15.9 KB, free 346.9 MB)
    16/11/20 01:16:58 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on 192.168.56.1:58608 (size: 15.9 KB, free: 347.1 MB)
    16/11/20 01:16:58 INFO SparkContext: Created broadcast 0 from textFile at main.scala:25
    16/11/20 01:16:58 INFO FileInputFormat: Total input paths to process : 1
    16/11/20 01:16:58 INFO SparkContext: Starting job: foreach at main.scala:28
    16/11/20 01:16:58 INFO DAGScheduler: Got job 0 (foreach at main.scala:28) with 1 output partitions
    16/11/20 01:16:58 INFO DAGScheduler: Final stage: ResultStage 0 (foreach at main.scala:28)
    16/11/20 01:16:58 INFO DAGScheduler: Parents of final stage: List()
    16/11/20 01:16:58 INFO DAGScheduler: Missing parents: List()
    16/11/20 01:16:58 INFO DAGScheduler: Submitting ResultStage 0 (dataSet.txt MapPartitionsRDD[1] at textFile at main.scala:25), which has no missing parents
    16/11/20 01:16:58 INFO MemoryStore: Block broadcast_1 stored as values in memory (estimated size 3.3 KB, free 346.9 MB)
    16/11/20 01:16:58 INFO MemoryStore: Block broadcast_1_piece0 stored as bytes in memory (estimated size 2034.0 B, free 346.9 MB)
    16/11/20 01:16:58 INFO BlockManagerInfo: Added broadcast_1_piece0 in memory on 192.168.56.1:58608 (size: 2034.0 B, free: 347.1 MB)
    16/11/20 01:16:58 INFO SparkContext: Created broadcast 1 from broadcast at DAGScheduler.scala:1012
    16/11/20 01:16:59 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 0 (dataSet.txt MapPartitionsRDD[1] at textFile at main.scala:25)
    16/11/20 01:16:59 INFO TaskSchedulerImpl: Adding task set 0.0 with 1 tasks
    16/11/20 01:16:59 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, localhost, partition 0, PROCESS_LOCAL, 5427 bytes)
    16/11/20 01:16:59 INFO Executor: Running task 0.0 in stage 0.0 (TID 0)
    16/11/20 01:16:59 INFO HadoopRDD: Input split: file:/D:/dataSet.txt:0+291
    16/11/20 01:16:59 INFO deprecation: mapred.tip.id is deprecated. Instead, use mapreduce.task.id
    16/11/20 01:16:59 INFO deprecation: mapred.task.id is deprecated. Instead, use mapreduce.task.attempt.id
    16/11/20 01:16:59 INFO deprecation: mapred.task.is.map is deprecated. Instead, use mapreduce.task.ismap
    16/11/20 01:16:59 INFO deprecation: mapred.task.partition is deprecated. Instead, use mapreduce.task.partition
    16/11/20 01:16:59 INFO deprecation: mapred.job.id is deprecated. Instead, use mapreduce.job.id
    len = 1
    len = 2
    len = 3
    len = 4
    len = 5
    len = 6
    len = 7
    16/11/20 01:16:59 INFO Executor: Finished task 0.0 in stage 0.0 (TID 0). 989 bytes result sent to driver
    16/11/20 01:16:59 INFO TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 417 ms on localhost (1/1)
    16/11/20 01:16:59 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool
    16/11/20 01:16:59 INFO DAGScheduler: ResultStage 0 (foreach at main.scala:28) finished in 0,456 s
    16/11/20 01:16:59 INFO DAGScheduler: Job 0 finished: foreach at main.scala:28, took 0,795126 s
    !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
    16/11/20 01:16:59 INFO SparkContext: Invoking stop() from shutdown hook
    16/11/20 01:16:59 INFO SparkUI: Stopped Spark web UI at http://192.168.56.1:4040
    16/11/20 01:16:59 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
    16/11/20 01:16:59 INFO MemoryStore: MemoryStore cleared
    16/11/20 01:16:59 INFO BlockManager: BlockManager stopped
    16/11/20 01:16:59 INFO BlockManagerMaster: BlockManagerMaster stopped
    16/11/20 01:16:59 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
    16/11/20 01:16:59 INFO SparkContext: Successfully stopped SparkContext
    16/11/20 01:16:59 INFO ShutdownHookManager: Shutdown hook called
    16/11/20 01:16:59 INFO ShutdownHookManager: Deleting directory


You don't need to add the elements to a ListBuffer and then do a foreach; you can print them directly from the first foreach. If you want something else, you need to edit your question – Shankar


@kangaroo0: Also, if you run this locally it should print the output; if you run it on a cluster it won't print the values, because the logic you write in 'transformations and actions' runs on different machines [in completely different JVMs] – Shankar

Answer


Here:

dataSet.foreach { e =>
    items += e
    println("len = " + items.length) // 1. here the length is printed correctly
}

modifies a local copy of items on each executor: Spark serializes the closure (including items) and ships it with the task, so each task works on its own copy. The original items defined on the driver is never modified. As a result:

items.foreach { x => print(x) }

does execute, but there is nothing to print: the driver-side buffer is still empty.
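If the goal is simply to print the elements on the driver, a minimal sketch is to collect the RDD first, rather than mutating driver state from inside a closure (this assumes the data set is small enough to fit in driver memory):

// Bring the RDD's contents back to the driver, then iterate locally.
// collect() pulls the whole data set into driver memory, so use it
// only for small data sets (otherwise prefer take(n)).
val localItems: Array[String] = dataSet.collect()
println("len = " + localItems.length)
localItems.foreach(x => print(x))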

See Understanding closures in the Spark programming guide.
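If you really do need to accumulate values from the executors back to the driver, accumulators are the mechanism Spark provides for that; a sketch, assuming Spark 2.x where SparkContext offers collectionAccumulator:

import scala.collection.JavaConverters._

// A CollectionAccumulator is written to on the executors and its
// accumulated value is read on the driver after the job finishes.
val acc = sc.collectionAccumulator[String]("items")
dataSet.foreach { e => acc.add(e) }   // add() runs on the executors
// acc.value is a java.util.List[String]; read it on the driver
acc.value.asScala.foreach(x => print(x))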