2017-08-04 316 views
-2

I have been using Spark with Python for only about two months, and now I want to set up some Spark Streaming scripts, so I need YARN and Mesos for my Spark applications. However, whenever I use the command "spark-submit --master yarn file.py", it always shows this problem: spark-submit --master yarn does not work and displays the following error:


17/08/04 10:36:24 ERROR client.TransportClient: Failed to send RPC 8506915915091728278 to /192.168.11.164:55857: java.nio.channels.ClosedChannelException 
java.nio.channels.ClosedChannelException 
17/08/04 10:36:24 WARN netty.NettyRpcEndpointRef: Error sending message [message = RequestExecutors(0,0,Map())] in 1 attempts 
org.apache.spark.SparkException: Exception thrown in awaitResult 
    at org.apache.spark.rpc.RpcTimeout$$anonfun$1.applyOrElse(RpcTimeout.scala:77) 
    at org.apache.spark.rpc.RpcTimeout$$anonfun$1.applyOrElse(RpcTimeout.scala:75) 
    at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:36) 
    at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:59) 
    at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:59) 
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:167) 
    at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:83) 
    at org.apache.spark.rpc.RpcEndpointRef.askWithRetry(RpcEndpointRef.scala:102) 
    at org.apache.spark.rpc.RpcEndpointRef.askWithRetry(RpcEndpointRef.scala:78) 
    at org.apache.spark.scheduler.cluster.YarnSchedulerBackend$YarnSchedulerEndpoint$$anonfun$receiveAndReply$1$$anonfun$applyOrElse$1.apply$mcV$sp(YarnSchedulerBackend.scala:271) 
    at org.apache.spark.scheduler.cluster.YarnSchedulerBackend$YarnSchedulerEndpoint$$anonfun$receiveAndReply$1$$anonfun$applyOrElse$1.apply(YarnSchedulerBackend.scala:271) 
    at org.apache.spark.scheduler.cluster.YarnSchedulerBackend$YarnSchedulerEndpoint$$anonfun$receiveAndReply$1$$anonfun$applyOrElse$1.apply(YarnSchedulerBackend.scala:271) 
    at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24) 
    at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24) 
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) 
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) 
    at java.lang.Thread.run(Thread.java:745) 
Caused by: java.io.IOException: Failed to send RPC 8506915915091728278 to /192.168.11.164:55857: java.nio.channels.ClosedChannelException 
    at org.apache.spark.network.client.TransportClient$3.operationComplete(TransportClient.java:239) 
    at org.apache.spark.network.client.TransportClient$3.operationComplete(TransportClient.java:226) 
    at io.netty.util.concurrent.DefaultPromise.notifyListener0(DefaultPromise.java:680) 
    at io.netty.util.concurrent.DefaultPromise$LateListeners.run(DefaultPromise.java:845) 
    at io.netty.util.concurrent.DefaultPromise$LateListenerNotifier.run(DefaultPromise.java:873) 
    at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:357) 
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357) 
    at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111) 
    ... 1 more 
Caused by: java.nio.channels.ClosedChannelException 
+0

The Spark version is 2.0 and the Python version is 3.5.3 –

+0

Could you please post your spark-submit command? –

+0

[root@master addnewpaper]# spark-submit --master yarn demo1.py –

Answer

0

Failed to send RPC 8506915915091728278 to /192.168.11.164:55857: java.nio.channels.ClosedChannelException

Do you have network connectivity between the machine you are submitting the job from and the YARN cluster? This looks like a connection problem.
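A rough way to check connectivity and node health from the submitting machine (a hedged sketch; the host 192.168.11.164 is taken from the log above, and the application id is a placeholder you would copy from the spark-submit output):

# List the NodeManagers and their state as seen by the ResourceManager
yarn node -list -all

# Check that the submitting machine can reach the node named in the error
ping -c 3 192.168.11.164

# Pull the aggregated application logs from YARN to see why the executor side went away
yarn logs -applicationId <applicationId>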

+0

My operations engineer said there is some problem on one of the nodes –

+0

I have asked them to check the node, and they told me that if I use that much memory it will kill the node –
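For what it's worth, that matches a common cause of this trace: when YARN kills an executor container for exceeding its memory allowance, the driver side only sees the ClosedChannelException. A minimal sketch of a more conservative submit, assuming the demo1.py from the comment above (the memory and executor values are illustrative, not tuned for this cluster):

spark-submit \
  --master yarn \
  --deploy-mode client \
  --num-executors 2 \
  --executor-cores 1 \
  --executor-memory 1g \
  --conf spark.yarn.executor.memoryOverhead=512 \
  demo1.py

If the NodeManager is killing containers for exceeding virtual memory limits, relaxing yarn.nodemanager.vmem-check-enabled in yarn-site.xml is another workaround that was commonly suggested for Spark 2.0 on YARN.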