
Spark 1.6.1 SASL — I'm wondering whether anyone has gotten SASL working with Spark 1.6.1 on YARN?

Basically, the Spark documentation states that you only need to enable three parameters:

spark.authenticate.enableSaslEncryption=true  
spark.network.sasl.serverAlwaysEncrypt=true 
spark.authenticate=true 

http://spark.apache.org/docs/latest/security.html
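
For context, these properties can be set in spark-defaults.conf or passed directly on the command line. A minimal spark-submit sketch of the latter — the class and jar names are placeholders, not anything from the original job:

# com.example.MyApp and my-app.jar are placeholders for illustration only
spark-submit \
  --master yarn \
  --deploy-mode client \
  --conf spark.authenticate=true \
  --conf spark.authenticate.enableSaslEncryption=true \
  --conf spark.network.sasl.serverAlwaysEncrypt=true \
  --class com.example.MyApp \
  my-app.jar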

However, when I launch my Spark job with --master yarn and --deploy-mode client, I see the following in my Spark executor logs:

6/05/17 06:50:51 ERROR client.TransportClientFactory: Exception while bootstrapping client after 29 ms 

java.lang.RuntimeException: java.lang.IllegalArgumentException: Unknown message type: -22 
     at org.apache.spark.network.shuffle.protocol.BlockTransferMessage$Decoder.fromByteBuffer(BlockTransferMessage.java:67) 
     at org.apache.spark.network.shuffle.ExternalShuffleBlockHandler.receive(ExternalShuffleBlockHandler.java:71) 
     at org.apache.spark.network.server.TransportRequestHandler.processRpcRequest(TransportRequestHandler.java:149) 
     at org.apache.spark.network.server.TransportRequestHandler.handle(TransportRequestHandler.java:102) 
     at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:104) 
     at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:51) 
     at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105) 
     at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:333) 
     at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:319) 
     at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:254) 
     at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:333) 
     at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:319) 
     at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103) 
     at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:333) 
     at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:319) 
     at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:86) 
     at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:333) 
     at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:319) 
     at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:787) 
     at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:130) 
     at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511) 
     at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468) 
     at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382) 
     at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354) 
     at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:116) 
     at java.lang.Thread.run(Thread.java:745) 

I'm still troubleshooting this, but if anyone has seen this before, that would be great.

Maybe this can help: https://issues.apache.org/jira/browse/SPARK-6420 – RoyaumeIX

Hi @Fabian Tan, I'm facing exactly the same issue. Did you manage to debug it? – shridharama

Answer


You also need to set spark.authenticate=true in the YARN configuration.

Excerpt from YarnShuffleService.java in the Spark codebase:

* The service also optionally supports authentication. This ensures that executors from one 
* application cannot read the shuffle files written by those from another. This feature can be 
* enabled by setting `spark.authenticate` in the Yarn configuration before starting the NM. 
* Note that the Spark application must also set `spark.authenticate` manually and, unlike in 
* the case of the service port, will not inherit this setting from the Yarn configuration. This 
* is because an application running on the same Yarn cluster may choose to not use the external 
* shuffle service, in which case its setting of `spark.authenticate` should be independent of 
* the service's. 

You can do this by adding the following to your Hadoop core-site.xml configuration:

<property> 
    <name>spark.authenticate</name> 
    <value>true</value> 
</property>
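
For completeness, the external shuffle service itself is registered as a NodeManager auxiliary service in yarn-site.xml, per the standard Spark-on-YARN setup (this is not shown in the original answer, just the usual companion configuration):

<!-- yarn-site.xml: register Spark's external shuffle service with the NodeManager -->
<property> 
    <name>yarn.nodemanager.aux-services</name> 
    <value>mapreduce_shuffle,spark_shuffle</value> 
</property> 
<property> 
    <name>yarn.nodemanager.aux-services.spark_shuffle.class</name> 
    <value>org.apache.spark.network.yarn.YarnShuffleService</value> 
</property>

The spark-&lt;version&gt;-yarn-shuffle.jar must also be on the NodeManager classpath, and the NodeManagers need to be restarted after changing these settings.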