
I am trying to connect SQL Server 2016 to Hadoop through PolyBase, and external table access fails with "Permission denied". My code is:

CREATE EXTERNAL DATA SOURCE MyHadoopCluster WITH (
    TYPE = HADOOP,
    LOCATION = 'hdfs://192.168.114.20:8020',
    CREDENTIAL = HadoopUser1
);
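The HadoopUser1 credential referenced above has to exist as a database scoped credential before the data source is created. A minimal sketch of how it would be set up (the master key password and the IDENTITY/SECRET values here are placeholders, not the real ones):

-- Sketch only: a database master key is required before any
-- database scoped credential can be created in the database.
-- All literal values below are placeholders.
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<StrongPassword!123>';

CREATE DATABASE SCOPED CREDENTIAL HadoopUser1
WITH IDENTITY = '<hadoop_user_name>', SECRET = '<hadoop_password>';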


CREATE EXTERNAL FILE FORMAT TextFileFormat WITH (
    FORMAT_TYPE = DELIMITEDTEXT,
    FORMAT_OPTIONS (
        FIELD_TERMINATOR = '\001',  -- Ctrl-A, Hive's default field delimiter
        USE_TYPE_DEFAULT = TRUE
    )
);


CREATE EXTERNAL TABLE [dbo].[test_hadoop] (
    [Market_Name] int NOT NULL,
    [Claim_GID] int NOT NULL,
    [Completion_Flag] int NULL,
    [Diag_CDE] float NOT NULL,
    [Patient_GID] int NOT NULL,
    [Record_ID] int NOT NULL,
    [SRVC_FROM_DTE] int NOT NULL
)
WITH (
    LOCATION = '/applications/gidr/processing/lnd/sha/clm/cf/claim_diagnosis',
    DATA_SOURCE = MyHadoopCluster,
    FILE_FORMAT = TextFileFormat
);
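The failure shows up either at CREATE EXTERNAL TABLE time (PolyBase lists the HDFS directory to validate the location) or as soon as the table is touched, e.g. with something like this illustrative query:

-- Illustrative query; the same HDFS directory listing happens here,
-- which is where the error below surfaces.
SELECT TOP 10 * FROM [dbo].[test_hadoop];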

This is the error I get:

EXTERNAL TABLE access failed due to internal error: 'Java exception raised on call to HdfsBridge_GetDirectoryFiles: Error [Permission denied: user=pdw_user, access=READ_EXECUTE, inode="/applications/gidr/processing/lnd/sha/clm/cf/claim_diagnosis":root:supergroup:drwxrwxr--
    at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:281)
    at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:262)
    at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:175)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:152)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6590)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6572)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPathAccess(FSNamesystem.java:6497)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getListingInt(FSNamesystem.java:5034)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getListing(FSNamesystem.java:4995)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getListing(NameNodeRpcServer.java:882)
    at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.getListing(AuthorizationProviderProxyClientProtocol.java:335)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getListing(ClientNamenodeProtocolServerSideTranslatorPB.java:615)
    at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:617)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1073)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2086)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2082)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1693)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2080)
] occurred while accessing external file.'

The problem is that in the latest version of PolyBase there is no configuration file where you can specify a default Hadoop login and password. So even when I create a database scoped credential, PolyBase still connects as the default pdw_user. I even tried creating a pdw_user account on the Hadoop side, but I still get this error. Any ideas?
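For reference, one way to confirm that the scoped credential really exists in the database is the catalog view below (a minimal illustrative check, assuming SQL Server 2016's sys.database_scoped_credentials):

-- Illustrative check: list database scoped credentials in the current database.
SELECT name, credential_identity
FROM sys.database_scoped_credentials;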

Answer