
We recently upgraded to CDH 5.1.3 with YARN, and our MapReduce jobs are now failing with the following AccessControlException:

at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:73) 
     at org.apache.hadoop.hdfs.DFSClient.callGetBlockLocations(DFSClient.java:1140) 
     at org.apache.hadoop.hdfs.DFSClient.getLocatedBlocks(DFSClient.java:1128) 
     at org.apache.hadoop.hdfs.DFSClient.getLocatedBlocks(DFSClient.java:1118) 
     at org.apache.hadoop.hdfs.DFSInputStream.fetchLocatedBlocksAndGetLastBlockLength(DFSInputStream.java:264) 
     at org.apache.hadoop.hdfs.DFSInputStream.openInfo(DFSInputStream.java:231) 
     at org.apache.hadoop.hdfs.DFSInputStream.<init>(DFSInputStream.java:224) 
     at org.apache.hadoop.hdfs.DFSClient.open(DFSClient.java:1291) 
     at org.apache.hadoop.hdfs.DistributedFileSystem$3.doCall(DistributedFileSystem.java:300) 
     at org.apache.hadoop.hdfs.DistributedFileSystem$3.doCall(DistributedFileSystem.java:296) 
     at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81) 
     at org.apache.hadoop.hdfs.DistributedFileSystem.open(DistributedFileSystem.java:296) 
     at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:764) 
     at org.apache.hadoop.mapreduce.jobhistory.JobHistoryParser.<init>(JobHistoryParser.java:86) 
     at org.apache.hadoop.mapreduce.v2.hs.CompletedJob.loadFullHistoryData(CompletedJob.java:335) 
     ... 22 more 
Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Permission denied: user=mapred, access=READ, inode="/user/history/done_intermediate/abc/job_1412716537481_0426-1412782860181-abc-PigLatin%3ACategory+lift+for+pixels%3A9259-1412782882528-1-1-SUCCEEDED-root.abc-1412782867082.jhist":abc:supergroup:-rwxrwx--- 
     at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkFsPermission(FSPermissionChecker.java:271) 
     at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:257) 
     at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:185) 
     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:5607) 
     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:5589) 
     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPathAccess(FSNamesystem.java:5551) 
     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocationsUpdateTimes(FSNamesystem.java:1717) 
     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocationsInt(FSNamesystem.java:1669) 
     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocations(FSNamesystem.java:1649) 
     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocations(FSNamesystem.java:1621) 
     at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getBlockLocations(NameNodeRpcServer.java:482) 
     at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getBlockLocations(ClientNamenodeProtocolServerSideTranslatorPB.java:322) 
     at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java) 
     at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585) 
     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1026) 
     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1986) 
     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1982) 
     at java.security.AccessController.doPrivileged(Native Method) 
     at javax.security.auth.Subject.doAs(Subject.java:415) 
     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformatio 

Is there any resolution for this, and what permissions should we have on /user/* and /user/history/*?
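
For context, a minimal sketch (using the standard org.apache.hadoop.fs.FileSystem client API; the class name CheckHistoryDirPerms is just an illustrative placeholder, not part of the original question) of how the current ownership and mode of these paths can be inspected:

    // Illustrative helper (not from the original post): prints owner, group and
    // permission bits for the job-history paths discussed above.
    import java.io.IOException;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileStatus;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class CheckHistoryDirPerms {
        public static void main(String[] args) throws IOException {
            // Picks up core-site.xml / hdfs-site.xml from the classpath.
            Configuration conf = new Configuration();
            try (FileSystem fs = FileSystem.get(conf)) {
                for (String p : new String[] {
                        "/user", "/user/history", "/user/history/done_intermediate"}) {
                    FileStatus st = fs.getFileStatus(new Path(p));
                    System.out.printf("%-40s owner=%s group=%s perm=%s%n",
                            p, st.getOwner(), st.getGroup(), st.getPermission());
                }
            }
        }
    }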

Answer


Resolved after deleting /user/history/done_intermediate/*.
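
The .jhist file in the trace above was owned by abc:supergroup with mode rwxrwx---, so the JobHistory Server (running as mapred, which is not in supergroup) could not read it. Once the stale per-user directories under done_intermediate are removed, they are recreated with the correct group (inherited from done_intermediate) the next time a job writes its history; in CDH 5 the history directories are typically owned by mapred:hadoop with mode 1777. A minimal sketch of that cleanup, assuming the standard org.apache.hadoop.fs.FileSystem API and that it runs as the HDFS superuser or the directory owner (ClearIntermediateDoneDir is an illustrative name, not from the original answer):

    // Illustrative cleanup (not from the original post): recursively removes each
    // per-user subdirectory under the intermediate done directory, i.e. the
    // programmatic equivalent of "hadoop fs -rm -r /user/history/done_intermediate/*".
    import java.io.IOException;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileStatus;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class ClearIntermediateDoneDir {
        public static void main(String[] args) throws IOException {
            Configuration conf = new Configuration();
            try (FileSystem fs = FileSystem.get(conf)) {
                Path doneIntermediate = new Path("/user/history/done_intermediate");
                for (FileStatus st : fs.listStatus(doneIntermediate)) {
                    fs.delete(st.getPath(), true);  // true = recursive delete
                }
            }
        }
    }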