2012-07-13

I'm having trouble "downloading" a file from the HDFS file system to the local system (the opposite operation works without a problem). Note: the file exists on the HDFS file system at the specified path.

Here is the code snippet:

Configuration conf = new Configuration();
conf.set("fs.defaultFS", "${NAMENODE_URI}");
FileSystem hdfsFileSystem = FileSystem.get(conf);

String result = "";

Path local = new Path("${SOME_LOCAL_PATH}");
Path hdfs = new Path("${SOME_HDFS_PATH}");

String fileName = hdfs.getName();

if (hdfsFileSystem.exists(hdfs))
{
    hdfsFileSystem.copyToLocalFile(hdfs, local);
    result = "File " + fileName + " copied to local machine on location: " + local;
}
else
{
    result = "File " + fileName + " does not exist on HDFS on location: " + hdfs;
}

return result;

The exception I get is the following:

12/07/13 14:57:46 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable 
Exception in thread "main" java.io.IOException: Cannot run program "cygpath": CreateProcess error=2, The system cannot find the file specified 
    at java.lang.ProcessBuilder.start(Unknown Source) 
    at org.apache.hadoop.util.Shell.runCommand(Shell.java:206) 
    at org.apache.hadoop.util.Shell.run(Shell.java:188) 
    at org.apache.hadoop.fs.FileUtil$CygPathCommand.<init>(FileUtil.java:412) 
    at org.apache.hadoop.fs.FileUtil.makeShellPath(FileUtil.java:438) 
    at org.apache.hadoop.fs.FileUtil.makeShellPath(FileUtil.java:465) 
    at org.apache.hadoop.fs.RawLocalFileSystem.execCommand(RawLocalFileSystem.java:573) 
    at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:565) 
    at org.apache.hadoop.fs.FilterFileSystem.setPermission(FilterFileSystem.java:403) 
    at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:452) 
    at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:420) 
    at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:774) 
    at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:755) 
    at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:654) 
    at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:259) 
    at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:232) 
    at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:183) 
    at org.apache.hadoop.fs.FileSystem.copyToLocalFile(FileSystem.java:1837) 
    at org.apache.hadoop.fs.FileSystem.copyToLocalFile(FileSystem.java:1806) 
    at org.apache.hadoop.fs.FileSystem.copyToLocalFile(FileSystem.java:1782) 
    at com.hmeter.hadoop.hdfs.hdfsoperations.HdfsOperations.fileCopyFromHdfsToLocal(HdfsOperations.java:75) 
    at com.hmeter.hadoop.hdfs.hdfsoperations.HdfsOperations.main(HdfsOperations.java:148) 
Caused by: java.io.IOException: CreateProcess error=2, The system cannot find the file specified 
    at java.lang.ProcessImpl.create(Native Method) 
    at java.lang.ProcessImpl.<init>(Unknown Source) 
    at java.lang.ProcessImpl.start(Unknown Source) 
    ... 22 more 

Any idea what the problem could be? Why does it need Cygwin's cygpath? I'm running this code on Windows 7.

Thanks

Answers


Try using this method from the API:

hdfsFileSystem.copyToLocalFile(delSrc, src, dst, useRawLocalFileSystem);
// where delSrc is whether you want to delete the source, src and dst you
// already have, and useRawLocalFileSystem should be set to true in your case

In your case, replace

hdfsFileSystem.copyToLocalFile(hdfs, local);

with

hdfsFileSystem.copyToLocalFile(false, hdfs, local, true);
Does it work when submitting the job through Oozie? – Abhinay 2016-01-07 09:36:24

@Abhinay I'm not sure, I don't work with this anymore, sorry – ant 2016-01-07 16:56:33
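Putting that fix back into the method from the question, a minimal sketch could look like the following. The `${...}` placeholders are the question's own stand-ins, and the class and method names are hypothetical; the key point is the four-argument overload, whose `useRawLocalFileSystem = true` writes through `RawLocalFileSystem` instead of the checksummed `LocalFileSystem`, which is the code path that shells out to cygpath in the stack trace above:

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsDownload {

    // Copies a file from HDFS to the local file system.
    public static String fileCopyFromHdfsToLocal() throws IOException {
        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "${NAMENODE_URI}");
        FileSystem hdfsFileSystem = FileSystem.get(conf);

        Path local = new Path("${SOME_LOCAL_PATH}");
        Path hdfs = new Path("${SOME_HDFS_PATH}");
        String fileName = hdfs.getName();

        if (hdfsFileSystem.exists(hdfs)) {
            // delSrc = false: keep the HDFS copy;
            // useRawLocalFileSystem = true: skip the checksum wrapper that
            // invokes cygpath on Windows.
            hdfsFileSystem.copyToLocalFile(false, hdfs, local, true);
            return "File " + fileName + " copied to local machine on location: " + local;
        }
        return "File " + fileName + " does not exist on HDFS on location: " + hdfs;
    }
}
```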


You can follow the code shown below:

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public static void main(String args[]) {
    try {
        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "hdfs://localhost:54310/user/hadoop/");
        FileSystem fs = FileSystem.get(conf);
        // List every file in the HDFS directory and copy each one to the local directory
        FileStatus[] status = fs.listStatus(new Path("hdfsdirectory"));
        for (int i = 0; i < status.length; i++) {
            System.out.println(status[i].getPath());
            fs.copyToLocalFile(false, status[i].getPath(), new Path("localdir"));
        }
    } catch (IOException e) {
        e.printStackTrace();
    }
}