Copying files to and from HDFS
In order to copy a file from the local file system to HDFS, we can use the FS shell command copyFromLocal.
For example:
% hadoop fs -copyFromLocal LabsVideos/Videos/VideoSample164.mpeg hdfs://VideosDataNode1/user/Lab1/VideoSample164.mpeg
In order to copy a file from HDFS to the local file system, we can use the FS shell command copyToLocal.
For example:
% hadoop fs -copyToLocal hdfs://DataResultsNode/user/Lab1/Results080513.xls LabsVideos/BenchMarksResults/Results080513.xls
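The same copies can also be done from Java code. The sketch below is illustrative only: it uses the classic FileSystem API rather than the shell, reuses the sample paths from the commands above, and assumes the cluster configuration (core-site.xml / hdfs-site.xml) is on the classpath so that the default file system points at the cluster.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class CopySample {
    public static void main(String[] args) throws Exception {
        // Connects to the default file system defined in the cluster configuration
        FileSystem fs = FileSystem.get(new Configuration());
        // Equivalent of: hadoop fs -copyFromLocal <local src> <hdfs dst>
        fs.copyFromLocalFile(
                new Path("LabsVideos/Videos/VideoSample164.mpeg"),
                new Path("/user/Lab1/VideoSample164.mpeg"));
        // Equivalent of: hadoop fs -copyToLocal <hdfs src> <local dst>
        fs.copyToLocalFile(
                new Path("/user/Lab1/VideoSample164.mpeg"),
                new Path("LabsVideos/Videos/VideoSample164.copy.mpeg"));
    }
}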
Accessing HDFS from code
HDFS can be accessed and manipulated in several ways:
1. Java interface
The new API, based on the AbstractFileSystem and FileContext classes, provides an easy interface for accessing files on all nodes in the cluster.
The main class in the new API is the FileContext class.
An example of using the new API:
// FileContext, FSDataInputStream and Path come from org.apache.hadoop.fs; FsPermission from org.apache.hadoop.fs.permission
FileContext myFContext = FileContext.getFileContext(); // uses the default config and the default file system
// Set the working directory to a path on another node
myFContext.setWorkingDirectory(new Path("hdfs://NotLocalDataResultsNode/user/Lab1/VideosList"));
// Open an FSDataInputStream at the indicated path
FSDataInputStream theFSDataInputStream = myFContext.open(new Path("VideoSample2"));
// Read from the FSDataInputStream
byte[] dataBuffer = new byte[1000];
theFSDataInputStream.read(0, dataBuffer, 0, 1000);

// Use theFSDataInputStream ...

theFSDataInputStream.close();

// Create a new directory
myFContext.mkdir(new Path("NewVideoSamplesDir"), FsPermission.getDirDefault(), true);
The FileContext class lives in the org.apache.hadoop.fs package. Documentation about the FileContext interface can be found here.
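For completeness, the same FileContext can also be used to write a file. The sketch below is illustrative only: the file name VideoSamplesList.txt and its content are made-up examples, and it assumes the default configuration as in the read example above.

import java.util.EnumSet;
import org.apache.hadoop.fs.CreateFlag;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileContext;
import org.apache.hadoop.fs.Path;

public class FileContextWriteSample {
    public static void main(String[] args) throws Exception {
        // Uses the default configuration and the default file system
        FileContext fc = FileContext.getFileContext();
        // Create (or overwrite) a file; the path and content are illustrative
        FSDataOutputStream out = fc.create(
                new Path("VideoSamplesList.txt"),
                EnumSet.of(CreateFlag.CREATE, CreateFlag.OVERWRITE));
        out.write("VideoSample164.mpeg".getBytes("UTF-8"));
        out.close();
    }
}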
2. Accessing HDFS using the C libhdfs library. Documentation can be found here.
3. Accessing HDFS using HTTP calls; a minimal sketch follows below.
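For the HTTP option, one possible way is the WebHDFS REST API (GET .../webhdfs/v1/<path>?op=OPEN). The sketch below assumes WebHDFS is enabled on the NameNode; the host and port (VideosDataNode1:50070) are placeholders for this lab's setup, and depending on the cluster's security settings a user.name query parameter may also be required.

import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class WebHdfsOpenSample {
    public static void main(String[] args) throws Exception {
        // OPEN is answered with an HTTP redirect to a DataNode that streams the file content
        URL url = new URL("http://VideosDataNode1:50070/webhdfs/v1/user/Lab1/VideoSample164.mpeg?op=OPEN");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("GET");
        conn.setInstanceFollowRedirects(true); // follow the redirect to the DataNode
        InputStream in = conn.getInputStream();
        byte[] buffer = new byte[4096];
        int total = 0;
        int n;
        while ((n = in.read(buffer)) != -1) {
            total += n;
        }
        in.close();
        System.out.println("Read " + total + " bytes over WebHDFS");
    }
}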