There is a vast range of commands supported by HDFS. If you are familiar with Linux commands, this will feel much easier.
The local file system is the file system of your own computer. If you are using Windows, the Windows operating system has its own way of managing files and folders; the same applies to Linux.
We have also seen that Hadoop has its own way of storing files, called HDFS, and that the Data Nodes are independent computers that receive instructions from the Name Node about where to store files. So, if you think about it, the Name Node works at the level of the Hadoop Distributed File System, whereas each Data Node stores its blocks on its own local file system (i.e. Linux).
The two commands that help us interact with HDFS are 'hadoop fs' and 'hdfs dfs'. The only difference is that 'hdfs dfs' deals only with the HDFS file system, while 'hadoop fs' can work with other file systems as well.
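A minimal sketch of the difference, assuming a machine with the Hadoop client configured (the paths are illustrative):

```shell
# Both forms list the HDFS root directory
hadoop fs -ls /
hdfs dfs -ls /

# 'hadoop fs' also accepts other file system URIs,
# e.g. file:// for the local file system
hadoop fs -ls file:///tmp
```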
The 'mkdir' command is used to create a directory in HDFS.
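For example, assuming the Hadoop client is configured:

```shell
# Create a directory named 'newTestDir' under the HDFS root
hdfs dfs -mkdir /newTestDir
```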
A directory named 'newTestDir' is created under the root directory.
To copy files from the local file system to HDFS, the 'copyFromLocal' command is used.
The above command copies employee.csv from your local file system to the newly created directory 'newTestDir' in HDFS.
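For example, assuming employee.csv sits in your local working directory:

```shell
# Copy employee.csv from the local file system into /newTestDir in HDFS
hdfs dfs -copyFromLocal employee.csv /newTestDir
```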
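The 'put' command, sketched below with the same assumed paths, is the more general equivalent:

```shell
# 'put' also copies local files into HDFS; unlike 'copyFromLocal',
# it can additionally read from other sources such as stdin
hdfs dfs -put employee.csv /newTestDir
```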
The above command does the same thing, i.e. it copies employee.csv from your local file system to the newly created directory 'newTestDir' in HDFS.
To copy files from HDFS to the local file system, the 'get' command is used.
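For example, assuming the file lives under /newTestDir:

```shell
# Copy employee.csv from HDFS into the current local directory
hdfs dfs -get /newTestDir/employee.csv .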
The above command copies employee.csv from HDFS to your local file system.
To view a file in HDFS, the 'cat' command is used.
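For example, assuming the same file path as above:

```shell
# Print the contents of employee.csv stored in HDFS to the terminal
hdfs dfs -cat /newTestDir/employee.csv
```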
The above command displays the contents of employee.csv in HDFS.
To display the contents of a directory in HDFS, the 'ls' command is used.
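For example:

```shell
# List the files and subdirectories of /newTestDir in HDFS
hdfs dfs -ls /newTestDir
```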
The above command displays the contents of the directory 'newTestDir' in HDFS.
To delete files from HDFS, the 'rm' command is used.
The above command deletes all the files from the 'newTestDir' directory in HDFS.
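For example, assuming you want to clear out the directory created earlier:

```shell
# Delete all files inside /newTestDir in HDFS
# (add -r to delete directories recursively)
hdfs dfs -rm /newTestDir/*
```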