Hadoop File System Commands

By Eric Downing. Filed in BigData  |  

When working with Hadoop you often need to move files in and out of HDFS. The hadoop fs commands get tedious to type over and over, so I came up with a few aliases to cut down on the typing.

The commands that I have found myself using most often are:

Listing the directories and files:

hadoop fs -ls <filename(s)>

hls <filename(s)>

Viewing the contents of files:

hadoop fs -cat <filename(s)>

hcat <filename(s)>

I usually pipe that into an awk script to verify data.
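For example, a quick field-count check: pipe the cat output into awk and flag any record that doesn't have the expected number of tab-separated fields. The path and field count below are hypothetical, and the pipeline is simulated with printf so it can be seen end to end:

```shell
# Real usage would be something like:
#   hcat /user/eric/data/part-* | awk -F'\t' 'NF != 3 { print NR ": " NF " fields" }'
# Simulated here with printf standing in for the hcat output:
printf 'a\tb\tc\na\tb\n' | awk -F'\t' 'NF != 3 { print NR ": " NF " fields" }'
# prints "2: 2 fields" -- the second record is missing a column
```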

Removing Entries from the HDFS:

hadoop fs -rm <filename(s)>

hrm <filename(s)>

Copying files to HDFS:

hadoop fs -copyFromLocal <local filename> <target directory or filename>

hcpto <local filename> <target directory or filename>

Copying from HDFS to the local system:

hadoop fs -copyToLocal <remote file(s)> <target directory or filename>

hcpfrom <remote file(s)> <target directory or filename>

Note: Replace <filename(s)> with whatever files or directories you want to act on, including using '*' to select multiple files. The copy commands are the exception: their final argument must be a single target directory or filename.
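One shell detail worth knowing about those wildcards: when a glob pattern matches nothing locally, bash passes it through to hadoop unchanged (under default shell options), which is why an HDFS path like /user/eric/part-* usually works unquoted; quoting it makes that explicit. A small demonstration, with a made-up path:

```shell
# With no local match, the unquoted glob is passed through literally
echo /no/such/dir/part-*

# Quoting guarantees the literal pattern regardless of shell options,
# letting HDFS do the glob matching itself
echo '/no/such/dir/part-*'
```

Both lines print /no/such/dir/part-*, so the pattern reaches hadoop intact either way on a default setup.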

Here are the alias entries I use:

alias h='hadoop fs'
alias hcat='hadoop fs -cat '
alias hls='hadoop fs -ls '
alias hrm='hadoop fs -rm '
alias hcpto='hadoop fs -copyFromLocal '
alias hcpfrom='hadoop fs -copyToLocal '
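Once these are in your ~/.bashrc, source the file and check that each alias expands the way you expect before running anything against the cluster; a minimal sketch:

```shell
# Define a couple of the aliases inline (normally they'd be loaded
# via `source ~/.bashrc`)
alias hls='hadoop fs -ls '
alias hcat='hadoop fs -cat '

# The `alias` builtin with a name prints the stored expansion --
# a quick sanity check that costs nothing
alias hls
alias hcat
```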
