The behaviour you are seeing is expected. Let me explain what's going on when you work with hadoop fs commands.
The command's syntax is: hadoop fs -ls [path]
By default, when you don't specify [path] for the above command, hadoop expands the path to /user/[username] in HDFS, where [username] is the Linux username of the user executing the command.
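In other words, under the default configuration (the /user prefix is controlled by the dfs.user.home.dir.prefix property, so your cluster may differ), the bare command and the spelled-out form list the same directory:
ubuntu@101-master:~$ hadoop fs -ls
ubuntu@101-master:~$ hadoop fs -ls /user/ubuntu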
So, when you execute this command:
ubuntu@101-master:~$ hadoop fs -ls
you see the error ls: '.': No such file or directory because hadoop is looking for the path /user/ubuntu, and that path doesn't exist in HDFS.
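One way to make the bare listing work is to create that home directory first, for example:
ubuntu@101-master:~$ hadoop fs -mkdir -p /user/ubuntu
ubuntu@101-master:~$ hadoop fs -ls
If you create the directory as a different user (for example the HDFS superuser), also run hadoop fs -chown ubuntu:ubuntu /user/ubuntu so that the ubuntu user owns it.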
The reason this command:
ubuntu@101-master:~$ hadoop fs -ls hdfs://101-master:50000/
works is that you have explicitly specified [path], and it is the root of HDFS. You can do the same with:
ubuntu@101-master:~$ hadoop fs -ls /
which is resolved against the default file system (the fs.defaultFS setting) and therefore also evaluates to the root of HDFS.
Hope this clears up the behaviour you are seeing while executing the hadoop fs -ls command.
Hence, if you want to specify a local file system path, use the file:/// URL scheme.
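For example, to list your local Linux home directory through the same interface:
ubuntu@101-master:~$ hadoop fs -ls file:///home/ubuntu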