No such file or directory error when doing: 'hdfs dfs -mkdir ~/test'


    Hadoop 2.7.3 is installed and seems to be running normally, but I get the above error regardless of where I try to create the directory.

    Permissions are set to full read/write/execute under the same account that I'm running Hadoop. Even when I create the folder outside of Hadoop, just using the terminal, I still get the error.

    Any ideas why this command is failing? Is there anything I can check to see what could be the problem? Running hdfs dfsadmin -report doesn't seem to show anything interesting:

    Configured Capacity: 38916653056 (36.24 GB)
    Present Capacity: 3312762880 (3.09 GB)
    DFS Remaining: 3312730112 (3.09 GB)
    DFS Used: 32768 (32 KB)
    DFS Used%: 0.00%
    Under replicated blocks: 0
    Blocks with corrupt replicas: 0
    Missing blocks: 0
    Missing blocks (with replication factor 1): 0

    -------------------------------------------------
    Live datanodes (1):

    Name: 127.0.0.1:50010 (localhost)
    Hostname: debian.debiandomain
    Decommission Status : Normal
    Configured Capacity: 38916653056 (36.24 GB)
    DFS Used: 32768 (32 KB)
    Non DFS Used: 35603890176 (33.16 GB)
    DFS Remaining: 3312730112 (3.09 GB)
    DFS Used%: 0.00%
    DFS Remaining%: 8.51%
    Configured Cache Capacity: 0 (0 B)
    Cache Used: 0 (0 B)
    Cache Remaining: 0 (0 B)
    Cache Used%: 100.00%
    Cache Remaining%: 0.00%
    Xceivers: 1
    Last contact: Sun Jan 08 11:45:19 PST 2017
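A hedged diagnostic sketch, not from the thread: the likely cause is that the *local* shell expands `~` to the local home directory (e.g. `/home/debian`) before `hdfs` ever sees it, so HDFS is asked to create that literal path, whose parent directories usually do not exist inside HDFS. `mkdir` without `-p` refuses to create missing parents, hence "No such file or directory". The commands below assume a default single-node install and are guarded so they are harmless where no `hdfs` client is present:

```shell
# The local shell, not HDFS, expands "~" -- so the daemon is asked to
# create e.g. /home/debian/test *inside HDFS*, whose parents don't exist.
echo ~    # prints the local home directory, which HDFS knows nothing about

# Inspect what actually exists at the HDFS root (guarded so the snippet
# is a no-op on a machine without the hdfs client installed):
if command -v hdfs >/dev/null 2>&1; then
    hdfs dfs -ls /    # typically shows /tmp, /user, ... -- but no /home
fi
```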

  • #2
    Maybe you should try removing the "~" and see if it works.
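To expand on that suggestion with a hedged sketch (paths assume a default single-node setup): dropping the `~` makes `hdfs dfs` resolve the path relative to `/user/<username>` in HDFS, but a fresh install does not create that home directory automatically, so it has to be made first with `-p`:

```shell
# HDFS resolves relative paths against /user/<username>, which a fresh
# install does not create automatically. Create it once, then a
# relative mkdir succeeds.
HDFS_HOME="/user/$(whoami)"

if command -v hdfs >/dev/null 2>&1; then
    hdfs dfs -mkdir -p "$HDFS_HOME"    # -p creates any missing parents
    hdfs dfs -mkdir test               # resolves to /user/<username>/test
    hdfs dfs -ls                       # relative ls of the new home dir
fi
```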
