Is there a way to get the total size of an HDFS directory in GB or MB? I don't want to use the du command. Is there a way to do it without du?
Directory: /test/my_dir
Answer (score: 1)
You can use df, dfsadmin -report, or hadoop fs -count -q -h; any of these will show the total size.
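For the directory from the question, a minimal sketch of the count invocation (assuming /test/my_dir exists; the -h flag prints human-readable units such as MB or GB):

hadoop fs -count -q -h /test/my_dir

With -q the columns are QUOTA, REM_QUOTA, SPACE_QUOTA, REM_SPACE_QUOTA, DIR_COUNT, FILE_COUNT, CONTENT_SIZE and the path; CONTENT_SIZE is the combined size of all files under the directory. Sample df and dfsadmin -report output from a cluster is shown below.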
[root@hadoop0 ~]# hadoop fs -df -h /
Filesystem            Size     Used   Available   Use%
hdfs://hadoop0:8020   119.9 G  27.8 G   62.3 G    23%
[root@hadoop0 ~]# hadoop dfsadmin -report
DEPRECATED: Use of this script to execute hdfs command is deprecated.
Instead use the hdfs command for it.
Configured Capacity: 128770375680 (119.93 GB)
Present Capacity: 96752292952 (90.11 GB)
DFS Remaining: 66886767274 (62.29 GB)
DFS Used: 29865525678 (27.81 GB)
DFS Used%: 30.87%
Under replicated blocks: 0
Blocks with corrupt replicas: 0
Missing blocks: 0
Missing blocks (with replication factor 1): 0
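If you need the number in MB or GB programmatically rather than from the shell, the HDFS Java API offers FileSystem.getContentSummary. The snippet below is only an illustrative sketch, not part of the original answer; the path /test/my_dir is taken from the question and the class name HdfsDirSize is made up:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.ContentSummary;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsDirSize {
    public static void main(String[] args) throws Exception {
        // Picks up core-site.xml / hdfs-site.xml from the classpath.
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);

        // Total bytes of all files under the directory (logical size, not counting replication).
        ContentSummary summary = fs.getContentSummary(new Path("/test/my_dir"));
        long bytes = summary.getLength();

        System.out.printf("%.2f MB / %.2f GB%n",
                bytes / (1024.0 * 1024),
                bytes / (1024.0 * 1024 * 1024));
        fs.close();
    }
}

getContentSummary is the same information that hadoop fs -count reports, so the value printed here should agree with the CONTENT_SIZE column.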