I have a bunch of directories in HDFS:
stuff A: /stuff/prod/data/inputData
stuff B: /stuff/prod/data/global/holdingpen
stuff C: /stuff/prod/data/global/keepers
stuff D: /stuff/prod/data/global/actionDFiles
stuff E: /stuff/prod/data/global/expired
stuff H: /stuff/prod/data/global/actionHFiles
stuff L: /stuff/prod/data/global/billableSessionTopXRecompute
stuff R: /stuff/prod/data/global/billingIdTimeRecompute
stuff U: /stuff/prod/data/global/uniqueStats
stuff Z: /stuff/prod/data/global/cleanupOldVersions
Currently, I just run something like the following against each directory:
hadoop fs -ls /stuff/prod/data/global/actionHFiles | wc -l
20658
I'd like to do this in a bash script so it simply prints out something like:
Stuff A: 183729
Stuff B: 281948
and so on. Can anyone help? Thanks.
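
For reference, here is a minimal sketch of the kind of loop I have in mind (untested; it assumes the hadoop CLI is on the PATH and uses the labels/paths listed above):

#!/usr/bin/env bash
# Parallel arrays keep the labels and paths in the same order,
# then run one "hadoop fs -ls | wc -l" per path and print "Label: count".

labels=("Stuff A" "Stuff B" "Stuff C" "Stuff D" "Stuff E"
        "Stuff H" "Stuff L" "Stuff R" "Stuff U" "Stuff Z")

paths=(/stuff/prod/data/inputData
       /stuff/prod/data/global/holdingpen
       /stuff/prod/data/global/keepers
       /stuff/prod/data/global/actionDFiles
       /stuff/prod/data/global/expired
       /stuff/prod/data/global/actionHFiles
       /stuff/prod/data/global/billableSessionTopXRecompute
       /stuff/prod/data/global/billingIdTimeRecompute
       /stuff/prod/data/global/uniqueStats
       /stuff/prod/data/global/cleanupOldVersions)

for i in "${!labels[@]}"; do
  # Note: "hadoop fs -ls" prints a "Found N items" header line, so the raw
  # wc -l count is one higher than the number of entries; subtract 1 or
  # filter that line out if the exact entry count matters.
  count=$(hadoop fs -ls "${paths[$i]}" | wc -l)
  printf '%s: %s\n' "${labels[$i]}" "$count"
done

The two arrays just need to stay in the same order; the pairs could equally be read from a small config file if the directory list changes often.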