How to enumerate files in an HDFS directory

Date: 2016-06-19 02:33:09

Tags: scala hadoop apache-spark hdfs

How do I enumerate the files in an HDFS directory? This is about enumerating files on an Apache Spark cluster using Scala. I see there is the sc.textFile() option, but that reads the contents as well. I want to read only the file names.

I actually tried listStatus, but it did not work and I got the error below. I am using Azure HDInsight Spark, and the blob storage folder "testContainer@testhdi.blob.core.windows.net/example/" contains .json files.

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.{FileSystem, Path}

val fs = FileSystem.get(new Configuration())
val status = fs.listStatus(new Path("wasb://testContainer@testhdi.blob.core.windows.net/example/"))
status.foreach(x => println(x.getPath))

=========
Error:
========
java.io.FileNotFoundException: File wasb://testContainer@testhdi.blob.core.windows.net/example does not exist.
    at org.apache.hadoop.fs.azure.NativeAzureFileSystem.listStatus(NativeAzureFileSystem.java:2076)
    at $iwC$$iwC$$iwC$$iwC.<init>(<console>:23)
    at $iwC$$iwC$$iwC.<init>(<console>:28)
    at $iwC$$iwC.<init>(<console>:30)
    at $iwC.<init>(<console>:32)
    at <init>(<console>:34)
    at .<init>(<console>:38)
    at .<clinit>(<console>)
    at .<init>(<console>:7)
    at .<clinit>(<console>)
    at $print(<console>)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
    at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346)
    at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
    at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
    at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
    at com.cloudera.livy.repl.scalaRepl.SparkInterpreter$$anonfun$executeLine$1.apply(SparkInterpreter.scala:272)
    at com.cloudera.livy.repl.scalaRepl.SparkInterpreter$$anonfun$executeLine$1.apply(SparkInterpreter.scala:272)
    at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
    at scala.Console$.withOut(Console.scala:126)
    at com.cloudera.livy.repl.scalaRepl.SparkInterpreter.executeLine(SparkInterpreter.scala:271)
    at com.cloudera.livy.repl.scalaRepl.SparkInterpreter.executeLines(SparkInterpreter.scala:246)
    at com.cloudera.livy.repl.scalaRepl.SparkInterpreter.execute(SparkInterpreter.scala:104)
    at com.cloudera.livy.repl.Session.com$cloudera$livy$repl$Session$$executeCode(Session.scala:98)
    at com.cloudera.livy.repl.Session$$anonfun$3.apply(Session.scala:73)
    at com.cloudera.livy.repl.Session$$anonfun$3.apply(Session.scala:73)
    at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
    at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)

Thanks!

2 Answers:

Answer 0 (score: 3)

The reason this is failing is that it is actually looking in the default storage container rather than in testContainer, so it cannot find the example folder. You can see this by changing the path to wasb://testContainer@testhdi.blob.core.windows.net/, which will list files from a different container.

I am not sure why this happens, but I found that you can fix it by also passing the path to the FileSystem.get call:

import java.net.URI
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.{FileSystem, Path}

val uri = new URI("wasb://testContainer@testhdi.blob.core.windows.net/example/")
val fs = FileSystem.get(uri, new Configuration())
val status = fs.listStatus(new Path(uri))
status.foreach(x => println(x.getPath))
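
A note on the design choice: FileSystem.get(new Configuration()) returns the filesystem named by fs.defaultFS, which on HDInsight points at the cluster's default storage container, so the example folder is looked up in the wrong place. Passing the URI explicitly selects the filesystem for the container named in the path. As a minimal sketch of an equivalent idiom (assuming the same wasb URI as above), the Path can resolve its own filesystem:

// Equivalent idiom: let the Path resolve the matching FileSystem.
// The path below is the example container from the question.
val examplePath = new Path("wasb://testContainer@testhdi.blob.core.windows.net/example/")
val pathFs = examplePath.getFileSystem(new Configuration())
pathFs.listStatus(examplePath).foreach(s => println(s.getPath))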

Answer 1 (score: 2)

See the FileSystem class:

abstract FileStatus[] listStatus(Path f)

List the statuses of the files/directories in the given path if the path is a directory.

val fs = FileSystem.get(new Configuration())
val status = fs.listStatus(new Path(HDFS_PATH))
status.foreach(x => println(x.getPath))
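
Since the question asks for only the file names, a minimal follow-up sketch (using the status array from the snippet above) keeps just the last path component:

// getName returns only the final component of the path, e.g. "data.json"
val fileNames = status.map(_.getPath.getName)
fileNames.foreach(println)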

Note: you can access the HDFS API from any JVM language such as Java or Scala. Below is a Java example:

/**
 * Method listFileStats.
 *
 * @param destination
 * @param fs
 * @throws FileNotFoundException
 * @throws IOException
 */
public static void listFileStats(final String destination, final FileSystem fs) throws FileNotFoundException, IOException {
    final FileStatus[] statuses = fs.listStatus(new Path(destination));
    for (final FileStatus status : statuses) {
        LOG.info("--  status {}    ", status.toString());
        LOG.info("Human readable size {} of file ", FileUtils.byteCountToDisplaySize(status.getLen())); // import org.apache.commons.io.FileUtils;
    }
}
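
If the directory tree is nested, Hadoop's FileSystem.listFiles can walk it recursively. Here is a minimal Scala sketch, assuming a Hadoop 2.x FileSystem; the root path below is the example container from the question, so substitute your own URI:

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.Path

// Recursively list every file (directories are skipped) under rootPath.
val rootPath = new Path("wasb://testContainer@testhdi.blob.core.windows.net/example/")
val rfs = rootPath.getFileSystem(new Configuration())
val files = rfs.listFiles(rootPath, true) // true = descend into subdirectories
while (files.hasNext) {
  println(files.next().getPath)
}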