How to use the history server in Apache Spark

Time: 2017-04-16 21:53:03

Tags: apache-spark

I enabled event logging for one of my Apache Spark jobs at university using the following snippet.

  conf = SparkConf()\
         .setAppName("Ex").set("spark.eventLog.enabled", "true")\
         .set("spark.eventLog.dir", "log")

After the job finished, I copied the log file app-20170416171823-0000 to my local system and tried to browse the logged Spark web UI with the following command.

sbin/start-history-server.sh ~/Downloads/log/app-20170416171823-0000

But the history server terminated with the following error:

failed to launch: nice -n 0 /usr/local/Cellar/apache-spark/2.1.0/libexec/bin/spark-class org.apache.spark.deploy.history.HistoryServer /Users/sk/Downloads/log/app-20170416171823-0000
    at org.apache.spark.deploy.history.FsHistoryProvider.<init>(FsHistoryProvider.scala:77)
    ... 6 more
full log in /usr/local/Cellar/apache-spark/2.1.0/libexec/logs/spark-sk-org.apache.spark.deploy.history.HistoryServer-1-SKs-MacBook-Pro.local.out

Output of the history server:

17/04/16 17:44:52 INFO SecurityManager: Changing modify acls groups to: 
17/04/16 17:44:52 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(skaran); groups with view permissions: Set(); users  with modify permissions: Set(skaran); groups with modify permissions: Set()
Exception in thread "main" java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.spark.deploy.history.HistoryServer$.main(HistoryServer.scala:278)
    at org.apache.spark.deploy.history.HistoryServer.main(HistoryServer.scala)
Caused by: java.lang.IllegalArgumentException: Logging directory specified is not a directory: file:/Users/sk/Downloads/log/app-20170416171823-0000
    at org.apache.spark.deploy.history.FsHistoryProvider.org$apache$spark$deploy$history$FsHistoryProvider$$startPolling(FsHistoryProvider.scala:198)
    at org.apache.spark.deploy.history.FsHistoryProvider.initialize(FsHistoryProvider.scala:153)
    at org.apache.spark.deploy.history.FsHistoryProvider.<init>(FsHistoryProvider.scala:149)
    at org.apache.spark.deploy.history.FsHistoryProvider.<init>(FsHistoryProvider.scala:77)
    ... 6 more

1 Answer:

Answer 0 (score: 0)

It seems the argument should be the folder containing the logs, not a single event-log file. The exception says as much: "Logging directory specified is not a directory: file:/Users/sk/Downloads/log/app-20170416171823-0000".
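For example, a sketch using the paths from the question (the exact directory is an assumption based on where the file was copied): point the history server at the directory that contains the event-log file, either via the positional argument or via `spark.history.fs.logDirectory`.

```shell
# Option 1: pass the containing directory (not the log file) as the argument.
sbin/start-history-server.sh ~/Downloads/log

# Option 2: configure the log directory explicitly; the start script
# picks up SPARK_HISTORY_OPTS when launching the HistoryServer.
export SPARK_HISTORY_OPTS="-Dspark.history.fs.logDirectory=file:/Users/sk/Downloads/log"
sbin/start-history-server.sh
```

The history server then scans that directory for application logs and serves the UI (by default on port 18080), so `app-20170416171823-0000` should appear in its application list.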