How to enable the Spark history server for a standalone cluster in non-HDFS mode

Asked: 2017-06-29 21:08:14

Tags: apache-spark pyspark

I set up a Spark 2.1.1 cluster (1 master, 2 slaves) in standalone mode, following http://paxcel.net/blog/how-to-setup-apache-spark-standalone-cluster-on-multiple-machine/. I do not have a pre-existing Hadoop setup on the machines. I want to start the Spark history server, so I ran:

roshan@bolt:~/spark/spark_home/sbin$ ./start-history-server.sh

and in spark-defaults.conf I set this:

spark.eventLog.enabled           true

but it fails with this error:

17/06/29 22:59:03 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(roshan); groups with view permissions: Set(); users  with modify permissions: Set(roshan); groups with modify permissions: Set()
17/06/29 22:59:03 INFO FsHistoryProvider: History server ui acls disabled; users with admin permissions: ; groups with admin permissions
Exception in thread "main" java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.spark.deploy.history.HistoryServer$.main(HistoryServer.scala:278)
    at org.apache.spark.deploy.history.HistoryServer.main(HistoryServer.scala)
Caused by: java.io.FileNotFoundException: Log directory specified does not exist: file:/tmp/spark-events Did you configure the correct one through spark.history.fs.logDirectory?
    at org.apache.spark.deploy.history.FsHistoryProvider.org$apache$spark$deploy$history$FsHistoryProvider$$startPolling(FsHistoryProvider.scala:214)

What should I set spark.history.fs.logDirectory and spark.eventLog.dir to?

Update 1:

spark.eventLog.enabled           true
spark.history.fs.logDirectory   file:////home/roshan/spark/spark_home/logs
spark.eventLog.dir               file:////home/roshan/spark/spark_home/logs

but now I keep getting this error:

java.lang.IllegalArgumentException: Codec [1] is not available. Consider setting spark.io.compression.codec=snappy at org.apache.spark.io.Co

2 Answers:

Answer 0 (score: 3)

By default, Spark defines file:/tmp/spark-events as the log directory for the history server, and your log clearly says that spark.history.fs.logDirectory is not configured.

First you could create a spark-events folder in /tmp (not a good idea, since /tmp is flushed every time the machine restarts) and then add spark.history.fs.logDirectory in spark-defaults.conf pointing to that directory. But I suggest you create another folder that the spark user can access, and update spark-defaults.conf accordingly, as sketched below.
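For example, a minimal sketch of creating such a folder (the /opt/spark-events path is only an assumption, reused in the config below; any reboot-safe location writable by the Spark user works the same way):

sudo mkdir -p /opt/spark-events
sudo chown -R roshan:roshan /opt/spark-events   # roshan is the user from the question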

You need to define two more variables in the spark-defaults.conf file:
spark.eventLog.dir              file:path to where you want to store your logs
spark.history.fs.logDirectory   file:same path as above

Assuming you want to store the logs in /opt/spark-events, and the spark user has access to it, spark-defaults.conf would be:

spark.eventLog.enabled          true
spark.eventLog.dir              file:/opt/spark-events
spark.history.fs.logDirectory   file:/opt/spark-events

You can find more information in Monitoring and Instrumentation.
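As a rough end-to-end check, assuming the spark_home layout from the question and the /opt/spark-events configuration above (replace <master-host> with your actual master host; the history server UI listens on port 18080 by default):

cd ~/spark/spark_home
sbin/stop-history-server.sh    # harmless if the server is not running
sbin/start-history-server.sh
# Run any job with event logging enabled, e.g. the bundled Pi example
bin/spark-submit --master spark://<master-host>:7077 examples/src/main/python/pi.py 10
# Then browse http://<master-host>:18080 to see the completed application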

Answer 1 (score: -2)

Try setting this in spark-defaults.conf:

spark.io.compression.codec=org.apache.spark.io.SnappyCompressionCodec
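A quick way to apply this and restart the server, assuming the spark_home layout from the question (the short alias snappy, which the error message itself suggests, should behave the same as the full class name):

echo 'spark.io.compression.codec    org.apache.spark.io.SnappyCompressionCodec' >> ~/spark/spark_home/conf/spark-defaults.conf
~/spark/spark_home/sbin/stop-history-server.sh
~/spark/spark_home/sbin/start-history-server.sh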