Spark 1.4.0: FileNotFoundException for the truststore file

Asked: 2015-07-08 07:28:04

Tags: java hadoop ssl apache-spark kerberos

I am using Spark 1.4.0 with hadoop-2.6.0. I enabled SSL via spark.ssl.enabled. When I submit a sample job, the following exception appears in the NodeManager logs:

java.io.FileNotFoundException: C:\Spark\conf\spark.truststore (The system cannot find the path specified)

When I place the truststore file on a different drive (say D:), I get the following exception instead:

java.io.FileNotFoundException: D:\Spark_conf\spark.truststore (The device is not ready)

I have specified the keystore and truststore locations correctly. I am following the Spark configuration documentation to set up SSL and ACLs.

spark-defaults.conf

spark.authenticate              true
spark.acls.enable               true
spark.admin.acls                Kumar
spark.modify.acls               Kumar
spark.ui.view.acls              Kumar
spark.ssl.enabled               true
spark.ssl.enabledAlgorithms     TLS_RSA_WITH_AES_128_CBC_SHA,TLS_RSA_WITH_AES_256_CBC_SHA
spark.ssl.keyPassword           password
spark.ssl.keyStore              C:/Spark/conf/spark.keystore
spark.ssl.keyStorePassword      password
spark.ssl.protocol              TLSv1
spark.ssl.trustStore            C:/Spark/conf/spark.truststore
spark.ssl.trustStorePassword    password
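Before looking at Spark itself, it is worth confirming on every node that the truststore actually exists at the configured path and opens with the configured password. A minimal sketch of that check, with the path and password copied from the config above (keytool ships with the JDK):

```shell
# Path and password as configured in spark-defaults.conf above
TRUSTSTORE="C:/Spark/conf/spark.truststore"

# 1) Does the file exist at this exact path on this node?
ls -l "$TRUSTSTORE"

# 2) Can keytool open it with the configured store password?
keytool -list -keystore "$TRUSTSTORE" -storepass password
```

If either step fails on a worker node, the executor there will hit the same FileNotFoundException, because the path is resolved against that node's local filesystem.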

Please help me resolve this issue.

1 answer:

Answer 0 (score: -1)

This may be a backslash issue. The error shows the file path as

C:\Spark\conf\spark.truststore 

while the configuration file writes it as:

C:/Spark/conf/spark.keystore
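To rule the separator question in or out, a small standalone check can ask java.io.File directly whether the configured path resolves on a given node. This is a sketch, not part of Spark; the class name and default path are assumptions, with the path taken from the question's config:

```java
import java.io.File;

// Hypothetical diagnostic class: run it on each node to see whether the
// truststore path from spark-defaults.conf resolves on that machine.
public class TruststoreCheck {
    public static void main(String[] args) {
        // Default path copied from the question's config; java.io.File on
        // Windows accepts forward slashes as well as backslashes.
        String path = args.length > 0 ? args[0] : "C:/Spark/conf/spark.truststore";
        File f = new File(path);
        System.out.println("path:     " + f.getAbsolutePath());
        System.out.println("exists:   " + f.exists());
        System.out.println("readable: " + f.canRead());
    }
}
```

If this prints `exists: false` on a worker node even though the file is present on the driver machine, the problem is that the file is missing on that node rather than the slash direction, since Java normalizes both separators on Windows.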