How can I hide key passwords in the Spark logs?

Date: 2017-12-13 08:48:44

Tags: apache-spark ssl

When I run a Spark job, I can see the SSL key password and keystore password (spark.ssl.keyPassword, spark.ssl.keyStorePassword) in plain text in the event log. Can you help me figure out how to hide these passwords in the logs?

I looked at https://issues.apache.org/jira/browse/SPARK-16796 and it seems that fix only hides them in the Web UI. I am not sure how to hide them in the logs as well.

Thanks a lot for any help!!

  

    {"Event":"SparkListenerLogStart","Spark Version":"2.1.1"}
    {"Event":"SparkListenerBlockManagerAdded","Block Manager ID":{"Executor ID":"driver","Host":"xx.xxx.xx.xxx","Port":43556},"Maximum Memory":434031820,"Timestamp":1512750709305}
    {"Event":"SparkListenerEnvironmentUpdate","JVM Information":{"Java Home":"/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.141-1.b16.32.amzn1.x86_64/jre","Java Version":"1.8.0_141 (Oracle Corporation)","Scala Version":"version 2.11.8"},"Spark Properties":{"spark.sql.warehouse.dir":"hdfs:///user/spark/warehouse","spark.yarn.dist.files":"file:/etc/spark/conf/hive-site.xml","spark.executor.extraJavaOptions":"-verbose:gc -XX:+PrintGCDetails -XX:+PrintGCDateStamps -XX:+UseConcMarkSweepGC -XX:CMSInitiatingOccupancyFraction=70 -XX:MaxHeapFreeRatio=70 -XX:+CMSClassUnloadingEnabled -XX:OnOutOfMemoryError='kill -9 %p'","spark.driver.host":"xx.xxx.xx.xxx","spark.serializer.objectStreamReset":"100","spark.history.fs.logDirectory":"hdfs:///var/log/spark/apps","spark.eventLog.enabled":"true","spark.driver.port":"44832","spark.shuffle.service.enabled":"true","spark.rdd.compress":"true","spark.driver.extraLibraryPath":"/usr/lib/hadoop/lib/native:/usr/lib/hadoop-lzo/lib/native","spark.ssl.keyStore":"/usr/share/aws/emr/security/conf/keystore.jks","spark.executorEnv.PYTHONPATH":"{{PWD}}/pyspark.zip {{PWD}}/py4j-0.10.4-src.zip","spark.ssl.enabled":"true","spark.yarn.historyServer.address":"ip-xx-xxx-xx-xxx.xxx.com:18080","spark.ssl.trustStore":"/usr/share/aws/emr/security/conf/truststore.jks","spark.app.name":"claim_line_fact_main","spark.scheduler.mode":"FIFO","spark.network.sasl.serverAlwaysEncrypt":"true","spark.ssl.keyPassword":"xxxxxx","spark.ssl.keyStorePassword":"xxxxxx","spark.executor.id":"driver","spark.driver.extraJavaOptions":"-XX:+UseConcMarkSweepGC -XX:CMSInitiatingOccupancyFraction=70 -XX:MaxHeapFreeRatio=70 -XX:+CMSClassUnloadingEnabled -XX:OnOutOfMemoryError='kill -9 %p'","spark.submit.deployMode":"client","spark.master":"yarn","spark.authenticate.enableSaslEncryption":"true","spark.authenticate":"true","spark.ui.filters":"org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter","spark.executor.extraLibraryPath":"/usr/lib/hadoop/lib/native:/usr/lib/hadoop-lzo/lib/native","spark.sql.hive.metastore.sharedPrefixes":"com.amazonaws.services.dynamodbv2","spark.executor.memory":"5120M","spark.driver.extraClassPath":"/usr/lib/hadoop-lzo/lib/*:/usr/lib/hadoop/hadoop-aws.jar:/usr/share/aws/aws-java-sdk/*:/usr/share/aws/emr/emrfs/conf:/usr/share/aws/emr/emrfs/lib/*:/usr/share/aws/emr/emrfs/auxlib/*:/usr/share/aws/emr/security/conf:/usr/share/aws/emr/security/lib/*","spark.eventLog.dir":"hdfs:///var/log/spark/apps","spark.ssl.protocol":"TLSv1.2","spark.dynamicAllocation.enabled":"true","spark.executor.extraClassPath":"/usr/lib/hadoop-lzo/lib/*:/usr/lib/hadoop/hadoop-aws.jar:/usr/share/aws/aws-java-sdk/*:/usr/share/aws/emr/emrfs/conf:/usr/share/aws/emr/emrfs/lib/*:/usr/share/aws/emr/emrfs/auxlib/*:/usr/share/aws/emr/security/conf:/usr/share/aws/emr/security/lib/*","spark.executor.cores":"4","spark.history.ui.port":"18080","spark.driver.appUIAddress":"http://","spark.yarn.isPython":"true","spark.ssl.trustStorePassword":"xxxxxx","spark.org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter.param.PROXY_HOSTS":"ip-xx-xxx-xx-xxx.xxx.com","spark.ssl.enabledAlgorithms":"TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_RSA_WITH_AES_256_CBC_SHA","spark.org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter.param.PROXY_URI_BASES":"
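To see exactly which properties leak into an event log, one can scan its SparkListenerEnvironmentUpdate events for the sensitive keys. A minimal sketch (the list of key names and the idea of reading the log line by line are assumptions, not part of the original question):

```python
import json

# Spark SSL properties whose values should never appear in plain text.
SENSITIVE_KEYS = (
    "spark.ssl.keyPassword",
    "spark.ssl.keyStorePassword",
    "spark.ssl.trustStorePassword",
)

def find_leaked_secrets(lines):
    """Return the sensitive Spark properties found in event-log lines."""
    leaked = {}
    for line in lines:
        try:
            event = json.loads(line)
        except ValueError:
            continue  # skip truncated or non-JSON lines
        props = event.get("Spark Properties", {})
        for key in SENSITIVE_KEYS:
            if key in props:
                leaked[key] = props[key]
    return leaked
```

Run over the excerpt above (e.g. `find_leaked_secrets(open(path))` with the event-log file path), this would report all three `spark.ssl.*Password` entries.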

1 answer:

Answer 0 (score: 0)

Which messages get logged at the INFO, WARN, and ERROR levels can only be controlled through the log4j.properties file. If you want to hide passwords, or any confidential parameter passed to Spark via -D, you can do so by removing the --verbose flag from your spark-submit command. This worked for me.
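As a sketch of what the answer describes, this is roughly what the submission would look like with --verbose omitted (the script name, keystore paths, and the idea of reading the password from a protected file are hypothetical, not from the original post):

```shell
# No --verbose flag here: spark-submit will not echo every
# configuration property (including --conf / -D values) to stdout.
spark-submit \
  --master yarn \
  --deploy-mode client \
  --conf spark.ssl.enabled=true \
  --conf spark.ssl.keyStore=/usr/share/aws/emr/security/conf/keystore.jks \
  --conf spark.ssl.keyStorePassword="$(cat /path/to/protected/secret)" \
  my_job.py
```

Note that this only affects what spark-submit and log4j print; it does not redact the JSON event log shown in the question, which is written by Spark's event logging independently of log4j.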