I am trying to run a Spark script with the following command:
spark-submit --packages org.apache.spark:spark-streaming-kafka-0-10_2.11:2.3.0 src/sparkProcessing.py
I get an 'unresolved dependency' error, as shown below.
I am using Spark 2.3.0, Scala 2.12, and Kafka 1.1.0.
Here is the error I get:
:: modules in use:
---------------------------------------------------------------------
| | modules || artifacts |
| conf | number| search|dwnlded|evicted|| number|dwnlded|
---------------------------------------------------------------------
| default | 1 | 0 | 0 | 0 || 0 | 0 |
---------------------------------------------------------------------
:: problems summary ::
:::: WARNINGS
module not found: org.apache.spark#spark-streaming-kafka-0-10;2.3.0
http://dl.bintray.com/spark-packages/maven/org/apache/spark/spark-streaming-kafka-0-10/2.3.0/spark-streaming-kafka-0-10-2.3.0.jar
::::::::::::::::::::::::::::::::::::::::::::::
:: UNRESOLVED DEPENDENCIES ::
::::::::::::::::::::::::::::::::::::::::::::::
:: org.apache.spark#spark-streaming-kafka-0-10;2.3.0: not found
::::::::::::::::::::::::::::::::::::::::::::::
:::: ERRORS
Server access error at url https://repo1.maven.org/maven2/org/apache/spark/spark-streaming-kafka-0-10/2.3.0/spark-streaming-kafka-0-10-2.3.0.pom (javax.net.ssl.SSLException: java.lang.RuntimeException: Unexpected error: java.security.InvalidAlgorithmParameterException: the trustAnchors parameter must be non-empty)
Server access error at url https://repo1.maven.org/maven2/org/apache/spark/spark-streaming-kafka-0-10/2.3.0/spark-streaming-kafka-0-10-2.3.0.jar (javax.net.ssl.SSLException: java.lang.RuntimeException: Unexpected error: java.security.InvalidAlgorithmParameterException: the trustAnchors parameter must be non-empty)
:: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS
Exception in thread "main" java.lang.RuntimeException: [unresolved dependency: org.apache.spark#spark-streaming-kafka-0-10;2.3.0: not found]
at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1270)
at org.apache.spark.deploy.DependencyUtils$.resolveMavenDependencies(DependencyUtils.scala:49)
at org.apache.spark.deploy.SparkSubmit$.prepareSubmitEnvironment(SparkSubmit.scala:350)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:170)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:136)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Answer (score: 0):
Solved this problem by following https://stackoverflow.com/a/50688351/5808464 : I cleaned out the other Java alternatives and installed the OpenJDK I had removed earlier.
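For context, the "java.security.InvalidAlgorithmParameterException: the trustAnchors parameter must be non-empty" part of the SSLException usually means the JVM's cacerts trust store is empty or missing, so --packages cannot reach Maven Central over HTTPS, and Ivy then reports the artifact as unresolved. Below is a minimal sketch of that kind of repair on a Debian/Ubuntu system; the openjdk-8-jdk and ca-certificates-java package names and the /etc/ssl/certs/java/cacerts path are assumptions and may differ on your distribution.

# Show the registered Java alternatives and pick the OpenJDK entry.
sudo update-alternatives --config java

# Reinstall OpenJDK plus the helper package that populates the Java trust store.
sudo apt-get install --reinstall openjdk-8-jdk ca-certificates-java

# Regenerate the Java cacerts file from the system CA certificates.
sudo update-ca-certificates -f

# Verify the trust store is no longer empty before re-running spark-submit.
keytool -list -keystore /etc/ssl/certs/java/cacerts -storepass changeit | head

Once the trust store is populated again, the same spark-submit --packages command should be able to download the spark-streaming-kafka dependency from Maven Central.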