Exception in thread "main" java.lang.RuntimeException: [unresolved dependency: com.datastax.spark#spark-cassandra-connector_2.10;2.0.3: not found]

Asked: 2017-08-16 09:26:16

Tags: scala apache-spark cassandra datastax

I have installed the following on a CentOS 7 machine:

- Spark 2.2.0
- Scala 2.11.8
- Java 1.8.0_144
- Cassandra 3.11.0

So the next step was to configure Spark to use Cassandra through the Spark Cassandra Connector. The problem appears when I try to run:

$SPARK_HOME/bin/spark-shell --packages datastax:spark-cassandra-connector:2.0.3-s_2.11

Note that I also tried this:

$SPARK_HOME/bin/spark-shell --packages com.datastax.spark:spark-cassandra-connector_2.10:2.0.3

I got:

...


                ::::::::::::::::::::::::::::::::::::::::::::::

                ::          UNRESOLVED DEPENDENCIES         ::

                ::::::::::::::::::::::::::::::::::::::::::::::

                :: com.datastax.spark#spark-cassandra-connector_2.10;2.0.3: not found

                ::::::::::::::::::::::::::::::::::::::::::::::


...

:: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS
Exception in thread "main" java.lang.RuntimeException: [unresolved dependency: com.datastax.spark#spark-cassandra-connector_2.10;2.0.3: not found]
        at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1177)
        at org.apache.spark.deploy.SparkSubmit$.prepareSubmitEnvironment(SparkSubmit.scala:298)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:153)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:119)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

What am I doing wrong? I noticed that the versions I am using (of Scala, Spark, and Cassandra) do not appear among the compatible versions listed on the spark-cassandra-connector GitHub website.
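For what it's worth, the `_2.10` / `_2.11` suffix in the Maven artifact name refers to the Scala version the library was compiled against, and it generally has to match the Scala version of the Spark build. Since Spark 2.2.0 ships built against Scala 2.11, a coordinate ending in `_2.11` would be the version-consistent choice. A sketch of the invocation under that assumption (requires a working Spark installation and network access to Maven Central, so it is not something that can be verified offline):

```shell
# Coordinate suffix (_2.11) matches Spark 2.2.0's Scala build.
# This resolves from Maven Central rather than the Spark Packages repo.
$SPARK_HOME/bin/spark-shell \
  --packages com.datastax.spark:spark-cassandra-connector_2.11:2.0.3
```

The original attempts mixed versions: the first used the Spark Packages coordinate `datastax:spark-cassandra-connector:2.0.3-s_2.11` (which depends on the spark-packages.org resolver being reachable), and the second asked for `_2.10`, a Scala version Spark 2.2.0 no longer supports.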

0 answers:

No answers