Unable to start spark-shell with spark-cassandra-connector

Time: 2018-01-05 11:16:47

Tags: apache-spark

I have built a Cassandra cluster with 3 nodes and installed a Spark cluster on top of it.

When I launch spark-shell on one of the VMs with the following command, it fails to start.

spark-shell -v --master spark://storm.c.gcp20170324.internal:7077 \
  --packages datastax:spark-cassandra-connector:2.0.6-s_2.11 \
  --conf spark.cassandra.connection.host=10.128.0.4 \
  --conf spark.cassandra.read.timeout_ms=2400000 \
  --conf spark.cassandra.query.retry.count=600 \
  --conf spark.cassandra.connection.timeout_ms=50000 \
  --conf spark.cassandra.input.split.size_in_mb=67108864 \
  --conf spark.network.timeout=600s \
  --conf spark.executor.heartbeatInterval=100s

I get the following error:

Ivy Default Cache set to: /home/nmj/.ivy2/cache
The jars for the packages stored in: /home/nmj/.ivy2/jars
:: loading settings :: url = jar:file:/opt/spark-2.1.2-bin-hadoop2.7/jars/ivy-2.4.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
datastax#spark-cassandra-connector added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent;1.0
        confs: [default]
        found datastax#spark-cassandra-connector;2.0.6-s_2.11 in spark-packages
        found commons-beanutils#commons-beanutils;1.9.3 in local-m2-cache
        found commons-collections#commons-collections;3.2.2 in local-m2-cache
        found org.joda#joda-convert;1.2 in local-m2-cache
        found joda-time#joda-time;2.3 in local-m2-cache
        found io.netty#netty-all;4.0.33.Final in local-m2-cache
        found com.twitter#jsr166e;1.1.0 in local-m2-cache
        found org.scala-lang#scala-reflect;2.11.8 in local-m2-cache
:: resolution report :: resolve 906ms :: artifacts dl 27ms
        :: modules in use:
        com.twitter#jsr166e;1.1.0 from local-m2-cache in [default]
        commons-beanutils#commons-beanutils;1.9.3 from local-m2-cache in [default]
        commons-collections#commons-collections;3.2.2 from local-m2-cache in [default]
        datastax#spark-cassandra-connector;2.0.6-s_2.11 from spark-packages in [default]
        io.netty#netty-all;4.0.33.Final from local-m2-cache in [default]
        joda-time#joda-time;2.3 from local-m2-cache in [default]
        org.joda#joda-convert;1.2 from local-m2-cache in [default]
        org.scala-lang#scala-reflect;2.11.8 from local-m2-cache in [default]
        ---------------------------------------------------------------------
        |                  |            modules            ||   artifacts   |
        |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
        ---------------------------------------------------------------------
        |      default     |   8   |   0   |   0   |   0   ||   8   |   0   |
        ---------------------------------------------------------------------

:: problems summary ::
:::: WARNINGS
                [NOT FOUND  ] io.netty#netty-all;4.0.33.Final!netty-all.jar (2ms)

        ==== local-m2-cache: tried

          file:/home/nmj/.m2/repository/io/netty/netty-all/4.0.33.Final/netty-all-4.0.33.Final.jar

                ::::::::::::::::::::::::::::::::::::::::::::::

                ::              FAILED DOWNLOADS            ::

                :: ^ see resolution messages for details  ^ ::

                ::::::::::::::::::::::::::::::::::::::::::::::

                :: io.netty#netty-all;4.0.33.Final!netty-all.jar

                ::::::::::::::::::::::::::::::::::::::::::::::



:: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS
Exception in thread "main" java.lang.RuntimeException: [download failed: io.netty#netty-all;4.0.33.Final!netty-all.jar]
        at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1084)
        at org.apache.spark.deploy.SparkSubmit$.prepareSubmitEnvironment(SparkSubmit.scala:296)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:160)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

On the other VMs, it starts successfully.

The difference in the logs is that on the VMs where it succeeds, the dependencies are found in central rather than in local-m2-cache.
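
One way to confirm this on the failing VM (a diagnostic sketch, assuming the cache paths shown in the log above) is to check whether the local Maven cache holds only the descriptor for netty-all but not the jar; an incomplete entry like that typically makes Ivy select local-m2-cache as the resolver and then fail to fetch the artifact:

# Check what the local Maven repository actually contains for netty-all
ls -l /home/nmj/.m2/repository/io/netty/netty-all/4.0.33.Final/

# Check the corresponding entry in the Ivy cache
ls -l /home/nmj/.ivy2/cache/io.netty/netty-all/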

1 Answer:

Answer 0 (score: 1):

As @JacekLaskowski said, deleting the directories /home/nmj/.m2 and /home/nmj/.ivy2 works.
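
A minimal sketch of that fix, assuming the home directory shown in the log (removing only the broken netty-all entries is a less drastic alternative):

# Wipe the local Ivy and Maven caches so the next run re-resolves every dependency
rm -rf /home/nmj/.ivy2 /home/nmj/.m2

# Or, less invasively, drop only the incomplete netty-all artifact
rm -rf /home/nmj/.m2/repository/io/netty/netty-all/4.0.33.Final
rm -rf /home/nmj/.ivy2/cache/io.netty/netty-all

Then re-run the spark-shell command above; the dependencies should be resolved from central / spark-packages instead of local-m2-cache.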