I am trying to run a netcat (nc) word count program in Spark using SBT, and I am getting the errors below. My Spark version is 1.6.3 and my Scala version is 2.10.0.
[warn] Found version conflict(s) in library dependencies; some are suspected to be binary incompatible:
[warn] * commons-net:commons-net:2.2 is selected over 3.1
[warn] +- org.apache.spark:spark-core_2.10:1.6.3 (depends on 2.2)
[warn] +- org.apache.hadoop:hadoop-common:2.2.0 (depends on 3.1)
[warn] * com.google.guava:guava:14.0.1 is selected over 11.0.2
[warn] +- org.apache.curator:curator-recipes:2.4.0 (depends on 14.0.1)
[warn] +- org.tachyonproject:tachyon-client:0.8.2 (depends on 14.0.1)
[warn] +- org.apache.curator:curator-client:2.4.0 (depends on 14.0.1)
[warn] +- org.tachyonproject:tachyon-underfs-hdfs:0.8.2 (depends on 14.0.1)
[warn] +- org.apache.curator:curator-framework:2.4.0 (depends on 14.0.1)
[warn] +- org.tachyonproject:tachyon-underfs-s3:0.8.2 (depends on 14.0.1)
[warn] +- org.tachyonproject:tachyon-underfs-local:0.8.2 (depends on 14.0.1)
[warn] +- org.apache.hadoop:hadoop-hdfs:2.2.0 (depends on 11.0.2)
[warn] +- org.apache.hadoop:hadoop-common:2.2.0 (depends on 11.0.2)
[warn] * com.google.code.findbugs:jsr305:1.3.9 is selected over 2.0.1
[warn] +- com.google.guava:guava:11.0.2 (depends on 1.3.9)
[warn] +- org.apache.spark:spark-core_2.10:1.6.3 (depends on 1.3.9)
[warn] +- org.apache.spark:spark-unsafe_2.10:1.6.3 (depends on 1.3.9)
[warn] +- org.apache.spark:spark-network-common_2.10:1.6.3 (depends on 1.3.9)
[warn] +- com.fasterxml.jackson.module:jackson-module-scala_2.10:2.4.4 (depends on 2.0.1)
[warn] Run 'evicted' to see detailed eviction warnings
[info] Compiling 1 Scala source to /home/training/Desktop/SBT/sbt/bin/sparknc/target/scala-2.10/classes ...
[error] /home/training/Desktop/SBT/sbt/bin/sparknc/src/main/scala/sparkstreaming.scala:2:8: object StreamingContext is not a member of package org.apache.spark
[error] import org.apache.spark.StreamingContext
[error] ^
[error] /home/training/Desktop/SBT/sbt/bin/sparknc/src/main/scala/sparkstreaming.scala:6:56: value setApplication is not a member of org.apache.spark.SparkConf
[error] val mysparkconf= new SparkConf().setMaster("local[2]").setApplication("My networking application")
[error] ^
[error] /home/training/Desktop/SBT/sbt/bin/sparknc/src/main/scala/sparkstreaming.scala:7:27: not found: type StreamingContext
[error] val streamingcontext= new StreamingContext(mysparkconf, seconds(2))
[error] ^
[error] three errors found
[error] (Compile / compileIncremental) Compilation failed
Answer 0 (score: 0):
You need to add the exact dependency for Spark Streaming to your build.sbt file.
Set the Scala version to 2.10.5.
libraryDependencies += "org.apache.spark" %% "spark-streaming" % "1.6.3" % "provided"
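For context, a complete build.sbt for this project might look like the sketch below; the project name is an assumption, and the "provided" scope follows the line above. If you run the job locally with sbt run, you may want to drop "provided" so the Spark jars are on the runtime classpath.

// build.sbt — a minimal sketch for a Spark 1.6.3 / Scala 2.10 project (project name is hypothetical)
name := "sparknc"

version := "0.1"

scalaVersion := "2.10.5"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"      % "1.6.3" % "provided",
  "org.apache.spark" %% "spark-streaming" % "1.6.3" % "provided"
)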
It is also recommended to move to the latest Spark, 2.3.1, which has new features and resolves most of these dependency conflicts.
Adding a link to the documentation that covers this: spark docs
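Beyond the missing dependency, the compile errors in the question also point to two fixes in the source itself: SparkConf has setAppName rather than setApplication, and the batch interval is Seconds(2) (capital S) from org.apache.spark.streaming. A minimal sketch of a corrected file is below; the object name, host, and port (localhost:9999, as used with nc -lk 9999) are assumptions.

// sparkstreaming.scala — minimal netcat word count sketch (host and port are assumed)
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object sparkstreaming {
  def main(args: Array[String]): Unit = {
    // setAppName, not setApplication, sets the application name on SparkConf
    val mysparkconf = new SparkConf().setMaster("local[2]").setAppName("My networking application")
    // Seconds(2) comes from org.apache.spark.streaming
    val streamingcontext = new StreamingContext(mysparkconf, Seconds(2))

    // Read lines from a netcat server started with: nc -lk 9999
    val lines = streamingcontext.socketTextStream("localhost", 9999)
    val wordCounts = lines.flatMap(_.split(" ")).map(word => (word, 1)).reduceByKey(_ + _)
    wordCounts.print()

    streamingcontext.start()
    streamingcontext.awaitTermination()
  }
}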