NoClassDefFoundError: scala/Product$class

Asked: 2017-06-06 10:20:34

Tags: java scala maven apache-spark

I am new to Scala and I am trying to create a mixed Scala and Java project. However, I run into a problem when executing the test code. When I run the test, I get the error

java.lang.NoClassDefFoundError: scala/Product$class

My pom.xml is as follows:

<properties>
    <scala.version>2.12.2</scala.version>
</properties>

<dependencies>
    <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-library</artifactId>
        <version>${scala.version}</version>
    </dependency>
    <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-compiler</artifactId>
        <version>${scala.version}</version>
    </dependency>
    <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-reflect</artifactId>
        <version>${scala.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming_2.11</artifactId>
        <version>2.1.1</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming-kafka-0-10_2.11</artifactId>
        <version>2.1.1</version>
    </dependency>
</dependencies>

<build>
    <plugins>
        <plugin>
            <groupId>org.scala-tools</groupId>
            <artifactId>maven-scala-plugin</artifactId>
            <version>2.15.2</version>
            <executions>
                <execution>
                    <id>compile</id>
                    <goals>
                        <goal>compile</goal>
                    </goals>
                    <phase>compile</phase>
                </execution>
                <execution>
                    <id>test-compile</id>
                    <goals>
                        <goal>testCompile</goal>
                    </goals>
                    <phase>test-compile</phase>
                </execution>
                <execution>
                    <phase>process-resources</phase>
                    <goals>
                        <goal>compile</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>
        <plugin>
            <artifactId>maven-compiler-plugin</artifactId>
            <configuration>
                <source>1.5</source>
                <target>1.5</target>
            </configuration>
        </plugin>
    </plugins>
</build>

My code is as follows:

import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka010.KafkaUtils
import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent
import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe

class BptConsumer {

  def consumeLogevent(): Unit = {
    // Local streaming context with a 5-second batch interval
    val conf = new SparkConf().setMaster("local[2]").setAppName("PVStatistics")
    val ssc = new StreamingContext(conf, Seconds(5))

    // Kafka consumer configuration
    val kafkaParams = Map[String, Object](
      "bootstrap.servers" -> "172.20.13.196:9092",
      "key.deserializer" -> classOf[StringDeserializer],
      "value.deserializer" -> classOf[StringDeserializer],
      "group.id" -> "1",
      "auto.offset.reset" -> "latest",
      "enable.auto.commit" -> (false: java.lang.Boolean)
    )

    val topics = Array("fd-blogs-tst")

    // Direct stream from Kafka
    val stream = KafkaUtils.createDirectStream[String, String](
      ssc,
      PreferConsistent,
      Subscribe[String, String](topics, kafkaParams)
    )
    /* val rdd = stream.transform(x => RDD[String]) */
    val lines = stream.map(record => (record.key, record.value))

    lines.print()
    ssc.start()
    ssc.awaitTermination()
  }
}
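
For reference, a minimal entry point to run this class could look like the sketch below (the object name BptConsumerApp is made up for illustration):

object BptConsumerApp {
  def main(args: Array[String]): Unit = {
    // Kicks off the streaming job; awaitTermination() blocks until the context is stopped.
    new BptConsumer().consumeLogevent()
  }
}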

Can someone help me find the problem?

2 Answers:

Answer 0 (score: 15):

You are using Scala 2.12.2 together with Spark libraries that were built against Scala 2.11. Change the Scala version to a 2.11 release:

<properties>
    <scala.version>2.11.11</scala.version>
</properties>
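
To keep the Scala library and the Spark artifact suffix from drifting apart again, one option (a sketch, assuming you stay on Spark 2.1.1) is to derive both from properties:

<properties>
    <scala.version>2.11.11</scala.version>
    <scala.binary.version>2.11</scala.binary.version>
</properties>

<dependency>
    <groupId>org.apache.spark</groupId>
    <!-- the suffix now follows the same property as the Scala library -->
    <artifactId>spark-streaming_${scala.binary.version}</artifactId>
    <version>2.1.1</version>
</dependency>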

Answer 1 (score: 0):

For the module dependencies, use exclusions in Maven to clean the scala-2.11 jars out of the classpath.
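
A rough sketch of such an exclusion (shown here on spark-streaming only as an example; apply it to whichever dependency drags in the unwanted Scala jars):

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming_2.11</artifactId>
    <version>2.1.1</version>
    <exclusions>
        <!-- keep the transitively pulled-in Scala library out of the classpath -->
        <exclusion>
            <groupId>org.scala-lang</groupId>
            <artifactId>scala-library</artifactId>
        </exclusion>
    </exclusions>
</dependency>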