object streaming is not a member of package org.apache.spark

Asked: 2016-04-05 08:55:35

Tags: scala apache-spark

I am trying to compile a simple Scala program that uses StreamingContext. Here is my code snippet:


I get these two errors (marked inline):

import org.apache.spark.SparkConf
import org.apache.spark.SparkContext
import org.apache.spark.scheduler.SparkListener
import org.apache.spark.scheduler.SparkListenerStageCompleted
import org.apache.spark.streaming.StreamingContext._ // error: object streaming is not a member of package org.apache.spark

object FileCount {
  def main(args: Array[String]) {
    val conf = new SparkConf()
      .setAppName("File Count")
      .setMaster("local")

    val sc = new SparkContext(conf)
    val textFile = sc.textFile(args(0))
    val ssc = new StreamingContext(sc, Seconds(10)) // error: not found: type StreamingContext
    sc.stop()
  }
}

object streaming is not a member of package org.apache.spark

Any help, please!

3 Answers:

Answer 0 (score: 1)

You need to add the spark-streaming dependency to your build configuration.

Answer 1 (score: 1)

If you are using sbt, add the following library dependency:

libraryDependencies += "org.apache.spark" %% "spark-streaming" % "2.1.0" % "provided"
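For context, a minimal build.sbt sketch in which that line would sit might look like the following. The project name and version numbers here are illustrative assumptions; the spark-streaming version should match whatever spark-core version you are compiling against:

// Hypothetical minimal build.sbt; adjust versions to your installed Spark.
name := "file-count"
scalaVersion := "2.11.8"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"      % "2.1.0" % "provided",
  "org.apache.spark" %% "spark-streaming" % "2.1.0" % "provided"
)

The "provided" scope keeps these jars out of your assembly, since a Spark cluster supplies them at runtime.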

If you are using Maven, add the following to your pom.xml:

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming_2.11</artifactId>
    <version>2.1.0</version>
    <scope>provided</scope>
</dependency>
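Once the dependency resolves, note that `import org.apache.spark.streaming.StreamingContext._` only brings in the companion object's members, not the `StreamingContext` type or `Seconds`, which is why the second error appears. A sketch of a corrected version of the question's program (trimmed to the streaming part; not a full implementation):

import org.apache.spark.SparkConf
import org.apache.spark.SparkContext
import org.apache.spark.streaming.{Seconds, StreamingContext}

object FileCount {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("File Count").setMaster("local")
    val sc = new SparkContext(conf)
    // Compiles once spark-streaming is on the classpath and the type is imported.
    val ssc = new StreamingContext(sc, Seconds(10))
    ssc.stop(stopSparkContext = true)
  }
}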

Answer 2 (score: 0)

I added the missing dependencies, and it now works for me:

"org.apache.spark" %% "spark-mllib" % SparkVersion,
"org.apache.spark" %% "spark-streaming-kafka-0-10" % "2.0.1"