Why does "java.lang.ClassNotFoundException: Failed to find data source: kinesis" occur despite the spark-streaming-kinesis-asl dependency?

Asked: 2018-11-29 08:09:01

Tags: scala, apache-spark, amazon-kinesis, spark-structured-streaming

My setup:

  scala: 2.11.8
  spark: 2.3.0.cloudera4

I have added this to my pom.xml:

<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-streaming-kinesis-asl_2.11</artifactId>
  <version>2.3.0</version>
</dependency>

However, when I run my Spark Streaming code to consume data from Kinesis, it throws:

Exception in thread "main" java.lang.ClassNotFoundException: Failed to find data source: kinesis.

I ran into a similar error when consuming data from Kafka, and solved it by specifying the relevant jars in the submit command. But this time that approach does not seem to work:

sudo -u hdfs spark2-submit --packages org.apache.spark:spark-streaming-kinesis-asl_2.11:2.3.0 --class com.package.newkinesis --master yarn  sparktest-1.0-SNAPSHOT.jar 

How can I fix this? Any help is appreciated.

My code:

import org.apache.spark.sql.SparkSession

val spark = SparkSession
  .builder.master("local[4]")
  .appName("SpeedTester")
  .config("spark.driver.memory", "3g")
  .getOrCreate()

val kinesis = spark.readStream
  .format("kinesis")
  .option("streamName", kinesisStreamName)
  .option("endpointUrl", kinesisEndpointUrl)
  .option("initialPosition", "TRIM_HORIZON")
  .option("awsAccessKey", awsAccessKeyId)
  .option("awsSecretKey", awsSecretKey)
  .load()

kinesis.writeStream.format("console").start().awaitTermination()

My full pom.xml:

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.netease</groupId>
  <artifactId>sparktest</artifactId>
  <version>1.0-SNAPSHOT</version>
  <inceptionYear>2008</inceptionYear>
  <properties>
    <scala.version>2.11.8</scala.version>
  </properties>
    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-shade-plugin</artifactId>
                <version>3.2.1</version>
                <executions>
                    <execution>
                        <goals>
                            <goal>shade</goal>
                        </goals>
                        <configuration>
                            <includes>
                                <include>org/apache/spark/*</include>
                            </includes>
                        </configuration>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>

  <dependencies>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.11</artifactId>
      <version>2.3.0</version>
      <scope>provided</scope>
    </dependency>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-streaming_2.11</artifactId>
      <version>2.3.0</version>
      <scope>provided</scope>
    </dependency>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-sql_2.11</artifactId>
      <version>2.3.0</version>
      <scope>provided</scope>
    </dependency>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-streaming-kafka-0-10_2.11</artifactId>
      <version>2.3.0</version>
    </dependency>
    <dependency>
      <groupId>org.apache.kafka</groupId>
      <artifactId>kafka-clients</artifactId>
      <version>2.1.0</version>
    </dependency>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-streaming-kinesis-asl_2.11</artifactId>
      <version>2.3.0</version>
    </dependency>
  </dependencies>
</project>

1 Answer:

Answer 0 (score: 0)

tl;dr It is not going to work.

You are using a dependency for the old Spark Streaming (DStream-based) API together with the new Spark Structured Streaming API, hence the exception. The spark-streaming-kinesis-asl module only provides a DStream source; it does not register a "kinesis" data source for spark.readStream.
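For reference, what spark-streaming-kinesis-asl actually supports is the DStream-based API. A minimal sketch of using it from the existing SparkSession follows; the region name, batch interval, and checkpoint app name are illustrative assumptions, and the exact builder method names (e.g. KinesisInitialPositions) are as of Spark 2.3 and may differ slightly in other versions:

```scala
import org.apache.spark.storage.StorageLevel
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kinesis.{KinesisInitialPositions, KinesisInputDStream}

// Reuse the SparkContext from the existing SparkSession.
val ssc = new StreamingContext(spark.sparkContext, Seconds(10))

// DStream-based Kinesis source -- this is what the ASL module provides.
val stream = KinesisInputDStream.builder
  .streamingContext(ssc)
  .streamName(kinesisStreamName)
  .endpointUrl(kinesisEndpointUrl)
  .regionName("us-east-1")                                  // assumption: set your region
  .initialPosition(new KinesisInitialPositions.TrimHorizon)
  .checkpointAppName("SpeedTester")                         // DynamoDB table used for KCL checkpoints
  .checkpointInterval(Seconds(10))
  .storageLevel(StorageLevel.MEMORY_AND_DISK_2)
  .build()

// Each record arrives as raw bytes.
stream.map(bytes => new String(bytes)).print()

ssc.start()
ssc.awaitTermination()
```

Note this gives you a DStream, not a streaming DataFrame, so the writeStream/console sink from the question does not apply here.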

You have to find a Kinesis data source for Spark Structured Streaming that is compatible with your Spark version; the Apache Spark project does not officially support one.
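One such third-party connector is Qubole's kinesis-sql project, which registers a "kinesis" format for Structured Streaming. As a sketch only: the exact artifact version must be matched to your Spark release (the one below is an assumption), and its option names (e.g. awsAccessKeyId vs. awsAccessKey) may differ from the ones used in the question, so check the connector's README:

```xml
<dependency>
  <groupId>com.qubole.spark</groupId>
  <artifactId>spark-sql-kinesis_2.11</artifactId>
  <!-- assumption: pick the release built against Spark 2.3.x -->
  <version>1.1.3-spark_2.3</version>
</dependency>
```

With such a connector on the classpath (bundled into the fat jar or passed via --packages), spark.readStream.format("kinesis") can resolve the data source.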