Unable to read a Kinesis stream from Spark Streaming

Date: 2018-03-05 21:10:14

Tags: apache-spark spark-streaming amazon-kinesis

import org.apache.spark.SparkConf
import org.apache.spark.storage.StorageLevel
import org.apache.spark.streaming.Milliseconds
import org.apache.spark.streaming.StreamingContext
import org.apache.spark.streaming.dstream.DStream.toPairDStreamFunctions

import com.amazonaws.auth.AWSCredentials
import com.amazonaws.auth.DefaultAWSCredentialsProviderChain
import com.amazonaws.auth.SystemPropertiesCredentialsProvider
import com.amazonaws.services.kinesis.AmazonKinesisClient
import com.amazonaws.services.kinesis.clientlibrary.lib.worker.InitialPositionInStream
import org.apache.spark.streaming.kinesis.KinesisInputDStream
import org.apache.spark.streaming.kinesis.KinesisInitialPositions.Latest
import org.apache.spark.streaming.kinesis.KinesisInitialPositions.TrimHorizon
import java.util.Date



    // ssc, streamName, endpointUrl, regionName, appName and
    // kinesisCheckpointInterval are defined earlier in the application.
    val tStream = KinesisInputDStream.builder
      .streamingContext(ssc)
      .streamName(streamName)
      .endpointUrl(endpointUrl)
      .regionName(regionName)
      .initialPosition(new TrimHorizon())
      .checkpointAppName(appName)
      .checkpointInterval(kinesisCheckpointInterval)
      .storageLevel(StorageLevel.MEMORY_AND_DISK_2)
      .build()

    tStream.foreachRDD { rdd =>
      if (rdd.count() > 0) rdd.saveAsTextFile("/user/hdfs/test/")
      else println("No record to read")
    }
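For context, a minimal self-contained sketch of the setup the snippet above assumes is shown below. The stream name, endpoint, region, and app name here are placeholders, not values from the original post. Note that foreachRDD only registers an output action; no receiver is launched and no batch runs until StreamingContext.start() is called, which matches the "no records" symptom if it is missing:

    import org.apache.spark.SparkConf
    import org.apache.spark.storage.StorageLevel
    import org.apache.spark.streaming.{Milliseconds, StreamingContext}
    import org.apache.spark.streaming.kinesis.KinesisInputDStream
    import org.apache.spark.streaming.kinesis.KinesisInitialPositions.TrimHorizon

    object KinesisReadSketch {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf().setAppName("kinesis-test")
        val ssc  = new StreamingContext(conf, Milliseconds(2000))

        val stream = KinesisInputDStream.builder
          .streamingContext(ssc)
          .streamName("my-stream")                                  // placeholder
          .endpointUrl("https://kinesis.us-east-1.amazonaws.com")   // placeholder
          .regionName("us-east-1")                                  // placeholder
          .initialPosition(new TrimHorizon())
          .checkpointAppName("kinesis-test-app")  // also the DynamoDB lease table name
          .checkpointInterval(Milliseconds(2000))
          .storageLevel(StorageLevel.MEMORY_AND_DISK_2)
          .build()

        stream.foreachRDD { rdd =>
          if (!rdd.isEmpty()) rdd.saveAsTextFile("/user/hdfs/test/")
          else println("No record to read")
        }

        // Without these two calls nothing is ever consumed from Kinesis.
        ssc.start()
        ssc.awaitTermination()
      }
    }

This is a sketch, not the poster's actual application; it assumes the spark-streaming-kinesis-asl artifact is on the classpath and that AWS credentials are resolvable via the default provider chain.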

Even though I can see data flowing into the stream, my Spark job above does not receive any records. I am sure I am connecting to the correct stream with valid credentials. Please help.

0 Answers:

No answers yet.