Getting an error using the from_json function in Spark Structured Streaming

Asked: 2018-04-03 18:02:42

Tags: scala apache-spark spark-structured-streaming

I am reading data from Kafka, and my Spark code contains the following:

val hiveDf = parsedDf
    .select(from_json(col("value"), schema).as("value"))
    .selectExpr("value.*")

When I run it from IntelliJ it works, but when I run it as a jar it throws the following error:

Exception in thread "main" java.lang.NoSuchMethodError: org.apache.spark.sql.functions$.from_json(Lorg/apache/spark/sql/Column;Lorg/apache/spark/sql/types/StructType;)Lorg/apache/spark/sql/Column;

My spark-submit command looks like this:

C:\spark>.\bin\spark-submit --jars C:\Users\namaagarwal\Desktop\Spark_FI\spark-sql-kafka-0-10_2.11-2.1.0.cloudera1.jar --class ClickStream C:\Users\namaagarwal\Desktop\Spark_FI\SparkStreamingFI\target\scala-2.11\sparkstreamingfi_2.11-0.1.jar

0 Answers:

There are no answers yet.