Exception java.lang.NoSuchMethodError: org.apache.spark.sql.SparkSession$implicits$.newSequenceEncoder

Date: 2018-03-13 09:45:35

Tags: scala apache-spark spark-dataframe nosuchmethoderror implicits

The exact message is:

Exception in thread "main" java.lang.NoSuchMethodError: org.apache.spark.sql.SparkSession$implicits$.newSequenceEncoder(Lscala/reflect/api/TypeTags$TypeTag;)Lorg/apache/spark/sql/Encoder;
        at myMainCode$.main(myMainCode.scala:55)
        at myMainCode.main(myMainCode.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:751)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
18/03/13 09:57:41 INFO SparkContext: Invoking stop() from shutdown hook

It comes from code like this:

val df = spark.read.text("<filepath>").
  map(x => UserDefinedJsonParser.parse(x.getString(0))).
  map(x => x.chained.map(y => y)).
  flatMap(x => x).
  toDF()

UserDefinedJsonParser is a custom class based on play-json that decodes a JSON string (x.getString(0)) and returns a case class whose field "chained" is a List[CustomCaseClass].

This yields a Dataset[List[CustomCaseClass]]; after the flatMap it becomes a Dataset[CustomCaseClass], and after toDF() a regular DataFrame.
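For context, here is a minimal sketch of what such a parser might look like. Only the field name chained comes from the code above; the name ParsedRecord and the fields of CustomCaseClass are hypothetical, since the question does not show the real definitions:

import play.api.libs.json.{Json, Reads}

// Hypothetical case classes: only the field "chained" is known from the question.
case class CustomCaseClass(id: Long, value: String)
case class ParsedRecord(chained: List[CustomCaseClass])

object UserDefinedJsonParser {
  implicit val ccReads: Reads[CustomCaseClass] = Json.reads[CustomCaseClass]
  implicit val recordReads: Reads[ParsedRecord] = Json.reads[ParsedRecord]

  // Decode one JSON line into the case class; parse errors are not handled here.
  def parse(raw: String): ParsedRecord = Json.parse(raw).as[ParsedRecord]
}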

Using these versions:

libraryDependencies += "org.scala-lang" % "scala-library" % "2.11.12"
libraryDependencies += "org.scala-lang" % "scala-reflect" % "2.11.12"
libraryDependencies += "com.typesafe.play" %% "play-json" % "2.4.11"
libraryDependencies += "org.cvogt" %% "play-json-extensions" % "0.6.1"

I get the error above, which seems related to the Spark session's implicit conversions.
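Concretely, the second map is the one that needs an implicit Encoder[List[CustomCaseClass]], and the compiler satisfies it with spark.implicits.newSequenceEncoder, the method named in the stack trace. A sketch of that step, reusing the hypothetical parser above:

import org.apache.spark.sql.Dataset
import spark.implicits._  // brings newSequenceEncoder and the other encoders into scope

// This map requires an Encoder[List[CustomCaseClass]]; at compile time it is
// resolved to spark.implicits.newSequenceEncoder, which is missing at runtime.
val lists: Dataset[List[CustomCaseClass]] =
  spark.read.text("<filepath>").
    map(x => UserDefinedJsonParser.parse(x.getString(0))).
    map(x => x.chained)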

The same code works like a charm in spark-shell.

EDIT:

A similar but not identical issue was found in SPARK-17890. It is close but not the same; the fix, however, is the same. Going through an RDD solves the problem:

val df = spark.read.text("<filepath>").
  map(x => UserDefinedJsonParser.parse(x.getString(0))).
  rdd.
  map(x => x.chained.map(y => y)).
  flatMap(x => x).
  toDF()
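Going through the RDD works because RDD transformations do not need Encoders at all; only the final toDF() needs an encoder for CustomCaseClass itself, which is an ordinary Product encoder rather than newSequenceEncoder. By the same reasoning, a variant that stays in the Dataset API but flatMaps directly over chained should also avoid the missing method, since flatMap only asks for an Encoder[CustomCaseClass]. A sketch, not tested against this exact setup:

// flatMap needs only an Encoder[CustomCaseClass], so an
// Encoder[List[CustomCaseClass]] (and therefore newSequenceEncoder) is never requested.
val df = spark.read.text("<filepath>").
  map(x => UserDefinedJsonParser.parse(x.getString(0))).
  flatMap(_.chained).
  toDF()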

0 answers:

No answers.