java.io.InvalidClassException: local class name incompatible with stream class name "void"

Asked: 2016-07-15 14:28:08

Tags: java serialization apache-spark apache-spark-sql

When attempting to write a Spark SQL DataFrame to JDBC, I get the following exception:

  at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:111)
  at org.apache.spark.rdd.RDD.withScope(RDD.scala:316)
  at org.apache.spark.rdd.RDD.foreachPartition(RDD.scala:918)
  at org.apache.spark.sql.DataFrame$$anonfun$foreachPartition$1.apply$mcV$sp(DataFrame.scala:1444)
  at org.apache.spark.sql.DataFrame$$anonfun$foreachPartition$1.apply(DataFrame.scala:1444)
  at org.apache.spark.sql.DataFrame$$anonfun$foreachPartition$1.apply(DataFrame.scala:1444)
  at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:56)
  at org.apache.spark.sql.DataFrame.withNewExecutionId(DataFrame.scala:2086)
  at org.apache.spark.sql.DataFrame.foreachPartition(DataFrame.scala:1443)
  at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$.saveTable(JdbcUtils.scala:276)
  at org.apache.spark.sql.DataFrameWriter.jdbc(DataFrameWriter.scala:311)
  at .....
Caused by: java.io.InvalidClassException: java.lang.Void; local class name incompatible with stream class name "void"
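The message at the bottom of the trace names two distinct JVM types: the deserializer read a stream class descriptor named "void" (the primitive pseudo-type) but resolved it locally to the boxed wrapper `java.lang.Void`. A mismatch like this is usually a symptom of the bytes having been serialized by a different library or JVM version than the one deserializing them, e.g. version skew between the driver and the executors. A minimal sketch (not Spark-specific) of the two names involved:

```java
public class VoidNames {
    public static void main(String[] args) {
        // The primitive pseudo-type's Class object reports the name "void"...
        System.out.println(void.class.getName());     // void
        // ...while the boxed wrapper reports "java.lang.Void" -- the two
        // names that appear in the InvalidClassException message.
        System.out.println(Void.class.getName());     // java.lang.Void
        // Void.TYPE is the canonical alias for void.class.
        System.out.println(Void.TYPE == void.class);  // true
    }
}
```

Since the stream and the local JVM disagree on which of these two types a descriptor refers to, checking that the driver and all executors run identical Spark, Scala, and Java versions is a reasonable first step.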

This is how the DataFrame is saved:

dataFrame.write.mode(SaveMode.Append).jdbc(url, tableStatement, new Properties)

At first, this kind of exception was resolved by having the function return some value, but now it is out of our control. Has anyone run into this?

0 Answers