Spark job aborted when trying to join two tuples with a GenericRecord as the value

Asked: 2017-06-21 15:52:54

Tags: scala apache-spark left-join spark-streaming

I am trying to do a left outer join on two DStreams, both of type [K: String, V: GenericData.Record], but I am getting the following error:

User class threw exception: org.apache.spark.SparkException: Job aborted due to stage failure: Task 2.0 in stage 0.0 (TID 2) had a not serializable result: org.apache.avro.generic.GenericData$Record
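For context, a minimal sketch of the kind of pipeline being described (the object and parameter names are hypothetical; the original post does not include its code):

```scala
import org.apache.avro.generic.GenericData
import org.apache.spark.streaming.dstream.DStream

// Hypothetical sketch of the join described in the question.
object JoinSketch {
  def joinStreams(
      left: DStream[(String, GenericData.Record)],
      right: DStream[(String, GenericData.Record)]
  ): DStream[(String, (GenericData.Record, Option[GenericData.Record]))] = {
    // leftOuterJoin shuffles values across executors. GenericData.Record
    // does not implement java.io.Serializable, which is consistent with the
    // "had a not serializable result" failure reported above when Spark's
    // default Java serialization is used.
    left.leftOuterJoin(right)
  }
}
```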

0 Answers