Spark HBase problem

Date: 2017-10-05 11:29:10

Tags: apache-spark hbase

My Spark job stops with the following error and only starts successfully after several attempts:

Caused by: java.io.InvalidClassException: org.apache.hadoop.hbase.spark.HBaseContext; local class incompatible: stream classdesc serialVersionUID = -5686505108908438419, local class serialVersionUID = -6879194698097628128
        at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:616)
        at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1843)
        at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1713)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2000)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1535)
        at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2245)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2169)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2027)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1535)
        at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2245)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2169)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2027)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1535)

Can someone please help?
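For reference, a minimal sketch of how an HBaseContext is typically constructed and shipped to executors; only the org.apache.hadoop.hbase.spark.HBaseContext class comes from the stack trace above, while the app name and the usage comments are assumptions:

    import org.apache.hadoop.conf.Configuration
    import org.apache.hadoop.hbase.HBaseConfiguration
    import org.apache.hadoop.hbase.spark.HBaseContext
    import org.apache.spark.{SparkConf, SparkContext}

    object HBaseContextSketch {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("hbase-context-sketch"))

        // HBaseConfiguration.create() picks up hbase-site.xml from the classpath.
        val hbaseConf: Configuration = HBaseConfiguration.create()

        // HBaseContext is Serializable: the driver serializes it and the executors
        // deserialize it. If the hbase-spark jar on the executors is a different
        // build than the one the driver was launched with, deserialization fails
        // with exactly the InvalidClassException / serialVersionUID mismatch
        // shown in the stack trace above.
        val hbaseContext = new HBaseContext(sc, hbaseConf)

        // ... use hbaseContext (e.g. hbaseRDD, bulkGet, bulkPut) here ...

        sc.stop()
      }
    }

The serialVersionUID check is Java serialization's guard against exactly this kind of version skew, so a likely cause is mismatched hbase-spark jars between the driver and executor classpaths (for example, a jar bundled with the application versus a different copy installed on the cluster nodes).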

0 Answers:

No answers yet.