I have data in two HBase tables and need to get the join result of the two.
What is the best way to get the join result? I tried joining the RDDs, but it gives me an error. I get the following:
object not serializable (class: org.apache.hadoop.hbase.client.Result)
import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.client.Result
import org.apache.hadoop.hbase.io.ImmutableBytesWritable
import org.apache.hadoop.hbase.mapreduce.TableInputFormat
import org.apache.hadoop.hbase.util.Bytes
import org.apache.spark.storage.StorageLevel

// Read table1 and key each row by the value of cf:col1
val hbaseConf = HBaseConfiguration.create()
hbaseConf.set("hbase.zookeeper.quorum", "localhost")
hbaseConf.set(TableInputFormat.INPUT_TABLE, "table1")
val table1RDD = sc.newAPIHadoopRDD(hbaseConf, classOf[TableInputFormat], classOf[ImmutableBytesWritable], classOf[Result]).persist(StorageLevel.MEMORY_AND_DISK)
val table1Data = table1RDD.map({ case (rowkey: ImmutableBytesWritable, values: Result) => (Bytes.toString(values.getValue(Bytes.toBytes("cf"), Bytes.toBytes("col1"))), values) }).persist(StorageLevel.MEMORY_AND_DISK)
//-------------//
// Read the interface table and key each row by the value of cf1:col1
hbaseConf.set(TableInputFormat.INPUT_TABLE, "interface")
val table2RDD = sc.newAPIHadoopRDD(hbaseConf, classOf[TableInputFormat], classOf[ImmutableBytesWritable], classOf[Result]).persist(StorageLevel.MEMORY_AND_DISK)
val table2Data = table2RDD.map({ case (rowkey: ImmutableBytesWritable, values: Result) => (Bytes.toString(values.getValue(Bytes.toBytes("cf1"), Bytes.toBytes("col1"))), values) }).persist(StorageLevel.MEMORY_AND_DISK)
table2Data.foreach { case (key: String, values: Result) => println("---> key is " + key) }
// Got the table data //
val joinedRDD = table1Data.join(table2Data).persist(StorageLevel.MEMORY_AND_DISK)
joinedRDD.foreach { case (key: String, results: (Result, Result)) =>
  println(" key is " + key)
  println(" value is " + results)
}
Stack trace:
16/02/09 11:21:21 ERROR TaskSetManager: Task 0.0 in stage 6.0 (TID 6) had a not serializable result: org.apache.hadoop.hbase.client.Result
Serialization stack:
- object not serializable (class: org.apache.hadoop.hbase.client.Result, value: keyvalues={
<My Data>
}); not retrying
16/02/09 11:21:21 INFO TaskSchedulerImpl: Removed TaskSet 6.0, whose tasks have all completed, from pool
16/02/09 11:21:21 INFO DAGScheduler: Job 5 failed: foreach at LoopBacks.scala:92, took 0.103408 s
Exception in thread "main" org.apache.spark.SparkException: Job aborted due to stage failure: Task 0.0 in stage 5.0 (TID 5) had a not serializable result: org.apache.hadoop.hbase.client.Result
Serialization stack:
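One workaround I am aware of is to pull the columns I need out of each Result up front, so that only plain Strings go through the join's shuffle. A rough sketch (the column names here are just placeholders for my real ones):

// Project each Result down to the single string column we need,
// so the shuffled records are (String, String) pairs, which serialize fine.
val table1Flat = table1Data.mapValues(r => Bytes.toString(r.getValue(Bytes.toBytes("cf"), Bytes.toBytes("col2"))))
val table2Flat = table2Data.mapValues(r => Bytes.toString(r.getValue(Bytes.toBytes("cf1"), Bytes.toBytes("col2"))))
val joinedFlat = table1Flat.join(table2Flat) // RDD[(String, (String, String))]

But I would rather keep the full Result objects if possible.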
Answer 0 (score: 2)
I solved this problem with Spark's Kryo serialization. The join shuffles Result objects between tasks, and org.apache.hadoop.hbase.client.Result does not implement java.io.Serializable, so the default Java serializer rejects it; Kryo does not require the class to be Serializable.
I added the following code:
conf.set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
conf.registerKryoClasses(Array(classOf[org.apache.hadoop.hbase.client.Result]))
That solved the problem.
This is also the solution to some other, similar problems.
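For completeness, here is a minimal sketch of where those two lines live; the app name is arbitrary, and `conf` is the SparkConf you build before creating the SparkContext:

import org.apache.spark.{SparkConf, SparkContext}

// Kryo must be configured before the SparkContext is created;
// the serializer cannot be changed on a running context.
val conf = new SparkConf()
  .setAppName("HBaseJoin")
  .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
  // Register Result so Kryo handles it efficiently; unlike Java
  // serialization, Kryo does not need java.io.Serializable.
  .registerKryoClasses(Array(classOf[org.apache.hadoop.hbase.client.Result]))

val sc = new SparkContext(conf)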