Spark Streaming HBase error

Time: 2017-04-20 13:29:54

Tags: apache-spark hbase spark-streaming

I want to insert streaming data into HBase; here is my code:

val tableName = "streamingz"
val conf = HBaseConfiguration.create()
conf.addResource(new Path("file:///opt/cloudera/parcels/CDH-5.4.7-1.cdh5.4.7.p0.3/etc/hbase/conf.dist/hbase-site.xml"))
conf.set(TableInputFormat.INPUT_TABLE, tableName)

val admin = new HBaseAdmin(conf)
if (!admin.isTableAvailable(tableName)) {
    print("-----------------------------------------------------------------------------------------------------------")
    val tableDesc = new HTableDescriptor(tableName)
    tableDesc.addFamily(new HColumnDescriptor("z1".getBytes()))
    tableDesc.addFamily(new HColumnDescriptor("z2".getBytes()))
    admin.createTable(tableDesc)
} else {
    print("Table already exists!!--------------------------------------------------------------------------------------")
}
val ssc = new StreamingContext(sc, Seconds(10))
val topicSet = Set("fluxAstellia")
val kafkaParams = Map[String, String]("metadata.broker.list" -> "10.32.201.90:9092")
val stream = KafkaUtils.createDirectStream[String, String, StringDecoder, StringDecoder](ssc, kafkaParams, topicSet)
val lines = stream.map(_._2).map(_.split(" ", -1)).foreachRDD(rdd => {
    if (!rdd.partitions.isEmpty) {
        val myTable = new HTable(conf, tableName)
        rdd.map(rec => {
            var put = new Put(rec._1.getBytes)
            put.add("z1".getBytes(), "name".getBytes(), Bytes.toBytes(rec._2))
            myTable.put(put)
        }).saveAsNewAPIHadoopDataset(conf)
        myTable.flushCommits()
    } else {
        println("rdd is empty")
    }

})


ssc.start()
ssc.awaitTermination()

}
}
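
For reference, the snippet above omits its imports. With the CDH 5.4-era HBase and Spark Streaming Kafka APIs it uses, they would be roughly the following (an assumption, since the question does not show them):

import kafka.serializer.StringDecoder
import org.apache.hadoop.fs.Path
import org.apache.hadoop.hbase.{HBaseConfiguration, HColumnDescriptor, HTableDescriptor}
import org.apache.hadoop.hbase.client.{HBaseAdmin, HTable, Put}
import org.apache.hadoop.hbase.mapreduce.TableInputFormat
import org.apache.hadoop.hbase.util.Bytes
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka.KafkaUtils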

I got this error:

:66: error: value _1 is not a member of Array[String]
       var put = new Put(rec._1.getBytes)

I am a beginner, so how can I fix this error? I also have a question:

Where exactly should the table be created: outside the streaming process or inside it?

Thanks

1 Answer:

Answer 0: (score: 0)

Your error is essentially on the line var put = new Put(rec._1.getBytes). You can only call _n (_1 for the key, _2 for the value) on a Map entry or a tuple. rec is an Array[String], produced by splitting the strings from the stream on the space character. If you want the first element, write it as var put = new Put(rec(0).getBytes). Likewise, on the next line, write it as put.add("z1".getBytes(), "name".getBytes(), Bytes.toBytes(rec(1))).
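
Putting that fix into the write loop, a minimal sketch could look like the following. It keeps the streamingz table name, the z1 column family, and the two-field record layout from the question, but goes slightly beyond the answer by using foreachPartition and creating the HBase configuration and table on the executors, since HTable and the Hadoop Configuration are not serializable and cannot be captured in the rdd.map closure; the direct puts also replace the saveAsNewAPIHadoopDataset(conf) call:

stream.map(_._2).map(_.split(" ", -1)).foreachRDD { rdd =>
    if (!rdd.isEmpty()) {
        rdd.foreachPartition { records =>
            // build the configuration and table on the executor; neither is serializable,
            // so they cannot be shipped from the driver inside the closure
            val conf = HBaseConfiguration.create() // picks up hbase-site.xml from the classpath
            val myTable = new HTable(conf, "streamingz")
            records.foreach { rec =>
                // rec is an Array[String]: index it with rec(0)/rec(1), not rec._1/rec._2
                val put = new Put(rec(0).getBytes)
                put.add("z1".getBytes, "name".getBytes, Bytes.toBytes(rec(1)))
                myTable.put(put)
            }
            myTable.flushCommits()
            myTable.close()
        }
    } else {
        println("rdd is empty")
    }
}

If the cluster's hbase-site.xml is not on the executor classpath, the addResource call from the question would need to be repeated inside foreachPartition as well.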