Error inserting a Spark DataFrame into HBase

Date: 2017-07-10 14:48:57

Tags: apache-spark dataframe hbase apache-spark-sql

I have a DataFrame with this schema:

 |-- Name1: string (nullable = true)
 |-- Name2: string (nullable = true)
 |-- App: string (nullable = true)
...
 |-- Duration: float (nullable = false)

I want to insert it into an HBase table. I consulted many references and defined the catalog like this:

def catalog = s"""{
       |"table":{"namespace":"default", "name":"otarie"},
       |"rowkey":"key",
       |"columns":{
         |"col0":{"cf":"rowkey", "col":"key", "type":"string"},
         |"col1":{"cf":"cf1", "col":"Name1", "type":"boolean"},
         |"col2":{"cf":"cf2", "col":"Name2", "type":"double"},
         |"col3":{"cf":"cf3", "col":"App", "type":"float"},
            ........
         |"co27":{"cf":"cf27", "col":"Duration", "type":"string"}
       |}
     |}""".stripMargin

Then I tried to write my DataFrame:

Append_Ot.write
  .options(Map(HBaseTableCatalog.tableCatalog -> catalog, HBaseTableCatalog.newTable -> "5"))
  .format("org.apache.hadoop.hbase.spark")
  .save()

I am running this in spark-shell, and I get this error:

 <console>:155: error: not found: value HBaseTableCatalog
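The error means the `HBaseTableCatalog` symbol is not on the classpath, so the connector classes were likely never loaded or imported. A minimal sketch of the probable fix, assuming the Hortonworks SHC connector (the artifact coordinates and repository URL below are illustrative and depend on your Spark/HBase versions):

```scala
// Launch spark-shell with the connector on the classpath, e.g.:
//   spark-shell --packages com.hortonworks:shc-core:1.1.1-2.1-s_2.11 \
//     --repositories http://repo.hortonworks.com/content/groups/public/
// Then import the catalog class before using it:
import org.apache.spark.sql.execution.datasources.hbase.HBaseTableCatalog
```

Note that with SHC the `format` string is usually `"org.apache.spark.sql.execution.datasources.hbase"` rather than `"org.apache.hadoop.hbase.spark"`; the latter belongs to the separate hbase-spark (Apache HBase connector) module, which ships its own classes. Also double-check that the `type` fields in the catalog match the DataFrame schema (e.g. `Name1` is a `string` in the schema but declared `boolean` in the catalog), since a mismatch will fail at write time even once the import resolves.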

0 Answers