Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.HBaseConfiguration

Asked: 2017-10-10 10:16:21

Tags: scala hadoop apache-spark hue

I am trying to create a table on HBase (on a specific cluster) with the following code:

import org.apache.hadoop.hbase.client.{HTable, Put, HBaseAdmin}
import org.apache.hadoop.hbase.util.Bytes
import org.apache.hadoop.hbase.mapreduce.TableInputFormat
import org.apache.hadoop.hbase.{HBaseConfiguration, HTableDescriptor, HColumnDescriptor}

object ImportData {
  val cf = "d"

  // create a table (optionally dropping an existing one) and return a handle to it
  def createTable(tableName: String, nameSpace: String, reset: Boolean): HTable = {
    val fullName = nameSpace + ":" + tableName

    // initialize configuration and admin
    val hbaseConfig = HBaseConfiguration.create()
    val hbaseAdmin = new HBaseAdmin(hbaseConfig)

    // check if the table exists
    if (!hbaseAdmin.isTableAvailable(fullName) || reset) {
      if (hbaseAdmin.isTableAvailable(fullName) && reset) { // force-delete the table
        hbaseAdmin.disableTable(fullName)
        hbaseAdmin.deleteTable(fullName)
      }

      val tableDesc = new HTableDescriptor(fullName.getBytes())
      val tableFamily = new HColumnDescriptor(cf)

      // add the column family to the table descriptor
      tableDesc.addFamily(tableFamily)
      // create the table
      hbaseAdmin.createTable(tableDesc)
    }

    hbaseConfig.set(TableInputFormat.INPUT_TABLE, fullName)
    val table = new HTable(hbaseConfig, fullName)
    println(">>> Table " + fullName + " created on HBase")
    table
  }

  // put data into the table as a single row
  def writeToTable(table: HTable, columnNames: List[String], values: List[String]): Unit = {
    val put = new Put(Bytes.toBytes("row1"))
    for (i <- columnNames.indices) {
      put.add(Bytes.toBytes(cf), Bytes.toBytes(columnNames(i)), Bytes.toBytes(values(i)))
    }

    table.put(put)
    table.close()
  }
}
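
For context, I call these helpers roughly like this (the table and namespace names here are placeholders, not my real ones):

val table = ImportData.createTable("my_table", "my_namespace", reset = false)
ImportData.writeToTable(table, List("col1", "col2"), List("value1", "value2"))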

I run it with Spark on a HUE server, but I get the following error:

17/10/10 12:04:34 ERROR ApplicationMaster: User class threw exception: java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/HBaseConfiguration
java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/HBaseConfiguration
    at com.renault.fic_histo.parsing.ImportData$.createTable(ImportData.scala:13)
    at com.renault.fic_histo.parsing.Global_Main.save_fic_histo(Global_Main.scala:32)
    at com.renault.fic_histo.parsing.Global_Main$.main(Global_Main.scala:47)
    at com.renault.fic_histo.parsing.Global_Main.main(Global_Main.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:559)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.HBaseConfiguration
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)

I read that I have to add the HBase classpath in the hadoop-env.sh file, like this:

$ export HADOOP_CLASSPATH=$HBASE_HOME/hbase-0.94.22.jar:\
    $HBASE_HOME/hbase-0.94.22-test.jar:\
    $HBASE_HOME/conf:\
    ${HBASE_HOME}/lib/zookeeper-3.4.5.jar:\
    ${HBASE_HOME}/lib/protobuf-java-2.4.0a.jar:\
    ${HBASE_HOME}/lib/guava-11.0.2.jar

Here are my questions:

1. I am not running this locally, so I cannot change that configuration. What can I do to solve this problem?
2. Should I connect to or specify the HBase cluster in my code? Can anyone point me to a good tutorial? My current guess is sketched below.
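
For question 2, I suspect I need something like the following — a minimal sketch using the standard HBase client properties; the ZooKeeper host names are hypothetical and would have to be replaced with the cluster's actual nodes:

import org.apache.hadoop.hbase.HBaseConfiguration

// hypothetical quorum hosts — replace with the cluster's real ZooKeeper nodes
val conf = HBaseConfiguration.create()
conf.set("hbase.zookeeper.quorum", "zk1.example.com,zk2.example.com,zk3.example.com")
conf.set("hbase.zookeeper.property.clientPort", "2181")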

Thanks

1 Answer:

Answer 0 (score: 0)

hbase-common.jar contains the class definition of org.apache.hadoop.hbase.HBaseConfiguration, so you need to include hbase-common.jar in your HADOOP_CLASSPATH.
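
Since you are submitting the job through Spark rather than editing hadoop-env.sh yourself, one option is to ship the HBase jars with the job via spark-submit's --jars flag. A sketch — the jar paths are illustrative and depend on where HBase is installed on your cluster:

$ spark-submit \
    --class com.renault.fic_histo.parsing.Global_Main \
    --master yarn \
    --jars /opt/hbase/lib/hbase-common.jar,/opt/hbase/lib/hbase-client.jar \
    your-application.jar

This puts the jars on the driver and executor classpaths at submit time, so no cluster-wide configuration change is needed.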