NoSuchMethodError HTableDescriptor.addFamily

Posted: 2015-06-24 17:01:05

Tags: exception hadoop hbase

I have installed hadoop 2.5.2 and hbase 1.0.1.1 (compatible with each other). In my Hadoop code I am trying to add a column family to an HBase table.

My code is:

Configuration hbaseConfiguration = HBaseConfiguration.create();
Job hbaseImportJob = new Job(hbaseConfiguration, "FileToHBase");

HBaseAdmin hbaseAdmin = new HBaseAdmin(hbaseConfiguration);

if (!hbaseAdmin.tableExists(Config_values.tableName)) {
    // use the same name for the existence check and the created table
    TableName tableName1 = TableName.valueOf(Config_values.tableName);
    HTableDescriptor hTableDescriptor = new HTableDescriptor(tableName1);
    HColumnDescriptor hColumnDescriptor1 = new HColumnDescriptor("columnFamily1");
    hTableDescriptor.addFamily(hColumnDescriptor1);
    hbaseAdmin.createTable(hTableDescriptor);
}

I am getting this error:

    Exception in thread "main" java.lang.NoSuchMethodError: org.apache.hadoop.hbase.HTableDescriptor.addFamily(Lorg/apache/hadoop/hbase/HColumnDescriptor;)V
        at com.atm_ex.atm_ex.Profiles.profiles(Profiles.java:177)
        at com.atm_ex.atm_ex.App.main(App.java:28)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:212)

3 Answers:

Answer 0 (score: 1)

To be safe, you should compile and run your jar against the same version of HBase. When you run mvn clean package -DskipTests=true, make sure the HBase dependency in your POM matches the HBase on your CDH cluster: not necessarily the same version string, but one whose classes expose the same method signatures, since CDH may not follow the Apache original exactly. You can also try the CDH-supported POM coordinates from Cloudera's Maven repository:

    <name>c-cdh-maven-dep</name>
    <!-- you need to try both -->
    <url>https://repository.cloudera.com/artifactory/cloudera-repos/</url>
    <!-- I have tried this and it works well -->
    <!-- <url>http://maven.apache.org</url> -->

    <properties>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    </properties>


    <!-- <repositories> <repository> <id>cloudera</id> <url>https://repository.cloudera.com/artifactory/cloudera-repos/</url> 
        </repository> </repositories> -->

    <dependencies>
        <!-- <dependency> <groupId>junit</groupId> <artifactId>junit</artifactId> 
            <version>3.8.1</version> <scope>test</scope> </dependency> -->

        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-common</artifactId>
            <version>2.6.0-cdh5.7.0</version>
        </dependency>

    <!--    <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-hdfs</artifactId>
            <version>2.6.0-cdh5.7.0</version>
        </dependency> -->
<!--        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-maven-plugins</artifactId>
            <version>2.6.0-cdh5.7.0</version>
        </dependency> -->
        <dependency>
            <groupId>org.apache.hbase</groupId>
            <artifactId>hbase-client</artifactId>
            <version>1.2.0-cdh5.7.0</version>
        </dependency>
<!--        <dependency>
            <groupId>org.apache.hbase</groupId>
            <artifactId>hbase-client</artifactId>
            <version>1.2.0</version>
        </dependency> -->


        <dependency>
            <groupId>org.apache.hbase</groupId>
            <artifactId>hbase-hadoop2-compat</artifactId>
            <version>1.2.0</version>
        </dependency>

        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.10</artifactId>
            <version>1.5.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.10</artifactId>
            <version>1.5.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-hive_2.10</artifactId>
            <version>1.5.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-streaming_2.10</artifactId>
            <version>1.5.1</version>
        </dependency>

        <!-- hadoop dependency start -->
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-client</artifactId>
            <version>2.6.0</version>
        </dependency>
        <!-- Hadoop dep end -->

        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-streaming-kafka_2.10</artifactId>
            <version>1.5.1</version>
        </dependency>
        <!-- spark dep end -->

        <dependency>
            <groupId>org.clojure</groupId>
            <artifactId>clojure</artifactId>
            <version>1.6.0</version>
        </dependency>
        <dependency>
            <groupId>com.google.guava</groupId>
            <artifactId>guava</artifactId>
            <version>11.0.2</version>
        </dependency>

        <dependency>
            <groupId>com.google.protobuf</groupId>
            <artifactId>protobuf-java</artifactId>
            <version>2.5.0</version>
        </dependency>
        <dependency>
            <groupId>io.netty</groupId>
            <artifactId>netty</artifactId>
            <version>3.6.6.Final</version>
        </dependency>

        <dependency>
            <groupId>org.apache.zookeeper</groupId>
            <artifactId>zookeeper</artifactId>
            <version>3.4.5</version>
        </dependency>
        <dependency>
            <groupId>org.cloudera.htrace</groupId>
            <artifactId>htrace-core</artifactId>
            <version>2.01</version>
        </dependency>


    <!--    <dependency>
            <groupId>org.apache.hbase</groupId>
            <artifactId>hbase</artifactId>
            <version>2.0.0-SNAPSHOT</version>
            <type>pom</type>
        </dependency> -->



        <!-- hbase dep start -->
        <!-- https://mvnrepository.com/artifact/org.apache.hbase/hbase -->
<!--        <dependency>
            <groupId>org.apache.hbase</groupId>
            <artifactId>hbase</artifactId>
            <version>1.2.0</version>
            <type>pom</type>
        </dependency> -->



<!--        <dependency>
            <groupId>org.apache.hbase</groupId>
            <artifactId>hbase-common</artifactId>
            <version>1.0.0</version>
        </dependency> -->
        <dependency>
            <groupId>org.apache.hbase</groupId>
            <artifactId>hbase-server</artifactId>
            <version>1.2.0</version>
        </dependency>
    </dependencies>
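To see which HBase artifacts a POM like the one above actually resolves to, one option is Maven's dependency tree (a sketch; it assumes Maven is installed and is run from the project root, and only prints a hint elsewhere):

```shell
# Print only the HBase artifacts on the effective compile classpath,
# so the resolved versions can be compared against the cluster's HBase.
if command -v mvn >/dev/null && [ -f pom.xml ]; then
  mvn -q dependency:tree -Dincludes=org.apache.hbase
else
  echo "run from the Maven project root: mvn dependency:tree -Dincludes=org.apache.hbase"
fi
```

If the tree shows an HBase version other than the one deployed on the cluster, that mismatch is the usual source of NoSuchMethodError at runtime.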

Answer 1 (score: 0)

The HBase-related jar files must be included with the MapReduce job. See this Cloudera blog for more details.
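That advice can be sketched at submit time as follows. Assumptions: my-job.jar is a placeholder, the main class is taken from the question's stack trace, the hbase and hadoop launcher scripts are on the PATH, and the script is guarded so it only prints a hint elsewhere:

```shell
# Two common ways to get the HBase jars onto a MapReduce job's classpath.
if command -v hbase >/dev/null && command -v hadoop >/dev/null; then
  # 1) Prepend the full HBase client classpath at submit time.
  HADOOP_CLASSPATH="$(hbase classpath)" hadoop jar my-job.jar com.atm_ex.atm_ex.App

  # 2) Ship only the minimal HBase jars with the job ("hbase mapredcp"
  #    prints a colon-separated list; -libjars expects commas).
  hadoop jar my-job.jar com.atm_ex.atm_ex.App \
      -libjars "$(hbase mapredcp | tr ':' ',')"
else
  echo "hbase/hadoop not on PATH; commands shown for reference only"
fi
```

Either way, the jars actually shipped with the job come from the installed HBase, which keeps the runtime classpath consistent with the cluster.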

Answer 2 (score: 0)

I have run into this error too, but in a different situation: I compiled my jar against hbase-0.98 while running it against hbase-1.0.1.1.

Eventually I found that in hbase-0.98 the method signature is

    public void addFamily(HColumnDescriptor family)

but in hbase-1.0.1.1 it is

    public HTableDescriptor addFamily(HColumnDescriptor family)

They are different! Since HBase 1.0, addFamily returns the descriptor itself so that calls can be chained, which is exactly why code compiled against the old void-returning signature (note the ")V" in the stack trace) fails at runtime.

To be safe, you should compile and run your jar against the same version of HBase.
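A generic JDK trick that helps diagnose this class of problem is to print where a suspect class was actually loaded from; if the path points at an unexpected HBase jar, the runtime classpath is the culprit. This is a minimal sketch (the class name WhichJar is made up for illustration; in the real case you would pass org.apache.hadoop.hbase.HTableDescriptor.class):

```java
// Minimal sketch: report which jar (or classpath entry) a class was loaded
// from, useful when a NoSuchMethodError suggests version skew.
public class WhichJar {
    static String locationOf(Class<?> c) {
        java.security.CodeSource cs = c.getProtectionDomain().getCodeSource();
        // Bootstrap-loaded classes (e.g. java.lang.String) have no CodeSource.
        return (cs == null) ? "(bootstrap classpath)" : cs.getLocation().toString();
    }

    public static void main(String[] args) {
        // In the real case: locationOf(HTableDescriptor.class)
        System.out.println(locationOf(WhichJar.class));
    }
}
```

Running this inside the failing job (with the HBase class substituted) shows which hbase-client jar won on the classpath.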