Can you tell me the Maven dependency for newAPIHadoopRDD?

Time: 2016-02-17 14:17:04

Tags: apache-spark hbase apache-spark-sql spark-streaming

Can you tell me the Maven dependency for newAPIHadoopRDD? My code is:

MavenProject

Thanks in advance for your help.

1 Answer:

Answer 0 (score: 0)

You are using TextInputFormat, but it must be TableInputFormat to read from an HBase table. Also, which conf did you pass: a SparkConf or an org.apache.hadoop.conf.Configuration? newAPIHadoopRDD expects a Hadoop Configuration.

Maven dependencies:

    <dependency>
        <groupId>org.apache.hbase</groupId>
        <artifactId>hbase-client</artifactId>
        <version>${hbase.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hbase</groupId>
        <artifactId>hbase-common</artifactId>
        <version>${hbase.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hbase</groupId>
        <artifactId>hbase-server</artifactId>
        <version>${hbase.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hbase</groupId>
        <artifactId>hbase-hadoop-compat</artifactId>
        <version>${hbase.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hbase</groupId>
        <artifactId>hbase-protocol</artifactId>
        <version>${hbase.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hbase</groupId>
        <artifactId>hbase-hadoop2-compat</artifactId>
        <version>${hbase.version}</version>
    </dependency>
    <!-- HBase dependencies End -->
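Note that newAPIHadoopRDD itself is a Spark API (on SparkContext/JavaSparkContext), so spark-core must also be on the classpath alongside the HBase artifacts above. A sketch of that dependency, assuming a Scala 2.10 build of that era and a ${spark.version} property defined in your POM (adjust both to your build):

```xml
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>${spark.version}</version>
</dependency>
```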

Code:

// Required imports (assuming an existing JavaSparkContext sc):
// import org.apache.hadoop.conf.Configuration;
// import org.apache.hadoop.hbase.HBaseConfiguration;
// import org.apache.hadoop.hbase.client.Result;
// import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
// import org.apache.hadoop.hbase.mapreduce.TableInputFormat;
// import org.apache.spark.api.java.JavaPairRDD;

Configuration hbaseConf = HBaseConfiguration.create();
hbaseConf.set("hbase.zookeeper.quorum", "localhost");
hbaseConf.set(TableInputFormat.INPUT_TABLE, "table1");

JavaPairRDD<ImmutableBytesWritable, Result> routerRDD =
    sc.newAPIHadoopRDD(hbaseConf, TableInputFormat.class,
        ImmutableBytesWritable.class, Result.class);
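Each record in the resulting pair RDD is a row key (ImmutableBytesWritable) plus the full row (Result). A minimal sketch of consuming it, assuming a hypothetical column family "cf" and qualifier "col1" that are not part of the original question; replace them with your table's schema:

```java
// Assumed additional imports:
// import org.apache.hadoop.hbase.util.Bytes;
// import org.apache.spark.api.java.JavaRDD;

// Hypothetical column family and qualifier; replace with your schema.
byte[] family = Bytes.toBytes("cf");
byte[] qualifier = Bytes.toBytes("col1");

// Extract one cell value per row; getValue returns null if the cell is absent.
JavaRDD<String> values = routerRDD.map(tuple -> {
    Result result = tuple._2();
    byte[] cell = result.getValue(family, qualifier);
    return cell == null ? null : Bytes.toString(cell);
});

System.out.println("rows read: " + values.count());
```

count() forces the scan, so it is a quick way to confirm the connection to ZooKeeper and the table name are correct before adding real processing logic.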