SparkHadoopWriter fails with NPE at UserProvider

Date: 2018-05-24 03:43:15

Tags: apache-spark hbase

I am using Spark to write data to HBase. I can read the data fine, but the write fails with the exception below. I found a similar issue that was resolved by adding the *site.xml files and the HBase JARs, but that does not work for me. I am reading data from one table and writing it to another table; the read works, but the exception is thrown while writing.

    JavaPairRDD<ImmutableBytesWritable, Put> tablePuts = hBaseRDD.mapToPair(
            new PairFunction<Tuple2<ImmutableBytesWritable, Result>, ImmutableBytesWritable, Put>() {
        @Override
        public Tuple2<ImmutableBytesWritable, Put> call(Tuple2<ImmutableBytesWritable, Result> results) throws Exception {
            byte[] accountId = results._2().getValue(Bytes.toBytes(COLFAMILY), Bytes.toBytes("accountId"));
            String rowKey = new String(results._2().getRow());
            String accountId2 = Bytes.toString(accountId);
            String prefix = getMd5Hash(rowKey);
            String newrowKey = prefix + rowKey;
            Put put = new Put(Bytes.toBytes(newrowKey));
            put.addColumn(Bytes.toBytes("def"), Bytes.toBytes("accountId"), accountId);
            // return the prefixed row key together with the Put for the output pair RDD
            return new Tuple2<>(new ImmutableBytesWritable(Bytes.toBytes(newrowKey)), put);
        }
    });

    Job newAPIJobConfiguration = Job.getInstance(conf);
    newAPIJobConfiguration.getConfiguration().set(TableOutputFormat.OUTPUT_TABLE, OUT_TABLE_NAME);
    newAPIJobConfiguration.setOutputFormatClass(org.apache.hadoop.hbase.mapreduce.TableOutputFormat.class);
    newAPIJobConfiguration.setOutputKeyClass(org.apache.hadoop.hbase.io.ImmutableBytesWritable.class);
    newAPIJobConfiguration.setOutputValueClass(org.apache.hadoop.io.Writable.class);
    tablePuts.saveAsNewAPIHadoopDataset(newAPIJobConfiguration.getConfiguration());
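
For reference, the "*site.xml" fix mentioned above usually amounts to making the HBase settings visible to the job configuration that is handed to saveAsNewAPIHadoopDataset. Below is a minimal sketch of that approach, assuming hbase-site.xml is on the driver classpath; the class and method names (HBaseJobConfSketch, buildJobConf) are illustrative and not part of my actual code.

    // Minimal sketch, not the original code: build the job configuration from
    // HBaseConfiguration so that hbase-site.xml values (ZooKeeper quorum,
    // security settings) are loaded before TableOutputFormat opens a connection.
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.mapreduce.TableOutputFormat;
    import org.apache.hadoop.mapreduce.Job;

    public class HBaseJobConfSketch {
        // Illustrative helper: returns a Configuration for saveAsNewAPIHadoopDataset.
        public static Configuration buildJobConf(String outputTable) throws java.io.IOException {
            // HBaseConfiguration.create() layers hbase-default.xml and hbase-site.xml
            // from the classpath on top of the Hadoop defaults.
            Configuration hbaseConf = HBaseConfiguration.create();

            Job job = Job.getInstance(hbaseConf);
            job.getConfiguration().set(TableOutputFormat.OUTPUT_TABLE, outputTable);
            job.setOutputFormatClass(TableOutputFormat.class);
            return job.getConfiguration();
        }
    }

Even with that, the failure happens before any data is written: the trace below shows TableOutputFormat.checkOutputSpecs creating an HBase connection, and the NPE is thrown inside UserProvider.instantiate.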

    Exception in thread "main" java.lang.NullPointerException
        at org.apache.hadoop.hbase.security.UserProvider.instantiate(UserProvider.java:123)
        at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:214)
        at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:119)
        at org.apache.hadoop.hbase.mapreduce.TableOutputFormat.checkOutputSpecs(TableOutputFormat.java:177)
        at org.apache.spark.internal.io.HadoopMapReduceWriteConfigUtil.assertConf(SparkHadoopWriter.scala:387)
        at org.apache.spark.internal.io.SparkHadoopWriter$.write(SparkHadoopWriter.scala:71)
        at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopDataset$1.apply$mcV$sp(PairRDDFunctions.scala:1083)
        at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopDataset$1.apply(PairRDDFunctions.scala:1081)
        at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopDataset$1.apply(PairRDDFunctions.scala:1081)
        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
        at org.apache.spark.rdd.RDD.withScope(RDD.scala:363)
        at org.apache.spark.rdd.PairRDDFunctions.saveAsNewAPIHadoopDataset(PairRDDFunctions.scala:1081)
        at org.apache.spark.api.java.JavaPairRDD.saveAsNewAPIHadoopDataset(JavaPairRDD.scala:831)
        at com.voicebase.etl.s3tohbase.HbaseScan2.main(HbaseScan2.java:148)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:879)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:197)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:227)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:136)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

0 Answers:

No answers yet