insert into table hivetest2 select * from hivetest: not working in Hive 0.14 when both tables are transactional

Date: 2015-04-16 18:54:01

Tags: hadoop hive

I am trying to run an insert into ... select in Hive 0.14 between two tables, hivetest and hivetest2, both of which are transactional tables. It does not work when both tables are transactional. Below are the queries I used.

I have set the following parameters:

        //setting up parameters for ACID transactions
        set hive.support.concurrency=true;
        set hive.enforce.bucketing=true;
        set hive.exec.dynamic.partition.mode=nonstrict;
        set hive.txn.manager=org.apache.hadoop.hive.ql.lockmgr.DbTxnManager;
        set hive.compactor.initiator.on=true;
        set hive.compactor.worker.threads=2;

        //creating first transaction table
        create table hivetest(key int, value String, Department String)
          clustered by (department) into 3 buckets
          stored as orc TBLPROPERTIES ('transactional'='true');

        //creating second transaction table
        create table hivetest2(key int, value String, Department String)
          clustered by (department) into 3 buckets
          stored as orc TBLPROPERTIES ('transactional'='true');

        //inserting data into table hivetest
        insert into table hivetest values (1,'jon','ABC'), (2,'rec','EFG');

Finally, when I executed the insert query below:

        //executing the insert into ... select command
        insert into table hivetest2 select * from hivetest;

I am getting the following exception:

        Query ID = A567812_20150416131818_1a260b18-f699-4b0a-ae66-94e07fcfa710
        Total jobs = 1
        Launching Job 1 out of 1
        Number of reduce tasks is set to 0 since there's no reduce operator
        java.lang.RuntimeException: serious problem
                at org.apache.hadoop.hive.ql.io.orc.OrcInputFormat$Context.waitForTasks(OrcInputFormat.java:478)
                at org.apache.hadoop.hive.ql.io.orc.OrcInputFormat.generateSplitsInfo(OrcInputFormat.java:949)
                at org.apache.hadoop.hive.ql.io.orc.OrcInputFormat.getSplits(OrcInputFormat.java:974)
                at org.apache.hadoop.hive.ql.io.BucketizedHiveInputFormat.getSplits(BucketizedHiveInputFormat.java:148)
                at org.apache.hadoop.mapreduce.JobSubmitter.writeOldSplits(JobSubmitter.java:624)
                at org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:616)
                at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:492)
                at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1296)
                at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1293)
                at java.security.AccessController.doPrivileged(Native Method)
                at javax.security.auth.Subject.doAs(Subject.java:415)
                at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
                at org.apache.hadoop.mapreduce.Job.submit(Job.java:1293)
                at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:562)
                at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:557)
                at java.security.AccessController.doPrivileged(Native Method)
                at javax.security.auth.Subject.doAs(Subject.java:415)
                at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
                at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:557)
                at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:548)
                at org.apache.hadoop.hive.ql.exec.mr.ExecDriver.execute(ExecDriver.java:429)
                at org.apache.hadoop.hive.ql.exec.mr.MapRedTask.execute(MapRedTask.java:137)
                at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
                at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:85)
                at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1604)
                at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1364)
                at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1177)
                at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1004)
                at org.apache.hadoop.hive.ql.Driver.run(Driver.java:994)
                at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:247)
                at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:199)
                at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:410)
                at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:783)
                at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:677)
                at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:616)
                at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
                at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
                at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
                at java.lang.reflect.Method.invoke(Method.java:606)
                at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
                at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
        Caused by: java.lang.IllegalArgumentException: delta_0000352_0000352 does not start with base_
                at org.apache.hadoop.hive.ql.io.AcidUtils.parseBase(AcidUtils.java:136)
                at org.apache.hadoop.hive.ql.io.AcidUtils.parseBaseBucketFilename(AcidUtils.java:164)
                at org.apache.hadoop.hive.ql.io.orc.OrcInputFormat$FileGenerator.run(OrcInputFormat.java:544)
                at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
                at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
                at java.lang.Thread.run(Thread.java:744)
        Job Submission failed with exception 'java.lang.RuntimeException(serious problem)'
        FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask

Please help me find a solution for this problem. I know that for a bucketed transactional table there should be a base_ directory, but it is not created when I insert data into my table.
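For reference, a minimal sketch of how the table directory can be inspected and a base_ directory requested, assuming the default warehouse path /user/hive/warehouse and the default database (both assumptions about this setup); it uses the standard dfs, alter table ... compact, and show compactions commands:

        //listing the table directory to see which base_/delta_ directories exist
        //(assumes the default warehouse path; adjust to your metastore configuration)
        dfs -ls /user/hive/warehouse/hivetest;

        //requesting a major compaction, which should eventually write a base_ directory
        alter table hivetest compact 'major';

        //checking the state of the compaction request
        show compactions;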

0 Answers:

There are no answers yet.