I am using Hadoop 2.6.0, Hive 1.2.1 and Sqoop 1.4.6. When importing the emp table from MySQL into Hive, it copies the table data to /user/hive/warehouse/\temp, but then it throws the error below.
The command I am running is:
sqoop import --connect jdbc:mysql://localhost/test --username root --password hadoop --table emp -m 1 --hive-import --warehouse-dir /user/hive/warehouse/
[main] mapreduce.Job: Running job: job_1473826602802_0002
2016-09-13 22:32:20,943 INFO [main] mapreduce.Job: Job job_1473826602802_0002 running in uber mode : false
2016-09-13 22:32:21,005 INFO [main] mapreduce.Job: map 0% reduce 0%
2016-09-13 22:32:59,525 INFO [main] mapreduce.Job: map 100% reduce 0%
2016-09-13 22:33:01,727 INFO [main] mapreduce.Job: Job job_1473826602802_0002 completed successfully
2016-09-13 22:33:03,190 ERROR [main] tool.ImportTool: Imported Failed: No enum constant org.apache.hadoop.mapreduce.JobCounter.MB_MILLIS_MAPS
How can I resolve this error? Please suggest if I need to change any settings.
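For context on what the message means: "No enum constant" is what Java's Enum.valueOf throws when the enum class actually loaded at runtime does not contain the requested constant. In this case it suggests the JobCounter class Sqoop is seeing comes from an older Hadoop MapReduce jar that predates the MB_MILLIS_MAPS counter, i.e. a version mismatch on the classpath, not a problem with the data itself (note the import job completed successfully before the error). A minimal, Sqoop-independent sketch of the mechanism, using a hypothetical local enum in place of the real JobCounter:

```java
// Simulates an older JobCounter enum that lacks MB_MILLIS_MAPS,
// as an old hadoop-mapreduce jar on the classpath would.
enum OldJobCounter { TOTAL_LAUNCHED_MAPS, TOTAL_LAUNCHED_REDUCES }

public class EnumMismatchDemo {
    public static void main(String[] args) {
        try {
            // Looking up a constant the loaded enum does not define...
            OldJobCounter.valueOf("MB_MILLIS_MAPS");
        } catch (IllegalArgumentException e) {
            // ...throws with the same "No enum constant ..." wording
            // seen in the Sqoop log above.
            System.out.println(e.getMessage());
        }
    }
}
```

If this is the cause, checking that only one Hadoop version's jars are visible to Sqoop (e.g. in Sqoop's lib directory and in HADOOP_CLASSPATH) would be the place to start.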