We are trying to import data from an Oracle DB into Hive, but we are running into a ClassNotFoundException.
The Hadoop version is:
$ hadoop version
Hadoop 2.5.1
Subversion https://git-wip-us.apache.org/repos/asf/hadoop.git -r 2e18d179e4a8065b6a9f29cf2de9451891265cce
Compiled by jenkins on 2014-09-05T23:11Z
Compiled with protoc 2.5.0
From source with checksum 6424fcab95bfff8337780a181ad7c78
This command was run using /usr/local/hadoop/share/hadoop/common/hadoop-common-2.5.1.jar
The Sqoop version is:
$ ./sqoop-version
Warning: /usr/local/sqoop/../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /usr/local/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
15/02/17 16:10:51 INFO sqoop.Sqoop: Running Sqoop version: 1.4.5
Sqoop 1.4.5
git commit id 5b34accaca7de251fc91161733f906af2eddbe83
Compiled by abe on Fri Aug 1 11:19:26 PDT 2014
The classpath has been set:
$ echo $CLASSPATH
:/home/huser/data-integration/lib:/home/huser/data-integration/plugins/pentaho-big-data-plugin:/home/huser/data-integration/plugins/pentaho-big-data-plugin/lib:/home/huser/data-integration/plugins/pentaho-big-data-plugin/hadoop-configurations/hadoop-20:/home/huser/data-integration/plugins/pentaho-big-data-plugin/hadoop-configurations/hadoop-20/lib:/home/huser/data-integration/plugins/pentaho-big-data-plugin/hadoop-configurations/hadoop-20/lib/client:/usr/local/sqoop/lib:/usr/local/sqoop/lib
The Sqoop command:
sqoop import -libjars=/usr/local/sqoop/lib/ojdbc7.jar --connect jdbc:oracle:thin:@SOME_IP:1521:orcl --table PRODUCT_DETAILS --target-dir /tmp/hive-huser/test --split-by <column-name> --username <username> --password <password> --verbose --bindir "/usr/local/sqoop/class/"
The error we get is:
15/02/17 15:11:18 WARN mapred.LocalJobRunner: job_local1789263485_0001
java.lang.Exception: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class PRODUCT_DETAILS not found
at org.apache.hadoop.mapred.LocalJobRunner$Job.runTasks(LocalJobRunner.java:462)
at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:522)
Caused by: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class PRODUCT_DETAILS not found
at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:1905)
at org.apache.sqoop.mapreduce.db.DBConfiguration.getInputClass(DBConfiguration.java:403)
at org.apache.sqoop.manager.oracle.OraOopDataDrivenDBInputFormat.createDBRecordReader(OraOopDataDrivenDBInputFormat.java:187)
at org.apache.sqoop.mapreduce.db.DBInputFormat.createRecordReader(DBInputFormat.java:263)
at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.<init>(MapTask.java:492)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:735)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:340)
at org.apache.hadoop.mapred.LocalJobRunner$Job$MapTaskRunnable.run(LocalJobRunner.java:243)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.ClassNotFoundException: Class PRODUCT_DETAILS not found
at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:1811)
at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:1903)
... 12 more
15/02/17 15:11:19 INFO mapreduce.Job: Job job_local1789263485_0001 failed with state FAILED due to: NA
15/02/17 15:11:19 INFO mapreduce.Job: Counters: 0
15/02/17 15:11:19 WARN mapreduce.Counters: Group FileSystemCounters is deprecated. Use org.apache.hadoop.mapreduce.FileSystemCounter instead
15/02/17 15:11:19 INFO mapreduce.ImportJobBase: Transferred 0 bytes in 7.0605 seconds (0 bytes/sec)
15/02/17 15:11:19 WARN mapreduce.Counters: Group org.apache.hadoop.mapred.Task$Counter is deprecated. Use org.apache.hadoop.mapreduce.TaskCounter instead
15/02/17 15:11:19 INFO mapreduce.ImportJobBase: Retrieved 0 records.
15/02/17 15:11:19 ERROR tool.ImportTool: Error during import: Import job failed!
Any insight would be much appreciated.
Please let me know if you need any more information.
Thanks, PRAS
Answer 0 (score: 1):
Check whether the JDBC driver is under $SQOOP_HOME/lib. The JDBC driver jar must be present in the Sqoop lib directory; Sqoop loads classes from the jars in that location when it runs the MapReduce job.
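For example, a minimal sketch of that check, assuming $SQOOP_HOME is /usr/local/sqoop and the Oracle driver jar is named ojdbc7.jar as in the command above (adjust the paths and the placeholder values for your environment):

$ ls $SQOOP_HOME/lib | grep -i ojdbc          # verify the Oracle JDBC driver jar is present
$ cp /path/to/ojdbc7.jar $SQOOP_HOME/lib/     # copy it in only if it is missing
$ sqoop import --connect jdbc:oracle:thin:@SOME_IP:1521:orcl --table PRODUCT_DETAILS --target-dir /tmp/hive-huser/test --split-by <column-name> --username <username> --password <password> --verbose

With the driver jar sitting in $SQOOP_HOME/lib, Sqoop picks it up automatically, so passing it again via -libjars should not be needed.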