Sqoop Merge: class name not found

Date: 2017-06-03 15:53:17

Tags: sqoop

I am getting a "class not found" error while running a Sqoop merge, as shown below. I used the path where the jar file was created, but I still get the error.

1. Sqoop import command, with the jar file creation path shown in the log below:

**sqoop import --connect jdbc:mysql://ip-172-31-13-154:3306/sqoopex --username sqoopuser --password NHkkP876rp --query "select * from departments where \$CONDITIONS" --target-dir /user/sakthimuruganv2214/sakthi/sqoop_import/departments_new --split-by department_id**
17/06/03 15:10:51 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6.2.3.4.0-3485
17/06/03 15:10:51 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
17/06/03 15:10:51 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
17/06/03 15:10:51 INFO tool.CodeGenTool: Beginning code generation
17/06/03 15:10:51 INFO manager.SqlManager: Executing SQL statement: select * from departments where  (1 = 0)
17/06/03 15:10:51 INFO manager.SqlManager: Executing SQL statement: select * from departments where  (1 = 0)
17/06/03 15:10:51 INFO manager.SqlManager: Executing SQL statement: select * from departments where  (1 = 0)
17/06/03 15:10:51 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/hdp/2.3.4.0-3485/hadoop-mapreduce
Note: /tmp/sqoop-sakthimuruganv2214/compile/eb54f9171acb69d7044867fba5396b7a/QueryResult.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
17/06/03 15:10:53 INFO orm.CompilationManager: **Writing jar file: /tmp/sqoop-sakthimuruganv2214/compile/eb54f9171acb69d7044867fba5396b7a/QueryResult.jar**
17/06/03 15:10:53 INFO mapreduce.ImportJobBase: Beginning query import.
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/2.3.4.0-3485/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.3.4.0-3485/zookeeper/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.3.4.0-3485/accumulo/lib/slf4j-log4j12.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
17/06/03 15:10:54 INFO impl.TimelineClientImpl: Timeline service address: http://ip-172-31-13-154.ec2.internal:8188/ws/v1/timeline/
17/06/03 15:10:54 INFO client.RMProxy: Connecting to ResourceManager at ip-172-31-53-48.ec2.internal/172.31.53.48:8050
17/06/03 15:10:57 INFO db.DBInputFormat: Using read commited transaction isolation
17/06/03 15:10:57 INFO db.DataDrivenDBInputFormat: BoundingValsQuery: SELECT MIN(department_id), MAX(department_id) FROM (select * from departments where  (1 = 1) ) AS t1
17/06/03 15:10:57 INFO mapreduce.JobSubmitter: number of splits:4
17/06/03 15:10:57 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1495368194937_3155
17/06/03 15:10:57 INFO impl.YarnClientImpl: Submitted application application_1495368194937_3155
17/06/03 15:10:57 INFO mapreduce.Job: The url to track the job: http://a.cloudxlab.com:8088/proxy/application_1495368194937_3155/
17/06/03 15:10:57 INFO mapreduce.Job: Running job: job_1495368194937_3155
17/06/03 15:11:05 INFO mapreduce.Job: Job job_1495368194937_3155 running in uber mode : false
17/06/03 15:11:05 INFO mapreduce.Job:  map 0% reduce 0%
17/06/03 15:11:10 INFO mapreduce.Job:  map 100% reduce 0%
17/06/03 15:11:11 INFO mapreduce.Job: Job job_1495368194937_3155 completed successfully
17/06/03 15:11:11 INFO mapreduce.Job: Counters: 30
        File System Counters
                FILE: Number of bytes read=0
                FILE: Number of bytes written=597796
                FILE: Number of read operations=0
                FILE: Number of large read operations=0
                FILE: Number of write operations=0
                HDFS: Number of bytes read=486
                HDFS: Number of bytes written=305
                HDFS: Number of read operations=16
                HDFS: Number of large read operations=0
                HDFS: Number of write operations=8
        Job Counters
                Launched map tasks=4
                Other local map tasks=4
                Total time spent by all maps in occupied slots (ms)=36480
                Total time spent by all reduces in occupied slots (ms)=0
                Total time spent by all map tasks (ms)=12160
                Total vcore-seconds taken by all map tasks=12160
                Total megabyte-seconds taken by all map tasks=18677760
        Map-Reduce Framework
                Map input records=25
                Map output records=25
                Input split bytes=486
                Spilled Records=0
                Failed Shuffles=0
                Merged Map outputs=0
                GC time elapsed (ms)=204
                CPU time spent (ms)=4940
                Physical memory (bytes) snapshot=878874624
                Virtual memory (bytes) snapshot=12966318080
                Total committed heap usage (bytes)=726138880
        File Input Format Counters
                Bytes Read=0
        File Output Format Counters
                Bytes Written=305
17/06/03 15:11:11 INFO mapreduce.ImportJobBase: Transferred 305 bytes in 16.9768 seconds (17.9657 bytes/sec)
17/06/03 15:11:11 INFO mapreduce.ImportJobBase: Retrieved 25 records.
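As an aside, the class-name mismatch seen later could also be avoided at import time: Sqoop's `--class-name` option works on `sqoop import` as well, naming the generated class explicitly instead of letting Sqoop default to `QueryResult`. A sketch of such an import (using `-P` instead of a plaintext password, as the warning in the log above suggests):

```
sqoop import \
  --connect jdbc:mysql://ip-172-31-13-154:3306/sqoopex \
  --username sqoopuser -P \
  --query "select * from departments where \$CONDITIONS" \
  --target-dir /user/sakthimuruganv2214/sakthi/sqoop_import/departments_new \
  --split-by department_id \
  --class-name departments
```

With this, the generated jar would contain a class named `departments`, matching the `--class-name departments` used later in the merge command.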

2. Sqoop merge command and the resulting error:

**sqoop merge --merge-key department_id --new-data /user/sakthimuruganv2214/sakthi/sqoop_import/departments_new --onto /user/sakthimuruganv2214/sakthi/sqoop_import/departments_update --target-dir /user/sakthimuruganv2214/sakthi/sqoop_import/departments_merge/ --class-name departments --jar-file  /tmp/sqoop-sakthimuruganv2214/compile/eb54f9171acb69d7044867fba5396b7a/QueryResult.jar**

17/06/03 15:16:06 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6.2.3.4.0-3485
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/2.3.4.0-3485/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.3.4.0-3485/zookeeper/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.3.4.0-3485/accumulo/lib/slf4j-log4j12.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
17/06/03 15:16:07 INFO impl.TimelineClientImpl: Timeline service address: http://ip-172-31-13-154.ec2.internal:8188/ws/v1/timeline/
17/06/03 15:16:07 INFO client.RMProxy: Connecting to ResourceManager at ip-172-31-53-48.ec2.internal/172.31.53.48:8050
17/06/03 15:16:09 INFO input.FileInputFormat: Total input paths to process : 5
17/06/03 15:16:09 INFO mapreduce.JobSubmitter: number of splits:5
17/06/03 15:16:09 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1495368194937_3156
17/06/03 15:16:10 INFO impl.YarnClientImpl: Submitted application application_1495368194937_3156
17/06/03 15:16:10 INFO mapreduce.Job: The url to track the job: http://a.cloudxlab.com:8088/proxy/application_1495368194937_3156/
17/06/03 15:16:10 INFO mapreduce.Job: Running job: job_1495368194937_3156
17/06/03 15:16:16 INFO mapreduce.Job: Job job_1495368194937_3156 running in uber mode : false
17/06/03 15:16:16 INFO mapreduce.Job:  map 0% reduce 0%
17/06/03 15:16:20 INFO mapreduce.Job: Task Id : attempt_1495368194937_3156_m_000002_0, Status : FAILED
**Error: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class departments not found
at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2195)
at org.apache.sqoop.mapreduce.MergeTextMapper.setup(MergeTextMapper.java:42)
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:143)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at **

1 Answer:

Answer 0: (score: 2)

Since you did not specify --class-name during the Sqoop import, Sqoop generated the default class name QueryResult (as the import log shows: it compiled QueryResult.java and wrote QueryResult.jar).

In the Sqoop merge command, replace --class-name departments with --class-name QueryResult.
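Applying that fix, the merge command would use the class name that the import actually generated, pointing at the same jar:

```
sqoop merge --merge-key department_id \
  --new-data /user/sakthimuruganv2214/sakthi/sqoop_import/departments_new \
  --onto /user/sakthimuruganv2214/sakthi/sqoop_import/departments_update \
  --target-dir /user/sakthimuruganv2214/sakthi/sqoop_import/departments_merge/ \
  --class-name QueryResult \
  --jar-file /tmp/sqoop-sakthimuruganv2214/compile/eb54f9171acb69d7044867fba5396b7a/QueryResult.jar
```

The key point is that `--class-name` must name a class that actually exists inside the jar passed to `--jar-file`; here the jar is QueryResult.jar, which contains the generated QueryResult class, not a class named `departments`.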