I have Hadoop-2.2.0 and have also installed Sqoop-1.4.5.
Queries work fine, but I cannot import or export HDFS data.
When I run the following command:
sqoop import --connect jdbc:mysql://192.168.103.104/testnotifications --username root --password frooty --table acklog --hive-import --hive-table phani -m 1 --target-dir /JobImport --fields-terminated-by '\t' --append;
I get this error:
15/04/02 18:41:43 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
15/04/02 18:41:44 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
15/04/02 18:41:44 INFO tool.CodeGenTool: Beginning code generation
15/04/02 18:41:44 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `acklog` AS t LIMIT 1
15/04/02 18:41:44 INFO orm.CompilationManager: HADOOP_HOME is /home/mani/hadoop/hadoop-2.2.0
Note: /tmp/sqoop-mani/compile/c643c6d9bdb752fb4bc96495192a289f/acklog.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
15/04/02 18:41:45 ERROR orm.CompilationManager: Could not rename /tmp/sqoop-mani/compile/c643c6d9bdb752fb4bc96495192a289f/acklog.java to /home/mani/./acklog.java
org.apache.commons.io.FileExistsException: Destination '/home/mani/./acklog.java' already exists
at org.apache.commons.io.FileUtils.moveFile(FileUtils.java:2378)
at org.apache.sqoop.orm.CompilationManager.compile(CompilationManager.java:227)
at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:83)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:368)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:454)
at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:229)
at org.apache.sqoop.Sqoop.main(Sqoop.java:238)
at com.cloudera.sqoop.Sqoop.main(Sqoop.java:57)
15/04/02 18:41:45 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-mani/compile/c643c6d9bdb752fb4bc96495192a289f/acklog.jar
15/04/02 18:41:45 WARN manager.MySQLManager: It looks like you are importing from mysql.
15/04/02 18:41:45 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
15/04/02 18:41:45 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
15/04/02 18:41:45 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
15/04/02 18:41:45 INFO mapreduce.ImportJobBase: Beginning import of acklog
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/mani/hadoop/hadoop-2.2.0/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/mani/hbase/hbase-1.0.0/lib/slf4j-log4j12-1.7.7.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
OpenJDK 64-Bit Server VM warning: You have loaded library /home/mani/hadoop/hadoop-2.2.0/lib/libhadoop.so.1.0.0 which might have disabled stack guard. The VM will try to fix the stack guard now.
It's highly recommended that you fix the library with 'execstack -c <libfile>', or link it with '-z noexecstack'.
15/04/02 18:41:46 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
15/04/02 18:41:46 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
15/04/02 18:41:46 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
15/04/02 18:41:46 INFO client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032
15/04/02 18:41:48 INFO mapreduce.JobSubmitter: Cleaning up the staging area /tmp/hadoop-yarn/staging/mani/.staging/job_1427951785300_0010
Exception in thread "main" java.lang.IncompatibleClassChangeError: Found interface org.apache.hadoop.mapreduce.JobContext, but class was expected
at org.apache.sqoop.config.ConfigurationHelper.getJobNumMaps(ConfigurationHelper.java:50)
at com.cloudera.sqoop.config.ConfigurationHelper.getJobNumMaps(ConfigurationHelper.java:36)
at org.apache.sqoop.mapreduce.db.DataDrivenDBInputFormat.getSplits(DataDrivenDBInputFormat.java:121)
at org.apache.hadoop.mapreduce.JobSubmitter.writeNewSplits(JobSubmitter.java:491)
at org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:508)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:392)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1268)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1265)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1265)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1286)
at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:119)
at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:179)
at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:413)
at org.apache.sqoop.manager.MySQLManager.importTable(MySQLManager.java:97)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:381)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:454)
at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:229)
at org.apache.sqoop.Sqoop.main(Sqoop.java:238)
at com.cloudera.sqoop.Sqoop.main(Sqoop.java:57)
Answer 0 (score: 0)
I finally found a solution. The IncompatibleClassChangeError occurs because my Sqoop build was compiled against Hadoop 1.x, where org.apache.hadoop.mapreduce.JobContext was a class; in Hadoop 2.x it became an interface. I installed sqoop-1.4.5.bin__hadoop-2.0.4-alpha, the build compiled for Hadoop 2.x, and it now works with Hadoop-2.2.0.
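For reference, a minimal sketch of that swap, assuming Hadoop lives under /home/mani/hadoop/hadoop-2.2.0 as in the log above; the download mirror, install directory, and exported variables are my assumptions, not part of the original post, so adjust them to your layout.

# Fetch the Sqoop 1.4.5 build compiled against Hadoop 2.x (mirror URL is illustrative)
wget https://archive.apache.org/dist/sqoop/1.4.5/sqoop-1.4.5.bin__hadoop-2.0.4-alpha.tar.gz
tar -xzf sqoop-1.4.5.bin__hadoop-2.0.4-alpha.tar.gz -C /home/mani

# Point the new Sqoop at the existing Hadoop 2.2.0 installation
export SQOOP_HOME=/home/mani/sqoop-1.4.5.bin__hadoop-2.0.4-alpha
export HADOOP_COMMON_HOME=/home/mani/hadoop/hadoop-2.2.0
export HADOOP_MAPRED_HOME=/home/mani/hadoop/hadoop-2.2.0
export PATH=$SQOOP_HOME/bin:$PATH

# Sanity check: this should print the Sqoop version without the IncompatibleClassChangeError
sqoop version

After this, the same sqoop import command from the question should submit the job to YARN instead of failing during split generation.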