Importing data from MySQL into HDFS with Sqoop

Date: 2017-06-15 00:36:36

Tags: mysql hadoop sqoop

I am using Hadoop-1.2.1 and Sqoop-1.4.6. I tried to import the table test from the database meshtree into HDFS with the following command:

`sqoop import --connect jdbc:mysql://localhost/meshtree --username user --password password --table test`

However, it fails with this error:

17/06/17 18:15:21 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
17/06/17 18:15:21 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
17/06/17 18:15:21 INFO tool.CodeGenTool: Beginning code generation
17/06/17 18:15:22 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `test` AS t LIMIT 1
17/06/17 18:15:22 INFO orm.CompilationManager: HADOOP_HOME is /home/student/Installations/hadoop-1.2.1/libexec/..
Note: /tmp/sqoop-student/compile/6bab6efaa3dc60e67a50885b26c1d14b/test.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
17/06/17 18:15:24 ERROR orm.CompilationManager: Could not rename /tmp/sqoop-student/compile/6bab6efaa3dc60e67a50885b26c1d14b/test.java to /home/student/Installations/hadoop-1.2.1/./test.java
org.apache.commons.io.FileExistsException: Destination '/home/student/Installations/hadoop-1.2.1/./test.java' already exists
    at org.apache.commons.io.FileUtils.moveFile(FileUtils.java:2378)
    at org.apache.sqoop.orm.CompilationManager.compile(CompilationManager.java:227)
    at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:83)
    at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:367)
    at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:453)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:229)
    at org.apache.sqoop.Sqoop.main(Sqoop.java:238)
    at com.cloudera.sqoop.Sqoop.main(Sqoop.java:57)
17/06/17 18:15:24 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-student/compile/6bab6efaa3dc60e67a50885b26c1d14b/test.jar
17/06/17 18:15:24 WARN manager.MySQLManager: It looks like you are importing from mysql.
17/06/17 18:15:24 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
17/06/17 18:15:24 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
17/06/17 18:15:24 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
17/06/17 18:15:24 INFO mapreduce.ImportJobBase: Beginning import of test
17/06/17 18:15:27 INFO mapred.JobClient: Cleaning up the staging area hdfs://localhost:9000/home/student/Installations/hadoop-1.2.1/data/mapred/staging/student/.staging/job_201706171814_0001
17/06/17 18:15:27 ERROR security.UserGroupInformation: PriviledgedActionException as:student cause:org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory test already exists
17/06/17 18:15:27 ERROR tool.ImportTool: Encountered IOException running import job: org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory test already exists
    at org.apache.hadoop.mapreduce.lib.output.FileOutputFormat.checkOutputSpecs(FileOutputFormat.java:137)
    at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:973)
    at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:936)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1190)
    at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:936)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:550)
    at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:580)
    at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:141)
    at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:201)
    at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:413)
    at org.apache.sqoop.manager.MySQLManager.importTable(MySQLManager.java:97)
    at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:380)
    at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:453)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:229)
    at org.apache.sqoop.Sqoop.main(Sqoop.java:238)
    at com.cloudera.sqoop.Sqoop.main(Sqoop.java:57)

Is there any way to resolve this problem?

3 Answers:

Answer 0 (Score: 1)

If you intend to use Sqoop with a distributed Hadoop cluster, be sure not to use the URL localhost. The connect string you supply is used by the TaskTracker nodes across the whole MapReduce cluster; if you specify the literal name localhost, each node will connect to a different database (or, more likely, to no database at all). Instead, use the full hostname or IP address of the database host, one that is visible to all the remote nodes.

See the Connecting to a Database Server section of the Sqoop documentation for more information.
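As a sketch, the connect string could look like this (db.example.com is a placeholder for a hostname that every node in the cluster can resolve; not taken from the question):

```shell
# Hypothetical hostname: substitute the real address of your MySQL host.
# -P prompts for the password instead of exposing it on the command line,
# as the WARN line in the log above suggests.
sqoop import \
  --connect jdbc:mysql://db.example.com:3306/meshtree \
  --username user -P \
  --table test
```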

Answer 1 (Score: 1)

You don't have sufficient privileges, so ask your MySQL DBA to grant them to you. Alternatively, if you have administrator access to MySQL, you can do it yourself:

grant all privileges on databasename.* to 'username'@'%' identified by 'password';

* - applies to all tables; % - allows connections from any host

The syntax above grants privileges to a user on the MySQL server. In your case it would be:

grant all privileges on meshtree.test to 'root'@'localhost' identified by 'yourpassword';
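As a quick sanity check (not part of the original answer), you could confirm that the grant took effect from the mysql command-line client:

```shell
# Prompts for the root password, then lists the privileges currently
# held by 'root'@'localhost', which should now include meshtree.test.
mysql -u root -p -e "SHOW GRANTS FOR 'root'@'localhost';"
```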

Answer 2 (Score: 0)

  • You are importing without specifying a target directory in HDFS. When no target directory is given, Sqoop creates a directory in HDFS named after your MySQL table, so the import can only be run once.

So this query:

sqoop import --connect jdbc:mysql://localhost/meshtree --username user --password password --table test

creates a directory named test (the table name) in HDFS, which is why the second run fails with FileAlreadyExistsException.
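If you would rather keep the default directory name, one option (assuming the paths shown in the log, and that losing the old output is acceptable) is to delete the leftover files before re-running:

```shell
# Remove the existing HDFS output directory.
# Hadoop 1.x syntax; on Hadoop 2+ use: hadoop fs -rm -r test
hadoop fs -rmr test
# Also remove the stale generated class file that caused the earlier
# "Could not rename ... test.java" error.
rm -f /home/student/Installations/hadoop-1.2.1/test.java
```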
  • Otherwise, just add --target-dir to the command:

sqoop import --connect jdbc:mysql://localhost/meshtree --username user --password password --table test --target-dir test1

Hopefully this works. For more details, see the Sqoop documentation on sqoop import and the related Sqoop tools.
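Once the import succeeds, a quick way to verify the result in HDFS (assuming the test1 target directory from the command above):

```shell
hadoop fs -ls test1                      # one part-m-* file per map task
hadoop fs -cat 'test1/part-m-*' | head   # peek at the first imported rows
```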