Trying to import tables from MySQL using Sqoop

Time: 2016-05-22 16:27:23

Tags: mysql hadoop import hdfs sqoop

I am trying to use Sqoop to import a database from MySQL. When I run it, I get the error below. Can anyone help me?

[root@sandbox ~]# sqoop import-all-tables -m 12 --connect "jdbc:mysql://sandbox.hortonworks.com:3306/retail_db" --username=retail_dba -P --as-avrodatafile --warehouse-dir=/apps/hive/warehouse/retail_stage.db
Warning: /usr/hdp/2.4.0.0-169/accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
16/05/22 16:12:48 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6.2.4.0.0-169
Enter password:
16/05/22 16:12:52 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/2.4.0.0-169/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.4.0.0-169/zookeeper/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
16/05/22 16:12:53 INFO tool.CodeGenTool: Beginning code generation
16/05/22 16:12:53 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `categories` AS t LIMIT 1
16/05/22 16:12:53 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `categories` AS t LIMIT 1
16/05/22 16:12:53 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/hdp/2.4.0.0-169/hadoop-mapreduce
Note: /tmp/sqoop-root/compile/9a872a9731ab3cb2920a1910153051ff/categories.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
16/05/22 16:12:58 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-root/compile/9a872a9731ab3cb2920a1910153051ff/categories.jar
16/05/22 16:12:58 WARN manager.MySQLManager: It looks like you are importing from mysql.
16/05/22 16:12:58 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
16/05/22 16:12:58 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
16/05/22 16:12:58 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
16/05/22 16:12:58 INFO mapreduce.ImportJobBase: Beginning import of categories
16/05/22 16:13:01 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `categories` AS t LIMIT 1
16/05/22 16:13:01 INFO mapreduce.DataDrivenImportJob: Writing Avro schema file: /tmp/sqoop-root/compile/9a872a9731ab3cb2920a1910153051ff/categories.avsc
16/05/22 16:13:02 INFO impl.TimelineClientImpl: Timeline service address: http://sandbox.hortonworks.com:8188/ws/v1/timeline/
16/05/22 16:13:02 INFO client.RMProxy: Connecting to ResourceManager at sandbox.hortonworks.com/10.0.0.4:8050
16/05/22 16:13:04 INFO ipc.Client: Retrying connect to server: sandbox.hortonworks.com/10.0.0.4:8050. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
16/05/22 16:13:05 INFO ipc.Client: Retrying connect to server: sandbox.hortonworks.com/10.0.0.4:8050. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
16/05/22 16:13:06 INFO ipc.Client: Retrying connect to server: sandbox.hortonworks.com/10.0.0.4:8050. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
16/05/22 16:13:07 INFO ipc.Client: Retrying connect to server: sandbox.hortonworks.com/10.0.0.4:8050. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
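
The job never gets past these retries; the client just keeps trying to reach the ResourceManager at sandbox.hortonworks.com:8050. A quick sanity check from the sandbox shell (only a minimal sketch, assuming root access as in the prompt above; the host and port are taken straight from the log) would be:

# confirm the hostname resolves to the address shown in the log (10.0.0.4)
ping -c 1 sandbox.hortonworks.com

# see whether anything is actually listening on the ResourceManager port
netstat -tlnp | grep 8050

# list the running Hadoop daemons; a healthy sandbox should show both NameNode and ResourceManager
jps

If jps does not show a ResourceManager, YARN is not running and the Sqoop job cannot be submitted.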

1 Answer:

Answer 0 (score: 0)

It looks like there is some issue with the NameNode.

Please check the NameNode log at $HADOOP_HOME/logs/namenode.log
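
As a concrete way to do that (a small sketch; the exact file name varies by install, and on many Hadoop distributions the NameNode log is named hadoop-<user>-namenode-<hostname>.log rather than namenode.log):

# scan the most recent NameNode log entries for errors or exceptions
tail -n 200 $HADOOP_HOME/logs/hadoop-*-namenode-*.log | grep -iE "error|exception"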

Check whether Hadoop is up at http://localhost:50070
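
If there is no browser inside the VM, the same check can be done from the shell with curl. Since the retries in the log above are against the ResourceManager at port 8050, the ResourceManager web UI is worth checking as well; this is only a sketch and assumes the default UI ports:

# NameNode web UI; an HTTP 200 means the NameNode is serving
curl -s -o /dev/null -w "%{http_code}\n" http://localhost:50070

# ResourceManager web UI (default port 8088); if this fails, YARN is likely down
curl -s -o /dev/null -w "%{http_code}\n" http://localhost:8088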