Unable to import data using Sqoop

Date: 2017-11-01 07:47:38

Tags: mysql hadoop hive sqoop resourcemanager

I want to use Sqoop to import data from MySQL into a remote Hive. Sqoop is installed on a middleware machine. When I run this command:

sqoop import --driver com.mysql.jdbc.Driver --connect jdbc:mysql://192.168.2.146:3306/fir --username root -P -m 1 --table beard_size_list --connect jdbc:hive2://192.168.2.141:10000/efir --username oracle -P -m 1 --hive-table lnd_beard_size_list --hive-import;

Is this command correct for importing data from a remote MySQL database into a remote Hive?
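For reference, `sqoop import` takes a single `--connect` argument, and it must point at the source database; passing `--connect` a second time does not designate a Hive target (the later value generally just replaces the earlier one). The Hive side is addressed through `--hive-import` and `--hive-table` instead. A minimal sketch using the hosts and table names from the question (the db-qualified `efir.lnd_beard_size_list` form is an assumption; adjust to how your Hive database is laid out):

```shell
# Single --connect pointing at the MySQL source; Hive is the destination,
# selected via --hive-import (the Sqoop client must be configured against
# the Hadoop cluster that hosts Hive).
sqoop import \
  --connect jdbc:mysql://192.168.2.146:3306/fir \
  --driver com.mysql.jdbc.Driver \
  --username root -P \
  --table beard_size_list \
  -m 1 \
  --hive-import \
  --hive-table efir.lnd_beard_size_list
```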

When I run this command, it keeps trying to connect to the ResourceManager:

17/11/01 10:54:05 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6.2.6.1.0-129
Enter password: 
17/11/01 10:54:10 INFO tool.BaseSqoopTool: Using Hive-specific delimiters 
for output. You can override
17/11/01 10:54:10 INFO tool.BaseSqoopTool: delimiters with --fields-
terminated-by, etc.
17/11/01 10:54:10 WARN sqoop.ConnFactory: Parameter --driver is set to an 
explicit driver however appropriate connection manager is not being set (via 
--connection-manager). Sqoop is going to fall back to 
org.apache.sqoop.manager.GenericJdbcManager. Please specify explicitly which 
connection manager should be used next time.
17/11/01 10:54:10 INFO manager.SqlManager: Using default fetchSize of 1000
17/11/01 10:54:10 INFO tool.CodeGenTool: Beginning code generation
17/11/01 10:54:11 INFO manager.SqlManager: Executing SQL statement: SELECT 
t.* FROM beard_size_list AS t WHERE 1=0
17/11/01 10:54:11 INFO manager.SqlManager: Executing SQL statement: SELECT 
t.* FROM beard_size_list AS t WHERE 1=0
17/11/01 10:54:11 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is 
/usr/hdp/2.6.1.0-129/hadoop-mapreduce
Note: /tmp/sqoop-
oracle/compile/d93080265a09913fbfe9e06e92d314a3/beard_size_list.java uses or 
overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
17/11/01 10:54:15 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-
oracle/compile/d93080265a09913fbfe9e06e92d314a3/beard_size_list.jar
17/11/01 10:54:15 INFO mapreduce.ImportJobBase: Beginning import of 
beard_size_list
17/11/01 10:54:15 INFO Configuration.deprecation: mapred.jar is deprecated. 
Instead, use mapreduce.job.jar
17/11/01 10:54:15 INFO manager.SqlManager: Executing SQL statement: SELECT 
t.* FROM beard_size_list AS t WHERE 1=0
17/11/01 10:54:17 INFO Configuration.deprecation: mapred.map.tasks is 
deprecated. Instead, use mapreduce.job.maps
17/11/01 10:54:17 INFO client.RMProxy: Connecting to ResourceManager at 
hortonworksn2.com/192.168.2.191:8050
17/11/01 10:54:17 INFO client.AHSProxy: Connecting to Application History 
server at hortonworksn2.com/192.168.2.191:10200
17/11/01 10:54:19 INFO ipc.Client: Retrying connect to server: 
hortonworksn2.com/192.168.2.191:8050. Already tried 0 time(s); retry policy 
is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 
MILLISECONDS)
17/11/01 10:54:20 INFO ipc.Client: Retrying connect to server: 
hortonworksn2.com/192.168.2.191:8050. Already tried 1 time(s); retry policy 
is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 
MILLISECONDS)
17/11/01 10:54:21 INFO ipc.Client: Retrying connect to server: 
hortonworksn2.com/192.168.2.191:8050. Already tried 2 time(s); retry policy 
is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 
MILLISECONDS)
17/11/01 10:54:22 INFO ipc.Client: Retrying connect to server: 
hortonworksn2.com/192.168.2.191:8050. Already tried 3 time(s); retry policy 
is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 
MILLISECONDS)
17/11/01 10:54:23 INFO ipc.Client: Retrying connect to server: 
hortonworksn2.com/192.168.2.191:8050. Already tried 4 time(s); retry policy 
is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 
MILLISECONDS)

The port it tries to connect to is 8050, but the actual port is 8033. How can I fix this?

3 Answers:

Answer 0 (score: 1)

Try the following command:

sqoop import --driver com.mysql.jdbc.Driver --connect jdbc:mysql://192.168.2.146:3306/fir --username root -P -m 1 --table beard_size_list;

Answer 1 (score: 0)

Please check whether the following property is set correctly in yarn-site.xml:

<property>
  <name>yarn.resourcemanager.address</name>
  <value>192.168.2.191:8033</value>
</property>

Answer 2 (score: 0)

Why is the --connect option specified twice in your command? Try the following instead:

sqoop import --driver com.mysql.jdbc.Driver --connect jdbc:mysql://192.168.2.146:3306/fir --username root -P -m 1 --split-by beard_size_list_table_primary_key --table beard_size_list --target-dir /user/data/raw/beard_size_list --fields-terminated-by "," --hive-import --create-hive-table --hive-table dbschema.beard_size_list

Note:

create-hive-table - determines whether the job fails when the Hive table already exists. It works in this case; alternatively, you can create a Hive external table yourself and set the target-dir path.
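The external-table alternative mentioned above can be sketched as follows: first load the files into a fixed HDFS directory with Sqoop, then point an external Hive table at that directory. The paths, the `dbschema` database, and the single placeholder column are illustrative assumptions; the column list must mirror the actual MySQL table.

```shell
# Step 1 (assumption: same MySQL source as in the question): import the rows
# as comma-delimited text files into a chosen HDFS directory.
sqoop import \
  --connect jdbc:mysql://192.168.2.146:3306/fir \
  --driver com.mysql.jdbc.Driver \
  --username root -P \
  --table beard_size_list \
  -m 1 \
  --fields-terminated-by ',' \
  --target-dir /user/data/raw/beard_size_list

# Step 2: declare an external table over that directory; dropping the table
# later leaves the underlying files in place.
hive -e "
CREATE EXTERNAL TABLE IF NOT EXISTS dbschema.beard_size_list (
  id INT  -- placeholder; list every column of the MySQL table here
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
LOCATION '/user/data/raw/beard_size_list';
"
```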