I am new to the Hadoop world and have just started learning about Hadoop. I get the following error when using Sqoop to import data from MySQL into HDFS:
sqoop:000> sqoop import --connect jdbc:mysql://localhost/books --username root --password thanks --table authors --m 1;
Exception has occurred during processing command
Exception: org.codehaus.groovy.control.MultipleCompilationErrorsException Message: startup failed:
groovysh_parse: 1: expecting EOF, found 'import' @ line 1, column 7.
sqoop import --connect jdbc:mysql://localhost/books --username root --password thanks --table authors --m 1;
^
1 error
Can you help me resolve this error?
Answer 0 (score: 2)
It looks like you are using Sqoop 2. Its shell is a Groovy-based interpreter (hence the groovysh_parse error in your stack trace), so it does not accept Sqoop 1-style "sqoop import ..." command lines. You need to follow the steps below instead.
Step 1: Check whether Sqoop is installed correctly:
sqoop:000> show version --all
You should get a response like this:
Server version: Sqoop 2.0.0-SNAPSHOT revision Unknown
  Compiled by jarcec on Wed Nov 21 16:15:51 PST 2012
Client version: Sqoop 2.0.0-SNAPSHOT revision Unknown
  Compiled by jarcec on Wed Nov 21 16:15:51 PST 2012
Protocol version: [1]
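If show version --all only prints the client version or cannot reach the server, the shell is probably not pointed at your Sqoop 2 server yet. A minimal sketch, assuming the server runs on localhost with the default port 12000 and webapp name sqoop:
sqoop:000> set server --host localhost --port 12000 --webapp sqoop
sqoop:000> show version --all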
Step 2: Check the connectors available on the Sqoop server:
sqoop:000> show connector --all
1 connector(s) to show:
Connector with id 1:
  Name: generic-jdbc-connector
  Class: org.apache.sqoop.connector.jdbc.GenericJdbcConnector
  Supported job types: [EXPORT, IMPORT]
Step 3: Create a connection:
sqoop:000> create connection --cid 1
Creating connection for connector with id 1
Please fill following values to create new connection object
Name: First connection

Configuration configuration
JDBC Driver Class: com.mysql.jdbc.Driver
JDBC Connection String: jdbc:mysql://mysql.server/database
Username: sqoop
Password: *****
JDBC Connection Properties:
There are currently 0 values in the map:
entry#

Security related configuration options
Max connections: 0

New connection was successfully created with validation status FINE and persistent id 1
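To double-check the connection you just created, you can list it back by its persistent id (an illustrative example; the id 1 comes from the output above):
sqoop:000> show connection --xid 1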
Step 4: Now create a job to import the data. At the end it will also prompt for extractors and loaders; use 1 as the value for both.
sqoop:000> create job --xid 1 --type import
Creating job for connection with id 1
Please fill following values to create new job object
Name: First job

Database configuration
Table name: users
Table SQL statement:
Table column names:
Partition column name:
Boundary query:

Output configuration
Storage type:
  0 : HDFS
Choose: 0
Output directory: /user/jarcec/users

New job was successfully created with validation status FINE and persistent id 1
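You can likewise inspect the job definition before running it (again an illustrative example, using the persistent id reported above):
sqoop:000> show job --jid 1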
Step 5: Now start the job:
sqoop:000> start job --jid 1
and it will import your data.
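While the job is running you can poll its progress, and stop it if necessary. A sketch assuming the same job id 1:
sqoop:000> status job --jid 1
sqoop:000> stop job --jid 1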
Answer 1 (score: 0)
You need to pass the --target-dir argument with the HDFS path to which Sqoop should copy the MySQL records.
Try:
sqoop import --connect jdbc:mysql://localhost/books --username root --password thanks --table authors --target-dir /mysqlCopy --m 1;
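If the import completes successfully, you can verify the copied records on HDFS (a sketch; the part-m-00000 file name assumes the usual map-only output naming):
hdfs dfs -ls /mysqlCopy
hdfs dfs -cat /mysqlCopy/part-m-00000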