I am trying to import a table from MySQL into Hive, but I get the following error. Can you suggest a solution?
SqoopOptions loading .....
Import Tool running ....
14/03/18 06:48:34 WARN sqoop.ConnFactory: $SQOOP_CONF_DIR has not been set in the environment. Cannot check for additional configuration.
14/03/18 06:48:43 INFO mapred.JobClient: SPLIT_RAW_BYTES=87
14/03/18 06:48:43 INFO mapred.JobClient: Map output records=2
14/03/18 06:48:43 INFO mapreduce.ImportJobBase: Transferred 18 bytes in 5.5688 seconds (3.2323 bytes/sec)
14/03/18 06:48:43 INFO mapreduce.ImportJobBase: Retrieved 2 records.
14/03/18 06:48:43 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM student AS t WHERE 1=0
14/03/18 06:48:43 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM student AS t WHERE 1=0
14/03/18 06:48:43 INFO hive.HiveImport: Loading uploaded data into Hive
Warning: org.apache.hadoop.metrics.jvm.EventCounter is deprecated. Please use org.apache.hadoop.log.metrics.EventCounter in all the log4j.properties files.
Logging initialized using configuration in jar:file:/home/master/apps/hive-0.10.0/lib/hive-common-0.10.0.jar!/hive-log4j.properties
Hive history file=/tmp/master/hive_job_log_master_201403180648_1860851359.txt
FAILED: Error in metadata: MetaException(message:file:/user/hive/warehouse/student is not a directory or unable to create one)
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask
FAIL !!!
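A detail worth noticing in the MetaException above is the `file:` scheme: Hive resolved the warehouse path on the *local* filesystem, not HDFS, and then could not create a directory there. The failure mode is the usual "a plain file already sits where a directory is expected". A minimal local sketch of that failure, with the analogous HDFS fix commands in comments (the HDFS commands assume a running pseudo-distributed cluster and are not run here):

```shell
# Reproduce the "is not a directory or unable to create one" failure locally:
# if a stray *file* occupies the path, a directory cannot be created there.
tmp=$(mktemp -d)
touch "$tmp/warehouse"                      # a file where a directory is expected
if ! mkdir "$tmp/warehouse" 2>/dev/null; then
  echo "not a directory"                    # same situation the Hive DDLTask hits
fi
rm -rf "$tmp"
# On HDFS, the analogous check/fix would be (requires a running cluster):
#   hadoop fs -ls /user/hive
#   hadoop fs -mkdir /user/hive/warehouse
#   hadoop fs -chmod g+w /user/hive/warehouse
```

If the warehouse really should live on HDFS, also make sure `fs.default.name` in the loaded `core-site.xml` points at the NameNode, otherwise Hive falls back to `file:` paths as seen in the log.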
The code I wrote:
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

import com.cloudera.sqoop.SqoopOptions;
import com.cloudera.sqoop.tool.ImportTool;

public class SqoopJavaInterface {
    private static final String JOB_NAME = "Sqoop Hive Job";
    private static final String MAPREDUCE_JOB = "Hive Map Reduce Job";
    private static final String DBURL = "jdbc:mysql://localhost:3306/test";
    private static final String DRIVER = "com.mysql.jdbc.Driver";
    private static final String USERNAME = "root";
    private static final String PASSWORD = "root";
    private static final String HADOOP_HOME = "/home/master/apps/hadoop-1.0.4";
    private static final String JAR_OUTPUT_DIR = "/home/master/data";
    private static final String HIVE_HOME = "/home/master/apps/hive-0.10.0";
    private static final String HIVE_DIR = "/user/hive/warehouse/";
    private static final String WAREHOUSE_DIR = "hdfs://localhost:9000/user/hive/warehouse/student";
    private static final String SUCCESS = "SUCCESS !!!";
    private static final String FAIL = "FAIL !!!";

    /**
     * @param table the MySQL table to import into Hive
     * @throws IOException
     */
    public static void importToHive(String table) throws IOException {
        System.out.println("SqoopOptions loading .....");
        Configuration config = new Configuration();
        // Hadoop and Hive configuration resources
        config.addResource(new Path(HADOOP_HOME + "/conf/core-site.xml"));
        config.addResource(new Path(HADOOP_HOME + "/conf/hdfs-site.xml"));
        config.addResource(new Path(HIVE_HOME + "/conf/hive-site.xml"));
        FileSystem dfs = FileSystem.get(config);
        // MySQL connection parameters
        SqoopOptions options = new SqoopOptions(config);
        options.setConnectString(DBURL);
        options.setTableName(table);
        options.setDriverClassName(DRIVER);
        options.setUsername(USERNAME);
        options.setPassword(PASSWORD);
        options.setHadoopMapRedHome(HADOOP_HOME);
        options.setHiveHome(HIVE_HOME);
        options.setHiveImport(true);
        options.setHiveTableName(table);
        options.setOverwriteHiveTable(true);
        options.setFailIfHiveTableExists(false);
        options.setFieldsTerminatedBy(',');
        options.setDirectMode(true);
        options.setNumMappers(1); // number of mappers to launch for the job
        options.setWarehouseDir(WAREHOUSE_DIR);
        options.setJobName(JOB_NAME);
        options.setMapreduceJobName(MAPREDUCE_JOB);
        options.setJarOutputDir(JAR_OUTPUT_DIR);
        System.out.println("Import Tool running ....");
        ImportTool it = new ImportTool();
        int retVal = it.run(options);
        if (retVal == 0) {
            System.out.println(SUCCESS);
        } else {
            System.out.println(FAIL);
        }
    }
}
When I run the code above, I get the following error. Can you suggest a solution for this as well?
Execution failed while executing command: 192.168.10.172
Error message: bash: 192.168.10.172: command not found
Now wait 5 seconds to begin next task ...
Connection channel disconnect
net.neoremind.sshxcute.core.Result@60c2be20
Command is sqoop import --connect jdbc:mysql://localhost:3316/hadoop --username root --password root --table employees --hive-import -m 1 -- --schema default
Connection channel established successfully
Start to run command
Connection channel closed
Check if exec success or not ...
Execution failed while executing command: sqoop import --connect jdbc:mysql://localhost:3316/hadoop --username root --password root --table employees --hive-import -m 1 -- --schema default
Error message: bash: sqoop: command not found
Now wait 5 seconds to begin next task ...
Connection channel disconnect
SSH connection shutdown
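Both SSH errors above have mundane causes: the first command sent over the channel was just the host IP `192.168.10.172` (so bash tried to execute it as a command), and `sqoop: command not found` occurs because a non-interactive SSH channel (as opened by sshxcute) does not source `~/.bashrc`, so Sqoop's `bin` directory is missing from `PATH`. A minimal sketch of the PATH fix; the install location below is an assumption, not taken from the question:

```shell
# Non-interactive SSH sessions skip ~/.bashrc, so sqoop is not on PATH.
# Prepend Sqoop's bin directory before issuing the command, or invoke
# sqoop by absolute path. The install location here is an assumption.
SQOOP_HOME=/home/master/apps/sqoop-1.4.4
export PATH="$SQOOP_HOME/bin:$PATH"
echo "$PATH" | cut -d: -f1     # first PATH entry is now Sqoop's bin directory
```

Equivalently, send the exported `PATH` as part of the remote command string, e.g. `export PATH=$SQOOP_HOME/bin:$PATH && sqoop import ...`.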
Answer 0 (score: 0)
Since the SqoopOptions approach is deprecated, you can use the following code instead:
public static void importToHive() throws Exception {
    Configuration config = new Configuration();
    config.addResource(new Path("/usr/local/hadoop/conf/core-site.xml"));
    config.addResource(new Path("/usr/local/hadoop/conf/hdfs-site.xml"));
    String[] cmd = {"import", "--connect", <connectionString>, "--username", userName,
        "--password", password, "--hadoop-home", "/usr/local/hadoop", "--table", <tableName>,
        "--hive-import", "--create-hive-table", "--hive-table", <tableName>, "--target-dir",
        "hdfs://localhost:54310/user/hive/warehouse", "-m", "1", "--delete-target-dir"};
    Sqoop.runTool(cmd, config);
}
Please use the correct Hadoop and Hive warehouse path, and the right username and password for MySQL. Also check your NameNode port (54310 in my case) in core-site.xml.
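One way to confirm the NameNode port the answer mentions is to read the `fs.default.name` value out of `core-site.xml`. A small sketch below; the sample file written to `/tmp` stands in for your real `$HADOOP_HOME/conf/core-site.xml`, and its contents are assumed for illustration:

```shell
# Write a sample core-site.xml (stand-in for $HADOOP_HOME/conf/core-site.xml).
cat > /tmp/core-site-sample.xml <<'EOF'
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:54310</value>
  </property>
</configuration>
EOF
# Extract the NameNode URI: find the fs.default.name property and pull its <value>.
grep -A1 'fs.default.name' /tmp/core-site-sample.xml \
  | sed -n 's#.*<value>\(.*\)</value>.*#\1#p'
```

The printed URI (here `hdfs://localhost:54310`) is what the host and port in `--target-dir` must match.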