I created an Oozie Sqoop job to import data from MySQL into Hive. The cluster has one namenode and three datanodes, with Hive, Oozie, and Sqoop also installed on the namenode.
The sqoop import command works when tested via the CLI on the namenode, but every time I run it as an Oozie Sqoop job it fails. The detailed error is below.
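For reference, a minimal Sqoop action of the kind described here would look roughly like the following; this is a sketch only, and the connection string, credentials, paths, and mapper count are illustrative placeholders rather than the actual job definition (UserRole is the table name that appears in the log):

<workflow-app name="mysql-to-hive" xmlns="uri:oozie:workflow:0.5">
    <start to="sqoop-import"/>
    <action name="sqoop-import">
        <sqoop xmlns="uri:oozie:sqoop-action:0.4">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <!-- single import command; --hive-import makes Sqoop exec the local "hive" binary after the MapReduce import -->
            <command>import --connect jdbc:mysql://namenode-host:3306/mydb --username myuser --password-file /user/oozie/mysql.password --table UserRole --hive-import --hive-table userrole -m 4</command>
        </sqoop>
        <ok to="end"/>
        <error to="fail"/>
    </action>
    <kill name="fail">
        <message>Sqoop import failed: ${wf:errorMessage(wf:lastErrorNode())}</message>
    </kill>
    <end name="end"/>
</workflow-app>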
2017-08-11 11:27:40,787 [uber-SubtaskRunner] INFO org.apache.hadoop.mapreduce.Job - map 0% reduce 0%
2017-08-11 11:27:44,833 [uber-SubtaskRunner] INFO org.apache.hadoop.mapreduce.Job - map 25% reduce 0%
2017-08-11 11:27:45,837 [uber-SubtaskRunner] INFO org.apache.hadoop.mapreduce.Job - map 75% reduce 0%
2017-08-11 11:27:46,844 [uber-SubtaskRunner] INFO org.apache.hadoop.mapreduce.Job - map 100% reduce 0%
2017-08-11 11:27:46,856 [uber-SubtaskRunner] INFO org.apache.hadoop.mapreduce.Job - Job job_1502360348741_0041 completed successfully
...
2017-08-11 11:27:46,932 [uber-SubtaskRunner] INFO org.apache.sqoop.mapreduce.ImportJobBase - Transferred 625 bytes in 12.0595 seconds (51.8263 bytes/sec)
2017-08-11 11:27:46,936 [uber-SubtaskRunner] INFO org.apache.sqoop.mapreduce.ImportJobBase - Retrieved 14 records.
2017-08-11 11:27:46,951 [uber-SubtaskRunner] INFO org.apache.sqoop.manager.SqlManager - Executing SQL statement: SELECT t.* FROM UserRole AS t WHERE 1=0
2017-08-11 11:27:46,952 [uber-SubtaskRunner] INFO org.apache.sqoop.manager.SqlManager - Executing SQL statement: SELECT t.* FROM UserRole AS t WHERE 1=0
2017-08-11 11:27:46,953 [uber-SubtaskRunner] WARN org.apache.sqoop.hive.TableDefWriter - Column updatedDate had to be cast to a less precise type in Hive
2017-08-11 11:27:46,960 [uber-SubtaskRunner] INFO org.apache.sqoop.hive.HiveImport - Loading uploaded data into Hive
2017-08-11 11:27:46,963 [uber-SubtaskRunner] ERROR org.apache.sqoop.tool.ImportTool - Encountered IOException running import job: java.io.IOException: Cannot run program "hive": error=2,
Here are my thoughts:
Should I install Hive on every datanode in the cluster? Or is there some configuration change that would fix this (for example, restructuring the workflow as sketched below)?
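To make the second option concrete, one commonly suggested restructuring is to drop --hive-import, land the data in an HDFS staging directory with the Sqoop action, and then load it with a separate Hive action, which runs Hive through the Oozie sharelib instead of the local hive executable. This is only a sketch, assuming the Oozie Hive sharelib is enabled and a hive-site.xml is packaged alongside workflow.xml; the staging path, script name, and table name are placeholders:

<action name="sqoop-to-hdfs">
    <sqoop xmlns="uri:oozie:sqoop-action:0.4">
        <job-tracker>${jobTracker}</job-tracker>
        <name-node>${nameNode}</name-node>
        <prepare>
            <!-- clear the staging dir so --target-dir does not fail on re-runs -->
            <delete path="${nameNode}/user/hadoop/staging/UserRole"/>
        </prepare>
        <!-- plain HDFS import: no --hive-import, so the launcher never execs "hive" -->
        <command>import --connect jdbc:mysql://namenode-host:3306/mydb --username myuser --password-file /user/oozie/mysql.password --table UserRole --target-dir /user/hadoop/staging/UserRole -m 4</command>
    </sqoop>
    <ok to="load-into-hive"/>
    <error to="fail"/>
</action>
<action name="load-into-hive">
    <hive xmlns="uri:oozie:hive-action:0.2">
        <job-tracker>${jobTracker}</job-tracker>
        <name-node>${nameNode}</name-node>
        <!-- hive-site.xml bundled next to workflow.xml so the action can reach the metastore -->
        <job-xml>hive-site.xml</job-xml>
        <script>load_userrole.hql</script>
        <param>INPUT=/user/hadoop/staging/UserRole</param>
    </hive>
    <ok to="end"/>
    <error to="fail"/>
</action>

Here load_userrole.hql would contain a single statement such as LOAD DATA INPATH '${INPUT}' INTO TABLE userrole;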
Answer 0 (score: 0)