Oozie sqoop action fails to import

Date: 2016-12-04 18:55:19

Tags: hadoop hive sqoop oozie bigdata

I am having trouble running an Oozie sqoop action. In the logs I can see that Sqoop imports the data into a temporary directory and then generates a Hive script to load it.

It is the load into Hive that fails.

Below is the sqoop action I am using:

<action name="import" retry-max="2" retry-interval="5">
    <sqoop xmlns="uri:oozie:sqoop-action:0.2">
      <job-tracker>${jobTracker}</job-tracker>
      <name-node>${nameNode}</name-node>
      <configuration>
        <property>
          <name>mapred.job.queue.name</name>
          <value>${jobQueue}</value>
        </property>
      </configuration>
      <arg>import</arg>
      <arg>-D</arg>
      <arg>sqoop.mapred.auto.progress.max=300000</arg>
      <arg>-D</arg>
      <arg>map.retry.exponentialBackOff=TRUE</arg>
      <arg>-D</arg>
      <arg>map.retry.numRetries=3</arg>
      <arg>--options-file</arg>
      <arg>${odsparamFileName}</arg>
      <arg>--table</arg>
      <arg>${odsTableName}</arg>
      <arg>--where</arg>
      <arg>${ods_data_pull_column} BETWEEN TO_DATE(${wf:actionData('getDates')['prevMonthBegin']},'YYYY-MM-DD hh24:mi:ss') AND TO_DATE(${wf:actionData('prevMonthEnd')['endDate']},'YYYY-MM-DD hh24:mi:ss')</arg>
      <arg>--hive-import</arg>
      <arg>--hive-overwrite</arg>
      <arg>--hive-table</arg>
      <arg>${stgTable}</arg>
      <arg>--hive-drop-import-delims</arg>
      <arg>--warehouse-dir</arg>
      <arg>${sqoopStgDir}</arg>
      <arg>--delete-target-dir</arg>
      <arg>--null-string</arg>
      <arg>\\N</arg>
      <arg>--null-non-string</arg>
      <arg>\\N</arg>
      <arg>--compress</arg>
      <arg>--compression-codec</arg>
      <arg>gzip</arg>
      <arg>--num-mappers</arg>
      <arg>1</arg>
      <arg>--verbose</arg>
      <file>${odsSqoopConnectionParamsFileLocation}</file>
    </sqoop>
    <ok to="rev"/>
    <error to="fail"/>
  </action>

Below is the error I get in the mapred logs:

20078 [main] DEBUG org.apache.sqoop.mapreduce.db.DataDrivenDBInputFormat  - Creating input split with lower bound '1=1' and upper bound '1=1'
Heart beat
Heart beat
Heart beat
Heart beat
151160 [main] INFO  org.apache.sqoop.mapreduce.ImportJobBase  - Transferred 0 bytes in 135.345 seconds (0 bytes/sec)
151164 [main] INFO  org.apache.sqoop.mapreduce.ImportJobBase  - Retrieved 0 records.
151164 [main] ERROR org.apache.sqoop.tool.ImportTool  - Error during import: Import job failed!
Intercepting System.exit(1)

<<< Invocation of Main class completed <<<

Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.SqoopMain], exit code [1]

Oozie Launcher failed, finishing Hadoop job gracefully

Please advise.

1 answer:

Answer 0 (score: 0):

You can import the table into an HDFS path with --target-dir and set the Hive table's location to point to that path. I fixed it with this approach; hope it helps you too.
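
For illustration, a rough sketch of that approach is below. The directory, database, and table names are placeholders I made up, not values from the original workflow; adapt them to your environment. The Sqoop arguments write plain files to HDFS instead of using --hive-import:

      <!-- Sqoop arguments: write to a plain HDFS directory instead of --hive-import -->
      <arg>import</arg>
      <arg>--options-file</arg>
      <arg>${odsparamFileName}</arg>
      <arg>--table</arg>
      <arg>${odsTableName}</arg>
      <arg>--target-dir</arg>
      <arg>/data/staging/${odsTableName}</arg>
      <arg>--delete-target-dir</arg>
      <arg>--null-string</arg>
      <arg>\\N</arg>
      <arg>--null-non-string</arg>
      <arg>\\N</arg>
      <arg>--num-mappers</arg>
      <arg>1</arg>

Then point the Hive table at that directory (again, names are placeholders):

      -- Hive: have the staging table read directly from the Sqoop output directory
      ALTER TABLE stg_db.stg_table SET LOCATION '/data/staging/ods_table';

This keeps the two steps decoupled: Sqoop only moves files into HDFS, and Hive reads whatever sits at that location, so the fragile Hive-load step that Sqoop generates is avoided entirely.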