When I run the following sqoop import job, it works perfectly fine:
sqoop import -libjars ${JARS} --driver ${DRIVER} \
--connect ${URL} -m 1 --hive-overwrite --hive-import \
--hive-database ${Database} --hive-table Table \
--target-dir '/tmp/Table' --as-parquetfile \
--query "select cl1, c12, c13 from sourceSchema.sourceTable WHERE 1=1 AND \$CONDITIONS"
But when I try to create a sqoop job for the same import, it fails with an error while parsing the arguments.

Creating the sqoop job:
sqoop job --create SomeJobName -- import -libjars ${JARS} \
--driver ${DRIVER} --connect ${URL} -m 1 \
--hive-overwrite --hive-import \
--hive-database ${Database} \
--hive-table Table --target-dir '/tmp/Table' --as-parquetfile \
--query "select cl1, c12, c13 from sourceSchema.sourceTable WHERE 1=1 AND \$CONDITIONS"
Here is the error I get:
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
18/02/15 10:55:56 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6-cdh5.12.1
18/02/15 10:55:57 ERROR tool.BaseSqoopTool: Error parsing arguments for import:
18/02/15 10:55:57 ERROR tool.BaseSqoopTool: Unrecognized argument: -libjars
Answer 0 (score: 0)
I was able to fix the problem by passing -libjars as a generic argument, i.e. placing it before --create rather than among the import tool's arguments. The sqoop docs show the expected ordering:

sqoop job (generic-args) (job-args) [-- [subtool-name] (subtool-args)]
sqoop job -libjars /var/lib/sqoop/some.jar,/var/lib/sqoop/some.jar --create SomeJobName -- import \
--driver ${DRIVER} --connect ${URL} -m 1 \
--hive-overwrite --hive-import \
--hive-database ${Database} \
--hive-table Table --target-dir '/tmp/Table' --as-parquetfile \
--query "select cl1, c12, c13 from sourceSchema.sourceTable WHERE 1=1 AND \$CONDITIONS"
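Once the job is created, the standard sqoop job subcommands can be used to verify and run it. A minimal sketch (assumes the saved job SomeJobName from above and a configured cluster):

```shell
# Confirm the saved job exists
sqoop job --list

# Inspect the stored definition and its parameters
sqoop job --show SomeJobName

# Execute the saved import
sqoop job --exec SomeJobName

# Remove the saved job when it is no longer needed
sqoop job --delete SomeJobName
```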