Cannot import data from Oracle into Accumulo using sqoop2

Date: 2015-07-27 06:14:07

Tags: hadoop sqoop accumulo sqoop2

I am trying to import data from Oracle into Accumulo using the following command.

sqoop import \
  --connect jdbc:oracle:thin:hr/hr1234@bhucloud05.ad.abcsoftware.com:1521 \
  --username hr --password hr1234 \
  --accumulo-user kaar --accumulo-password password \
  --accumulo-instance bhucloud05.ad.abcsoftware.com \
  --accumulo-zookeepers bhucloud05.ad.abcsoftware.com:2181 \
  --table employi --accumulo-table employi \
  --accumulo-column-family col1 \
  --columns eid,ename,comp \
  --accumulo-row-key eid \
  --accumulo-create-table

But I am getting the following error:

find: paths must precede expression: Compression.jar
Usage: find [-H] [-L] [-P] [-Olevel] [-D help|tree|search|stat|rates|opt|exec] [path...] [expression]
15/07/27 11:26:08 INFO sqoop.Sqoop: Running Sqoop version: 1.4.5-cdh5.2.0
15/07/27 11:26:08 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
15/07/27 11:26:08 INFO oracle.OraOopManagerFactory: Data Connector for Oracle and Hadoop is disabled.
15/07/27 11:26:08 INFO manager.SqlManager: Using default fetchSize of 1000
15/07/27 11:26:08 INFO tool.CodeGenTool: Beginning code generation
15/07/27 11:26:09 INFO manager.OracleManager: Time zone has been set to GMT
15/07/27 11:26:09 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM employi t WHERE 1=0
15/07/27 11:26:09 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /hadoop/CDH_5.1.2_Linux_parcel/parcels/CDH/lib/hadoop-mapreduce
Note: /tmp/sqoop-hadoop/compile/c4c0bba34136199e18ce69f0e0ae9428/employi.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
15/07/27 11:26:11 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hadoop/compile/c4c0bba34136199e18ce69f0e0ae9428/employi.jar
15/07/27 11:26:11 ERROR tool.ImportTool: Error during import: Accumulo jars are not present in classpath, cannot import to Accumulo!

Am I missing any configuration? Could you help me?

2 Answers:

Answer 0 (score: 0)

The error says "Accumulo jars are not present in classpath, cannot import to Accumulo!"

Try adding the jdbc jar to the Hadoop classpath: `export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:(location of the jdbc jar)`
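Since the error complains specifically about Accumulo jars, you would typically need to append every jar in a lib directory, not just one driver jar. A minimal sketch of building such a classpath (the directory and jar name below are placeholders for the demo; point `LIB_DIR` at your real Accumulo or JDBC lib directory):

```shell
#!/bin/sh
# Placeholder lib directory with a stand-in jar, purely for illustration.
LIB_DIR="/tmp/demo-lib"
mkdir -p "$LIB_DIR"
touch "$LIB_DIR/accumulo-core-1.6.0.jar"

# Append every jar in LIB_DIR to HADOOP_CLASSPATH, colon-separated.
for jar in "$LIB_DIR"/*.jar; do
  HADOOP_CLASSPATH="${HADOOP_CLASSPATH:+$HADOOP_CLASSPATH:}$jar"
done
export HADOOP_CLASSPATH
echo "$HADOOP_CLASSPATH"
```

Exporting the variable before running `sqoop import` makes the jars visible to Sqoop's launcher shell.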

Answer 1 (score: 0)

Make sure that you have ACCUMULO_HOME set correctly as an environment variable or the Java system variable accumulo.home.

Taking a guess, the Accumulo jars were never added to the distributed cache. This should only happen if you don't provide the location where Accumulo is installed, or you provide an incorrect one. All jar files located at ${ACCUMULO_HOME}/lib should be added to the classpath automatically.
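A quick sanity check before re-running the import is to confirm that `ACCUMULO_HOME/lib` actually contains jars. A sketch, using a placeholder path and stand-in jar (substitute your real install location):

```shell
#!/bin/sh
# Placeholder install layout, purely for illustration.
ACCUMULO_HOME="/tmp/demo-accumulo"
mkdir -p "$ACCUMULO_HOME/lib"
touch "$ACCUMULO_HOME/lib/accumulo-core.jar"
export ACCUMULO_HOME

# Sqoop expects to find Accumulo client jars under $ACCUMULO_HOME/lib.
if ls "$ACCUMULO_HOME"/lib/*.jar >/dev/null 2>&1; then
  echo "Accumulo jars found under $ACCUMULO_HOME/lib"
else
  echo "No jars under $ACCUMULO_HOME/lib - Sqoop will fail" >&2
fi
```

If the directory is empty or the variable points at the wrong place, you will hit the "Accumulo jars are not present in classpath" error seen above.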