I have been trying to import data from a MySQL table into HBase with Sqoop, driven by an Oozie workflow.
My Sqoop import command:
sqoop import --connect jdbc:mysql://some_ip/myDatabase
--username User --password-file /etc/User.password
--hbase-create-table --table mytbl
--columns "column1,column2,column3,column4"
--hbase-table mynamespace:mytbl --column-family default
--hbase-row-key column1 -m 1 --incremental lastmodified
--check-column column4
When run from the command line, it works like a charm. However, when I try to run the same import from an Oozie workflow, defined as follows:
<action name="mytbl">
<sqoop xmlns="uri:oozie:sqoop-action:0.2">
<job-tracker>${jobTracker}</job-tracker>
<name-node>${nameNode}</name-node>
<command>import --connect jdbc:mysql://some_ip/myDatabase
--username User --password-file /etc/User.password
--hbase-create-table --table mytbl
--columns "column1,column2,column3,column4"
--hbase-table mynamespace:mytbl --column-family default
--hbase-row-key column1 -m 1 --incremental lastmodified
--check-column column4
</command>
<file>/oozie-lib/hbase-client-1.1.3.jar#hbase-client-1.1.3.jar</file>
<file>/oozie-lib/hbase-protocol-1.1.3.jar#hbase-protocol-1.1.3.jar</file>
<file>/oozie-lib/hbase-server-1.1.3.jar#hbase-server-1.1.3.jar</file>
<file>/oozie-lib/hbase-hadoop2-compat-1.1.3.jar#hbase-hadoop2-compat-1.1.3.jar</file>
<file>/oozie-lib/hbase-hadoop-compat-1.1.3.jar#hbase-hadoop-compat-1.1.3.jar</file>
<file>/oozie-lib/mysql-connector-java-5.1.40-bin.jar#mysql-connector-java-5.1.40-bin.jar</file>
</sqoop>
<ok to="End"/>
<error to="notify"/>
</action>
The jar paths referenced in the <file> elements resolve fine in HDFS, as does the 'notify' action referenced by 'error to'.
When the Oozie workflow executes, I get the following exception:
[main] ERROR org.apache.sqoop.tool.ImportTool - Encountered IOException running import job: org.apache.hadoop.hbase.client.RetriesExhaustedException: Can't get the locations
at org.apache.hadoop.hbase.client.RpcRetryingCallerWithReadReplicas.getRegionLocations(RpcRetryingCallerWithReadReplicas.java:312)
at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:155)
at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:59)
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithoutRetries(RpcRetryingCaller.java:200)
at org.apache.hadoop.hbase.client.ClientScanner.call(ClientScanner.java:320)
at org.apache.hadoop.hbase.client.ClientScanner.nextScanner(ClientScanner.java:295)
at org.apache.hadoop.hbase.client.ClientScanner.initializeScannerInConstruction(ClientScanner.java:160)
at org.apache.hadoop.hbase.client.ClientScanner.<init>(ClientScanner.java:155)
at org.apache.hadoop.hbase.client.HTable.getScanner(HTable.java:821)
at org.apache.hadoop.hbase.MetaTableAccessor.fullScan(MetaTableAccessor.java:602)
at org.apache.hadoop.hbase.MetaTableAccessor.tableExists(MetaTableAccessor.java:366)
at org.apache.hadoop.hbase.client.HBaseAdmin.tableExists(HBaseAdmin.java:392)
at org.apache.hadoop.hbase.client.HBaseAdmin.tableExists(HBaseAdmin.java:402)
at org.apache.sqoop.mapreduce.HBaseImportJob.jobSetup(HBaseImportJob.java:217)
at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:271)
at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:692)
at org.apache.sqoop.manager.MySQLManager.importTable(MySQLManager.java:127)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:507)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:615)
at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
I tried changing the user to 'hdfs' with the '-doas' parameter when submitting the Oozie job, but no luck.
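For reference, the submit command looked roughly like this (the Oozie URL and the properties file name are placeholders for my actual values):
oozie job -oozie http://oozie-host:11000/oozie -config job.properties -run -doas hdfs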
Any help is much appreciated. Thanks.
EDIT: I tried adding
<job-xml>${hbaseXml}</job-xml>
with the correct path to hbase-site.xml, but then I started getting the following exception:
ERROR org.apache.sqoop.tool.ImportTool - Encountered IOException running import job: org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=36, exceptions:
Thu May 25 18:10:43 IST 2017, null, java.net.SocketTimeoutException: callTimeout=60000, callDuration=68220: row 'mynamespace:mytbl,,' on table 'hbase:meta' at region=hbase:meta,,1.1588230740, hostname=myJobTracker,60020,1495464595462, seqNum=0
at org.apache.hadoop.hbase.client.RpcRetryingCallerWithReadReplicas.throwEnrichedException(RpcRetryingCallerWithReadReplicas.java:286)
at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:231)
at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:61)
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithoutRetries(RpcRetryingCaller.java:200)
at org.apache.hadoop.hbase.client.ClientScanner.call(ClientScanner.java:320)
at org.apache.hadoop.hbase.client.ClientScanner.nextScanner(ClientScanner.java:295)
at org.apache.hadoop.hbase.client.ClientScanner.initializeScannerInConstruction(ClientScanner.java:160)
at org.apache.hadoop.hbase.client.ClientScanner.<init>(ClientScanner.java:155)
at org.apache.hadoop.hbase.client.HTable.getScanner(HTable.java:867)
at org.apache.hadoop.hbase.MetaTableAccessor.fullScan(MetaTableAccessor.java:602)
at org.apache.hadoop.hbase.MetaTableAccessor.tableExists(MetaTableAccessor.java:366)
at org.apache.hadoop.hbase.client.HBaseAdmin.tableExists(HBaseAdmin.java:410)
at org.apache.hadoop.hbase.client.HBaseAdmin.tableExists(HBaseAdmin.java:420)
at org.apache.sqoop.mapreduce.HBaseImportJob.jobSetup(HBaseImportJob.java:217)
at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:271)
at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:692)
at org.apache.sqoop.manager.MySQLManager.importTable(MySQLManager.java:127)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:507)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:615)
at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
at org.apache.oozie.action.hadoop.SqoopMain.runSqoopJob(SqoopMain.java:193)
at org.apache.oozie.action.hadoop.SqoopMain.run(SqoopMain.java:176)
at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:56)
at org.apache.oozie.action.hadoop.SqoopMain.main(SqoopMain.java:48)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:231)
at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1714)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Caused by: java.net.SocketTimeoutException: callTimeout=60000, callDuration=68220: row 'mynamespace:mytbl,,' on table 'hbase:meta' at region=hbase:meta,,1.1588230740, hostname=myJobtracker,60020,1495464595462, seqNum=0
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:159)
at org.apache.hadoop.hbase.client.ResultBoundedCompletionService$QueueingFuture.run(ResultBoundedCompletionService.java:80)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.IOException: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: com/yammer/metrics/core/Gauge
at org.apache.hadoop.hbase.protobuf.ProtobufUtil.getRemoteException(ProtobufUtil.java:329)
at org.apache.hadoop.hbase.client.ScannerCallable.openScanner(ScannerCallable.java:402)
at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:203)
at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:64)
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithoutRetries(RpcRetryingCaller.java:200)
at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas$RetryingRPC.call(ScannerCallableWithReplicas.java:381)
at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas$RetryingRPC.call(ScannerCallableWithReplicas.java:355)
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:126)
... 4 more
Caused by: com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: com/yammer/metrics/core/Gauge
at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:240)
at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:336)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$BlockingStub.scan(ClientProtos.java:34094)
at org.apache.hadoop.hbase.client.ScannerCallable.openScanner(ScannerCallable.java:394)
... 10 more
Caused by: java.lang.NoClassDefFoundError: com/yammer/metrics/core/Gauge
at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:225)
... 13 more
Caused by: java.lang.ClassNotFoundException: com.yammer.metrics.core.Gauge
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
... 14 more
I also tried adding metrics-core-3.0.2.jar to the list of additional jars, but I still get the same exception.
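For reference, the relevant part of the action now looks roughly like this; the /oozie-lib path for metrics-core mirrors the other jar entries and is shown only for illustration:
<sqoop xmlns="uri:oozie:sqoop-action:0.2">
    <job-tracker>${jobTracker}</job-tracker>
    <name-node>${nameNode}</name-node>
    <job-xml>${hbaseXml}</job-xml>
    <command>import ... (same import command as above) ...</command>
    <file>/oozie-lib/metrics-core-3.0.2.jar#metrics-core-3.0.2.jar</file>
    <!-- the HBase and MySQL connector <file> entries are unchanged from the workflow above -->
</sqoop>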