Trying to connect

Date: 2017-02-03 18:52:31

Tags: hadoop sqoop

I am trying to run the following Sqoop command:

sqoop import --connect jdbc:mysql://localhost:3306/sunil_sqoop --table sqoop_emp --username root  --password 225dvrdlr)

However, I get this error:


17/02/04 00:04:53 WARN security.UserGroupInformation: PriviledgedActionException as:avinash (auth:SIMPLE) cause:java.io.FileNotFoundException: File does not exist: hdfs://localhost:9000/home/avinash/sqoop-1.4.6.bin__hadoop-2.0.4-alpha/lib/slf4j-api-1.6.1.jar
17/02/04 00:04:53 ERROR tool.ImportTool: Encountered IOException running import job: java.io.FileNotFoundException: File does not exist: hdfs://localhost:9000/home/avinash/sqoop-1.4.6.bin__hadoop-2.0.4-alpha/lib/slf4j-api-1.6.1.jar
    at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1093)
    at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1085)
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1085)
    at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.getFileStatus(ClientDistributedCacheManager.java:288)
    at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.getFileStatus(ClientDistributedCacheManager.java:224)
    at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineTimestamps(ClientDistributedCacheManager.java:93)
    at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineTimestampsAndCacheVisibilities(ClientDistributedCacheManager.java:57)
    at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:267)
    at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:388)
    at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:481)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1295)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1292)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1642)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:1292)
    at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1313)
    at org.apache.sqoop.mapreduce.ImportJobBase.doSubmitJob(ImportJobBase.java:196)
    at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:169)
    at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:266)
    at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:673)
    at org.apache.sqoop.manager.MySQLManager.importTable(MySQLManager.java:118)
    at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:497)
    at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
    at org.apache.sqoop.Sqoop.main(Sqoop.java:236)

What should I do?

1 answer:

Answer 0: (score: 0)

The error:

File does not exist: hdfs://localhost:9000/home/avinash/sqoop-1.4.6.bin__hadoop-2.0.4-alpha/lib/slf4j-api-1.6.1.jar 

You should copy the file slf4j-api-1.6.1.jar to this directory in HDFS:

home/avinash/sqoop-1.4.6.bin__hadoop-2.0.4-alpha/lib/. 
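For example, with the HDFS CLI. This is a sketch: it assumes the jar also exists at the same path on the local filesystem (as the Sqoop install path in the error suggests); adjust the paths to your environment.

```shell
# Create the target directory in HDFS (path taken from the error message)
hdfs dfs -mkdir -p /home/avinash/sqoop-1.4.6.bin__hadoop-2.0.4-alpha/lib

# Copy the jar from the local Sqoop lib directory into HDFS
# (assumes Sqoop is installed at this local path)
hdfs dfs -put /home/avinash/sqoop-1.4.6.bin__hadoop-2.0.4-alpha/lib/slf4j-api-1.6.1.jar \
    /home/avinash/sqoop-1.4.6.bin__hadoop-2.0.4-alpha/lib/
```

After the copy, rerun the same `sqoop import` command.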

Or you can copy this jar to the Oozie sharelib.