I need help with a Hive data-loading problem.
Background:
I installed HDFS 1.0.3 and Hive 0.7.1 on RHEL 5.5. All HDFS operations work fine. But when I try to load a Hive table from the hive command line, I get the error below.
I tried loading both a local file and an HDFS file; both fail with the same error, so I suspect I am missing some configuration. Please see the attached screenshot.
I tested the same script on Cloudera and it ran fine.
Code:
hive> describe dept;
OK
deptid int
dname string
Time taken: 3.792 seconds
-- simple hive table
hive> ! cat /user/dept.txt;
Command failed with exit code = 1
cat: /user/dept.txt: No such file or directory
hive> ! hadoop fs -cat /user/dept.txt;
1,IT
2,Finance
3,Sales
-- the file is in HDFS
-- loading the file into the Hive table
hive> LOAD DATA INPATH '/users/dept.txt' overwrite into table DEPT;
FAILED: Hive Internal Error: java.lang.IllegalArgumentException(java.net.URISyntaxException: Relative path in absolute URI:
hdfs://informatica:8020$%7Bbuild.dir%7D/scratchdir/hive_2014-05-12_12-11-29_340_565872632113593986)
java.lang.IllegalArgumentException: java.net.URISyntaxException: Relative path in absolute
URI: hdfs://informatica:8020$%7Bbuild.dir%7D/scratchdir/hive_2014-05-12_12-11-29_340_565872632113593986
at org.apache.hadoop.fs.Path.initialize(Path.java:148)
at org.apache.hadoop.fs.Path.<init>(Path.java:132)
at org.apache.hadoop.hive.ql.Context.getScratchDir(Context.java:142)
at org.apache.hadoop.hive.ql.Context.getExternalScratchDir(Context.java:202)
at org.apache.hadoop.hive.ql.Context.getExternalTmpFileURI(Context.java:294)
at org.apache.hadoop.hive.ql.parse.LoadSemanticAnalyzer.analyzeInternal(LoadSemanticAnalyzer.java:238)
at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:238)
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:340)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:736)
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:164)
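Note that the URI in the trace decodes to hdfs://informatica:8020${build.dir}/scratchdir/…, i.e. Hive's scratch-directory setting still contains an unexpanded ${build.dir} placeholder, which typically points to a hive-default.xml taken from a source build. One hedged sketch of a fix (hive.exec.scratchdir is a real Hive property; the directory value shown is only an example) is to override the scratch dir in hive-site.xml:

```xml
<!-- hive-site.xml: override the scratch dir so it no longer
     contains the unexpanded ${build.dir} placeholder.
     ${user.name} is a Java system property that Hive does resolve. -->
<property>
  <name>hive.exec.scratchdir</name>
  <value>/tmp/hive-${user.name}</value>
</property>
```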
Answer 0 (score: 0)
The error log shows that the HDFS URI is malformed, so check whether the command contains any illegal characters and whether the HDFS URI is correct. I think the command should be:
LOAD DATA INPATH '/user/dept.txt' overwrite into table DEPT;
Answer 1 (score: 0)
Is the extra 's' in the file path a typo? Compare:
hadoop fs -cat /user/dept.txt;
LOAD DATA INPATH '/user**s**/dept.txt' overwrite into table DEPT;
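Putting both answers together, a minimal check-then-load sequence (assuming the file really lives at /user/dept.txt, as the `hadoop fs -cat` output above suggests) would be:

```sql
-- Run inside the hive CLI; `!` executes a shell command.
-- First verify the exact HDFS path:
! hadoop fs -ls /user/dept.txt;

-- Then load with the SAME path (/user, not /users):
LOAD DATA INPATH '/user/dept.txt' OVERWRITE INTO TABLE dept;
```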