I ran into a problem. I created an external Hive table with a wrong HDFS path and then populated data in HDFS. Now I am trying to drop the table and I get the following error:
18/02/15 08:35:02 [HiveServer2-Background-Pool: Thread-54]: ERROR exec.DDLTask: org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:java.lang.IllegalArgumentException: Wrong FS: hdfs://abc:8020/usr/log, expected: hdfs://abc3/usr/log
at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:1084)
at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:1015)
at org.apache.hadoop.hive.ql.exec.DDLTask.dropTable(DDLTask.java:4013)
at org.apache.hadoop.hive.ql.exec.DDLTask.dropTableOrPartitions(DDLTask.java:3869)
at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:339)
at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:89)
at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1745)
at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1491)
at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1289)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1156)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1151)
at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:197)
at org.apache.hive.service.cli.operation.SQLOperation.access$300(SQLOperation.java:76)
at org.apache.hive.service.cli.operation.SQLOperation$2$1.run(SQLOperation.java:253)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
at org.apache.hive.service.cli.operation.SQLOperation$2.run(SQLOperation.java:264)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:748)
Caused by: MetaException(message:java.lang.IllegalArgumentException: Wrong FS
Can anyone suggest how we can drop the table?
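For context, the failing setup can be reproduced roughly as follows (a minimal sketch: the table name logs_ext and its single column are hypothetical; only the LOCATION mirrors the paths from the error):

-- external table created with the old namenode authority in its location
CREATE EXTERNAL TABLE logs_ext (line STRING)
LOCATION 'hdfs://abc:8020/usr/log';

-- later, dropping it fails with "Wrong FS: ... expected: hdfs://abc3/usr/log"
DROP TABLE logs_ext;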
Answer 0 (score: 0)
Basically, you need to update the metastore metadata for the HDFS path you created.
Run this from the Hive Metastore machine.
For example:
/usr/hdp/current/hive/bin/metatool -updateLocation hdfs://nameserviceID/external hdfs://namenode.fqdn:8020/external
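Adapted to the paths in the error above, a sketch could look like the lines below; the first argument to -updateLocation is the new filesystem root and the second is the old one, and running with -dryRun first only reports what would change without persisting anything:

# preview the change, then apply it (paths taken from the error message above)
/usr/hdp/current/hive/bin/metatool -updateLocation hdfs://abc3 hdfs://abc:8020 -dryRun
/usr/hdp/current/hive/bin/metatool -updateLocation hdfs://abc3 hdfs://abc:8020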
To verify, run:
/usr/hdp/version-number/hive/bin/metatool -listFSRoot
Make sure the new path is listed.
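If only this single table is affected, a lighter-weight alternative sketch (assuming a table named my_table; behaviour may vary by Hive version) is to repoint the table at the correct filesystem and then drop it:

-- point the external table at the expected filesystem, then drop it
ALTER TABLE my_table SET LOCATION 'hdfs://abc3/usr/log';
DROP TABLE my_table;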
There are some other solutions here: java.lang.IllegalArgumentException: Wrong FS: , expected: hdfs://localhost:9000