Unable to drop a mistakenly created Hive table

Time: 2016-04-05 15:09:21

Tags: hadoop hive hdfs

While using Hive, I accidentally submitted a CREATE TABLE statement like this:

hive> create external table hivetable(hiveid INT, hivename string) location '/';

The table was created, and now I can't drop it - if I SELECT * from the table, I get a flood of garbled data - for example:

NULL    �rg.4ӣ�7�-�j��>�9Wϙ�U�m�eO�0���Un=7=W
NULL    NULL
NULL    NULL
NULL    NULL
NULL    NULL
NULL    �s�2�+�q��GVZ�ur�*Y�M�s���G��R0
NULL    c��ㇷ'o~8����e]�ay>ѫjCE5
NULL    �S�&�M�I��S���"��p�
NULL    NULL
NULL    G2I|=i
NULL    �Q?y����:R�T���QA6X�A+pI}�C?\��0Ek[�h�v�6�G�onය��^�;s4��
NULL    �Og�oߟ���d|'���h�j7!��E;�o��    ���G���\�\��p��tm��Gwx��e����c�
                                                                       nñ*8�����U�&|���B��ց�6��%��Ŵ⮶��)H�obN�3�?ܵ�$N�T��$2��p�$�f��hU��ڶ��Lg�!԰x)۔2/���RҭN�D����L�[��ٻ��˳�Ň���<roi�W������jB���ס1L0&���g��0�}��TV`c�.����Q�9n���*�D ����r"~0]5�
NULL    NULL
NULL    NULL
NULL    NULL
NULL    NULL
NULL    ����_u�:g1^��}T�U�'����
                               ����b���Ӳ�U9�fC�1�nI��H��
                                                        ����(

I think I've created a table that is trying to read the entire HDFS filesystem directory structure - but I can't find any data for the table in HDFS, and although the table shows up in Hive's default database, I can't drop it. So far I haven't noticed any impact, but I'd like to remove the table record and make sure everything is cleaned up. Any ideas?
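Because the table was declared EXTERNAL, DROP TABLE should only remove the metastore entry and leave the HDFS files alone. One workaround sometimes used when DROP TABLE fails with a metastore NullPointerException is to first repoint the table at an empty scratch directory and then drop it. A hedged sketch - the scratch path is an assumption, not taken from the original session:

```sql
-- Repoint the table away from '/' to an empty scratch directory
-- (the path here is hypothetical), then drop it. Since the table
-- is EXTERNAL, DROP TABLE removes only the metastore entry, not files.
ALTER TABLE default.hivetable SET LOCATION 'hdfs://sandbox.hortonworks.com:8020/tmp/hivetable_scratch';
DROP TABLE default.hivetable;
```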

Thanks, Arthur

Edit:

The error when trying to drop the table:

hive> drop table hivetable;
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:java.lang.NullPointerException)
hive> 

I have since made things worse with the following (LOAD DATA ... OVERWRITE first clears out the table's location, which in this case is the HDFS root):

hive> LOAD DATA LOCAL INPATH '/root/arthurs_stuff/empty.txt' OVERWRITE INTO TABLE hivetable;
Loading data to table default.hivetable
Moved: 'hdfs://sandbox.hortonworks.com:8020/app-logs' to trash at: hdfs://sandbox.hortonworks.com:8020/user/root/.Trash/Current
Moved: 'hdfs://sandbox.hortonworks.com:8020/apps' to trash at: hdfs://sandbox.hortonworks.com:8020/user/root/.Trash/Current
Moved: 'hdfs://sandbox.hortonworks.com:8020/ats' to trash at: hdfs://sandbox.hortonworks.com:8020/user/root/.Trash/Current
Moved: 'hdfs://sandbox.hortonworks.com:8020/demo' to trash at: hdfs://sandbox.hortonworks.com:8020/user/root/.Trash/Current
Moved: 'hdfs://sandbox.hortonworks.com:8020/flume' to trash at: hdfs://sandbox.hortonworks.com:8020/user/root/.Trash/Current
Moved: 'hdfs://sandbox.hortonworks.com:8020/hdp' to trash at: hdfs://sandbox.hortonworks.com:8020/user/root/.Trash/Current
Moved: 'hdfs://sandbox.hortonworks.com:8020/mapred' to trash at: hdfs://sandbox.hortonworks.com:8020/user/root/.Trash/Current
Moved: 'hdfs://sandbox.hortonworks.com:8020/mr-history' to trash at: hdfs://sandbox.hortonworks.com:8020/user/root/.Trash/Current
Moved: 'hdfs://sandbox.hortonworks.com:8020/ranger' to trash at: hdfs://sandbox.hortonworks.com:8020/user/root/.Trash/Current
Moved: 'hdfs://sandbox.hortonworks.com:8020/root' to trash at: hdfs://sandbox.hortonworks.com:8020/user/root/.Trash/Current
Moved: 'hdfs://sandbox.hortonworks.com:8020/spark-history' to trash at: hdfs://sandbox.hortonworks.com:8020/user/root/.Trash/Current
Moved: 'hdfs://sandbox.hortonworks.com:8020/tmp' to trash at: hdfs://sandbox.hortonworks.com:8020/user/root/.Trash/Current
Table default.hivetable stats: [numFiles=856, numRows=0, totalSize=574803295, rawDataSize=0]
OK

Followed by:

hadoop fs -mv /user/root/.Trash/Current/* /

Fortunately, this is just a sandbox VM used for research :)

Maybe I should look at reformatting HDFS or something?
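Reformatting HDFS shouldn't be necessary just to get rid of a metastore entry. If DROP TABLE keeps throwing the NullPointerException, a last-resort option is to delete the table's row directly in the metastore's backing database (Derby or MySQL, depending on the setup). This is a hedged sketch assuming the standard metastore schema; back up the metastore database first, and note that dependent rows (in SDS, COLUMNS_V2, etc.) may also need cleanup, with exact table names varying by Hive version:

```sql
-- Last-resort metastore surgery: remove the table's row directly.
-- Assumes the standard Hive metastore schema; take a backup first,
-- and expect orphaned rows in dependent tables unless cleaned up too.
DELETE FROM TBLS WHERE TBL_NAME = 'hivetable';
```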

0 Answers:

No answers yet