Cannot create a Hive table from a CSV file with Presto

Asked: 2019-06-17 10:32:23

Tags: sql csv amazon-s3 hive presto

I want to create a Hive table using Presto, with the data stored in a CSV file on S3.

I have uploaded the file to S3 and I am sure that Presto is able to connect to the bucket.

Now, when I issue the create table command and then query the table, all values (rows) come back NULL.

I tried researching similar questions, but it turns out Presto is not that well covered on Stack Overflow.

Some lines from the file are:

PassengerId,Survived,Pclass,Name,Sex,Age,SibSp,Parch,Ticket,Fare,Cabin,Embarked
1,0,3,"Braund, Mr. Owen Harris",male,22,1,0,A/5 21171,7.25,,S
2,1,1,"Cumings, Mrs. John Bradley (Florence Briggs Thayer)",female,38,1,0,PC 17599,71.2833,C85,C
3,1,3,"Heikkinen, Miss. Laina",female,26,0,0,STON/O2. 3101282,7.925,,S
4,1,1,"Futrelle, Mrs. Jacques Heath (Lily May Peel)",female,35,1,0,113803,53.1,C123,S
5,0,3,"Allen, Mr. William Henry",male,35,0,0,373450,8.05,,S
6,0,3,"Moran, Mr. James",male,,0,0,330877,8.4583,,Q
7,0,1,"McCarthy, Mr. Timothy J",male,54,0,0,17463,51.8625,E46,S
8,0,3,"Palsson, Master. Gosta Leonard",male,2,3,1,349909,21.075,,S
9,1,3,"Johnson, Mrs. Oscar W (Elisabeth Vilhelmina Berg)",female,27,0,2,347742,11.1333,,S
10,1,2,"Nasser, Mrs. Nicholas (Adele Achem)",female,14,1,0,237736,30.0708,,C
11,1,3,"Sandstrom, Miss. Marguerite Rut",female,4,1,1,PP 9549,16.7,G6,S
12,1,1,"Bonnell, Miss. Elizabeth",female,58,0,0,113783,26.55,C103,S
13,0,3,"Saundercock, Mr. William Henry",male,20,0,0,A/5. 2151,8.05,,S
14,0,3,"Andersson, Mr. Anders Johan",male,39,1,5,347082,31.275,,S
15,0,3,"Vestrom, Miss. Hulda Amanda Adolfina",female,14,0,0,350406,7.8542,,S
16,1,2,"Hewlett, Mrs. (Mary D Kingcome) ",female,55,0,0,248706,16,,S
17,0,3,"Rice, Master. Eugene",male,2,4,1,382652,29.125,,Q
18,1,2,"Williams, Mr. Charles Eugene",male,,0,0,244373,13,,S
19,0,3,"Vander Planke, Mrs. Julius (Emelia Maria Vandemoortele)",female,31,1,0,345763,18,,S
20,1,3,"Masselmani, Mrs. Fatima",female,,0,0,2649,7.225,,C

My CSV file is here (take train.csv from there). So my Presto command is:

create table testing_nan_4 (
    PassengerId integer,
    Survived integer,
    Pclass integer,
    Name varchar,
    Sex varchar,
    Age integer,
    SibSp integer,
    Parch integer,
    Ticket integer,
    Fare double,
    Cabin varchar,
    Embarked varchar
) with (
    external_location = 's3://my_bucket/titanic_train/',
    format = 'textfile'
);

The result is:

 passengerid | survived | pclass | name | sex  | age  | sibsp | parch | ticket | fare | cabin | embarked
-------------+----------+--------+------+------+------+-------+-------+--------+------+-------+----------
 NULL        | NULL     | NULL   | NULL | NULL | NULL | NULL  | NULL  | NULL   | NULL | NULL  | NULL
 NULL        | NULL     | NULL   | NULL | NULL | NULL | NULL  | NULL  | NULL   | NULL | NULL  | NULL
 NULL        | NULL     | NULL   | NULL | NULL | NULL | NULL  | NULL  | NULL   | NULL | NULL  | NULL
 NULL        | NULL     | NULL   | NULL | NULL | NULL | NULL  | NULL  | NULL   | NULL | NULL  | NULL
 NULL        | NULL     | NULL   | NULL | NULL | NULL | NULL  | NULL  | NULL   | NULL | NULL  | NULL
 NULL        | NULL     | NULL   | NULL | NULL | NULL | NULL  | NULL  | NULL   | NULL | NULL  | NULL
 NULL        | NULL     | NULL   | NULL | NULL | NULL | NULL  | NULL  | NULL   | NULL | NULL  | NULL
 NULL        | NULL     | NULL   | NULL | NULL | NULL | NULL  | NULL  | NULL   | NULL | NULL  | NULL

whereas I expected the actual data.

2 Answers:

Answer 0 (score: 1)

Currently, with the textfile format, you have to provide a file delimited by 0x1 ('\u0001') for it to be read correctly. The problem is that Presto does not support custom delimiters here:

https://github.com/prestodb/presto/issues/10905
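As a workaround on the data side, the CSV file can be rewritten with the 0x1 delimiter before uploading it to S3. A minimal sketch (file paths and the helper name are placeholders, not from the original post; it assumes no field contains the 0x1 byte):

```python
import csv

def csv_to_ctrl_a(src_path, dst_path):
    """Rewrite a comma-separated file as a 0x1-delimited text file so
    Presto's 'textfile' format splits the columns correctly.
    Assumes no field contains the 0x1 byte itself."""
    with open(src_path, newline="") as src, open(dst_path, "w") as dst:
        reader = csv.reader(src)   # handles quoted fields like "Braund, Mr. Owen Harris"
        next(reader)               # drop the header row; Presto cannot skip it
        for row in reader:
            dst.write("\x01".join(row) + "\n")
```

Note that this also removes the quoting, since with a 0x1 separator the embedded commas no longer need to be escaped.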

The suggestion there is to create the table with Hive DDL; it can then be read from Presto without trouble.

Here is the Hive query:

CREATE EXTERNAL TABLE mytable (
  PassengerId int, Survived int, Pclass int, Name string, Sex string, Age int,
  SibSp int, Parch int, Ticket string, Fare double, Cabin string, Embarked string
)
ROW FORMAT SERDE 'org.apache.hadoop.hive.serde2.OpenCSVSerde'
WITH SERDEPROPERTIES (
  'separatorChar' = ',',
  'quoteChar' = '"',
  'escapeChar' = '\\'
)
STORED AS TEXTFILE
LOCATION 's3://bucket-path/csv_data/'
TBLPROPERTIES ('skip.header.line.count' = '1');

Answer 1 (score: 1)

Starburst Presto currently supports the CSV Hive storage format; see: https://docs.starburstdata.com/latest/release/release-302-e.html?highlight=csv

There is also work underway to make it work in PrestoSQL; see: https://github.com/prestosql/presto/pull/920

You can then create tables like the following with the Presto Hive connector:

CREATE TABLE hive.default.csv_table_with_custom_parameters (
    c_bigint varchar,
    c_varchar varchar)
WITH (
    csv_escape = '',
    csv_quote = '',  
    csv_separator = U&'\0001', -- to pass unicode character
    external_location = 'hdfs://hadoop/datacsv_table_with_custom_parameters',
    format = 'CSV')

In your case, it would be:

CREATE TABLE hive.default.csv_table_with_custom_parameters (
    -- the CSV format reads every column as varchar; cast in queries as needed
    PassengerId varchar, Survived varchar, Pclass varchar, Name varchar,
    Sex varchar, Age varchar, SibSp varchar, Parch varchar,
    Ticket varchar, Fare varchar, Cabin varchar, Embarked varchar)
WITH (
    csv_escape = '\',
    csv_quote = '"',
    csv_separator = ',',
    external_location = 's3://my_bucket/titanic_train/',
    format = 'CSV')

Note that the csv_escape, csv_quote and csv_separator table properties accept only a single character as their value.

As for "skip.header.line.count"="1": there is no equivalent syntax for CSV tables in Presto yet, so I suggest removing the header from your data file.
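Until such support exists, the header can be dropped while staging the file. A minimal sketch (file paths and the helper name are placeholders, not from the original post):

```python
def strip_header(src_path, dst_path):
    """Copy a text file without its first line, e.g. to drop the CSV
    header before uploading the data file to S3."""
    with open(src_path) as src, open(dst_path, "w") as dst:
        next(src)              # skip the header line
        dst.writelines(src)    # copy the remaining lines unchanged
```

The same effect can be had on the command line with `tail -n +2 train.csv > train_noheader.csv` before uploading.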