Limiting the output of DESCRIBE in Spark SQL

Asked: 2018-02-22 14:24:47

Tags: sql pyspark apache-spark-sql pyspark-sql sql-limit

I'm using the DESCRIBE keyword to get column information about a temporary view. It's a useful method, but I have a table for which I only want to describe a subset of the columns. I'm trying to combine LIMIT with DESCRIBE to achieve this, but I can't figure out how.

Here is a toy dataset (created with pyspark):

# make some test data
columns = ['id', 'dogs', 'cats', 'horses', 'people']
vals = [
     (1, 2, 0, 4, 3),
     (2, 0, 1, 2, 4)
]

# create DataFrame
df = spark.createDataFrame(vals, columns)
df.createOrReplaceTempView('df')

Now describe it with SQL:

%%sql

DESCRIBE df

Output:

col_name    data_type
id          bigint
dogs        bigint
cats        bigint
horses      bigint
people      bigint

In reality I have many more columns than this, and what I want to do is LIMIT the output of this query. Here are a few things I've tried:

Attempt #1:

DESCRIBE df
LIMIT 3

Error:

An error was encountered:
"\nextraneous input '3' expecting {<EOF>, '.'}(line 3, pos 6)\n\n== SQL ==\n\nDESCRIBE df\nLIMIT 3 \n------^^^\n"
Traceback (most recent call last):
  File "/usr/lib/spark/python/lib/pyspark.zip/pyspark/sql/session.py", line 603, in sql
    return DataFrame(self._jsparkSession.sql(sqlQuery), self._wrapped)
  File "/usr/lib/spark/python/lib/py4j-0.10.4-src.zip/py4j/java_gateway.py", line 1133, in __call__
    answer, self.gateway_client, self.target_id, self.name)
  File "/usr/lib/spark/python/lib/pyspark.zip/pyspark/sql/utils.py", line 73, in deco
    raise ParseException(s.split(': ', 1)[1], stackTrace)
pyspark.sql.utils.ParseException: "\nextraneous input '3' expecting {<EOF>, '.'}(line 3, pos 6)\n\n== SQL ==\n\nDESCRIBE df\nLIMIT 3 \n------^^^\n"

Attempt #2:

SELECT a.*
FROM (
    DESCRIBE df
) AS a
LIMIT 3

Error:

An error was encountered:
'Table or view not found: DESCRIBE; line 4 pos 4'
Traceback (most recent call last):
  File "/usr/lib/spark/python/lib/pyspark.zip/pyspark/sql/session.py", line 603, in sql
    return DataFrame(self._jsparkSession.sql(sqlQuery), self._wrapped)
  File "/usr/lib/spark/python/lib/py4j-0.10.4-src.zip/py4j/java_gateway.py", line 1133, in __call__
    answer, self.gateway_client, self.target_id, self.name)
  File "/usr/lib/spark/python/lib/pyspark.zip/pyspark/sql/utils.py", line 69, in deco
    raise AnalysisException(s.split(': ', 1)[1], stackTrace)
pyspark.sql.utils.AnalysisException: 'Table or view not found: DESCRIBE; line 4 pos 4'

Does anyone know whether it's possible to limit the output of DESCRIBE?

1 Answer:

Answer 0 (score: 3)

Here is a way to limit the output of DESCRIBE using pyspark.sql.dataframe.limit(). Run the DESCRIBE query with pyspark.sql.context.sql(); this returns the result as a DataFrame, on which you can then call limit():

# registerTempTable is the older (deprecated) equivalent of createOrReplaceTempView
df.registerTempTable('df')
spark.sql('DESCRIBE df').limit(3).show()
#+--------+---------+-------+
#|col_name|data_type|comment|
#+--------+---------+-------+
#|      id|   bigint|   null|
#|    dogs|   bigint|   null|
#|    cats|   bigint|   null|
#+--------+---------+-------+
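
Since the DESCRIBE query returns an ordinary DataFrame, you are not restricted to limit(); as a small sketch (not part of the original answer), you could also filter it down to specific columns by name with where():

spark.sql('DESCRIBE df').where("col_name IN ('dogs', 'cats')").show()
#+--------+---------+-------+
#|col_name|data_type|comment|
#+--------+---------+-------+
#|    dogs|   bigint|   null|
#|    cats|   bigint|   null|
#+--------+---------+-------+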

However, if you're just looking for the columns' data types, you can use the DataFrame's dtypes attribute:

df.dtypes
#[('id', 'bigint'),
# ('dogs', 'bigint'),
# ('cats', 'bigint'),
# ('horses', 'bigint'),
# ('people', 'bigint')]

This is a list of tuples, which you can slice however you need:

df.dtypes[0:3]
#[('id', 'bigint'), ('dogs', 'bigint'), ('cats', 'bigint')]
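
As an additional sketch (not in the original answer), wrapping df.dtypes in dict() makes it easy to look up a single column's type by name:

# build a name -> type mapping from the list of tuples
dict(df.dtypes)['dogs']
#'bigint'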

DataFrames also have a describe() method that returns summary statistics:

df.describe().show()
#+-------+------------------+------------------+------------------+------------------+------------------+
#|summary|                id|              dogs|              cats|            horses|            people|
#+-------+------------------+------------------+------------------+------------------+------------------+
#|  count|                 2|                 2|                 2|                 2|                 2|
#|   mean|               1.5|               1.0|               0.5|               3.0|               3.5|
#| stddev|0.7071067811865476|1.4142135623730951|0.7071067811865476|1.4142135623730951|0.7071067811865476|
#|    min|                 1|                 0|                 0|                 2|                 3|
#|    max|                 2|                 2|                 1|                 4|                 4|
#+-------+------------------+------------------+------------------+------------------+------------------+

If you want to limit the columns, you can use select() with a slice of df.columns:

df.select(df.columns[0:3]).describe().show()
#+-------+------------------+------------------+------------------+
#|summary|                id|              dogs|              cats|
#+-------+------------------+------------------+------------------+
#|  count|                 2|                 2|                 2|
#|   mean|               1.5|               1.0|               0.5|
#| stddev|0.7071067811865476|1.4142135623730951|0.7071067811865476|
#|    min|                 1|                 0|                 0|
#|    max|                 2|                 2|                 1|
#+-------+------------------+------------------+------------------+
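
As a final sketch (again not part of the original answer), select() also accepts explicit column names, which helps when the subset you want is not a contiguous slice; with the toy data above this would give:

df.select('dogs', 'cats').describe().show()
#+-------+------------------+------------------+
#|summary|              dogs|              cats|
#+-------+------------------+------------------+
#|  count|                 2|                 2|
#|   mean|               1.0|               0.5|
#| stddev|1.4142135623730951|0.7071067811865476|
#|    min|                 0|                 0|
#|    max|                 2|                 1|
#+-------+------------------+------------------+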