pyspark: using a JavaObject as a StructType

Asked: 2017-03-17 17:30:34

Tags: apache-spark pyspark spark-dataframe

I need to parse a JSON schema file to create a pyspark.sql.types.StructType. I found a Scala library that can do this for me, so I call it like this:

# Read the JSON schema file and pass it to the Scala converter via the JVM gateway
with open('path/to/schema.json') as f:
    js = f.read()
conv = spark.sparkContext._jvm.org.zalando.spark.jsonschema.SchemaConverter
schema = conv.convertContent(js)

But when I try to use it to build a DataFrame like this:

spark.read.format("json").schema(schema)

I get the following error:

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/local/Cellar/apache-spark/2.1.0/libexec/python/pyspark/sql/readwriter.py", line 103, in schema
    raise TypeError("schema should be StructType")
TypeError: schema should be StructType
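The error is raised because `DataFrameReader.schema` only accepts a Python-side `StructType`; a py4j `JavaObject` wrapping the JVM-side schema fails the type check. A minimal sketch of that validation (simplified; the classes below are illustrative stand-ins, not the real pyspark/py4j types):

```python
# Simplified sketch of the type check inside DataFrameReader.schema.
# These classes are stand-ins for pyspark.sql.types.StructType and
# py4j.java_gateway.JavaObject, used only to illustrate the check.
class StructType:
    """Stands in for pyspark.sql.types.StructType."""

class JavaObject:
    """Stands in for py4j.java_gateway.JavaObject."""

def set_schema(schema):
    # pyspark rejects anything that is not a Python StructType
    if not isinstance(schema, StructType):
        raise TypeError("schema should be StructType")
    return schema

set_schema(StructType())       # accepted
try:
    set_schema(JavaObject())   # rejected: it wraps the JVM object
except TypeError as e:
    print(e)                   # schema should be StructType
```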

If I print the type:

print type(schema)

I get:

<class 'py4j.java_gateway.JavaObject'>

How can I wrap this value as a Python StructType?

1 Answer:

Answer 0 (score: 2)

After digging through the pyspark source code, I looked at the implementation of DataFrame.schema:

@property
@since(1.3)
def schema(self):
    if self._schema is None:
        try:
            self._schema = _parse_datatype_json_string(self._jdf.schema().json())
        except AttributeError as e:
            raise Exception(
                "Unable to parse datatype from schema. %s" % e)
    return self._schema

The method _parse_datatype_json_string is defined in pyspark.sql.types, so this works:

from pyspark.sql.types import _parse_datatype_json_string

conv = spark.sparkContext._jvm.org.zalando.spark.jsonschema.SchemaConverter
jschema = conv.convertContent(js)  # py4j JavaObject wrapping the Java StructType
# Round-trip through JSON: serialize on the Java side, parse on the Python side
schema = _parse_datatype_json_string(jschema.json())
reader = spark.read.format("json").schema(schema)

Now when I call:

print type(schema)

I get:

<class 'pyspark.sql.types.StructType'>
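For reference, the string that the Java side's .json() emits (and that _parse_datatype_json_string consumes) is Spark's JSON representation of a StructType. A minimal illustrative sketch of that shape using only the standard json module (the "id" and "name" fields are invented for this example):

```python
import json

# Illustrative example of Spark's StructType JSON representation;
# the field names here are made up for the sketch.
schema_json = json.dumps({
    "type": "struct",
    "fields": [
        {"name": "id", "type": "long", "nullable": True, "metadata": {}},
        {"name": "name", "type": "string", "nullable": True, "metadata": {}},
    ],
})

parsed = json.loads(schema_json)
print(parsed["type"])                         # struct
print([f["name"] for f in parsed["fields"]])  # ['id', 'name']
```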