I have selected some columns from a DataFrame and want to convert them into JSON (GeoJSON) data with a specified schema, then save the result in a database. I am using Spark SQL 2.3.1. I have a schema like this:
root
|-- features: array (nullable = true)
| |-- element: struct (containsNull = true)
| | |-- geometry: struct (nullable = true)
| | | |-- coordinates: array (nullable = true)
| | | | |-- element: array (containsNull = true)
| | | | | |-- element: array (containsNull = true)
| | | | | | |-- element: double (containsNull = true)
| | | |-- type: string (nullable = true)
| | |-- properties: struct (nullable = true)
| | | |-- auswertezeit: string (nullable = true)
| | | |-- geschwindigkeit: long (nullable = true)
| | | |-- strecke_id: long (nullable = true)
| | | |-- verkehrsstatus: string (nullable = true)
| | |-- type: string (nullable = true)
|-- type: string (nullable = true)
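For reference, the `printSchema` output above can be expressed as an explicit Spark `StructType` (this is just a transcription of the tree, a sketch that may be adapted as needed):

```scala
import org.apache.spark.sql.types._

// The GeoJSON schema from the printSchema output, written out as a StructType.
val geoJsonSchema = StructType(Seq(
  StructField("features", ArrayType(StructType(Seq(
    StructField("geometry", StructType(Seq(
      // Triple-nested array of doubles, as in the tree above
      StructField("coordinates", ArrayType(ArrayType(ArrayType(DoubleType)))),
      StructField("type", StringType)
    ))),
    StructField("properties", StructType(Seq(
      StructField("auswertezeit", StringType),
      StructField("geschwindigkeit", LongType),
      StructField("strecke_id", LongType),
      StructField("verkehrsstatus", StringType)
    ))),
    StructField("type", StringType)
  )))),
  StructField("type", StringType)
))
```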
and a new DataFrame that I want to save as JSON data conforming to that schema, built like this:
val df4 = predictions.select("strecke_id", "geschwindigkeit", "predictedLabel").withColumnRenamed("predictedLabel", "verkehrsstatus")
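One way to shape `df4` into the target schema is to build the nested structure with `struct`/`array` and serialize it with `to_json` (all available in Spark 2.3). The sketch below is an assumption-laden illustration: `df4` carries no geometry or timestamp columns, so `coordinates` and `auswertezeit` are filled with `lit` placeholders that would need to come from real data.

```scala
import org.apache.spark.sql.functions._

// Build one GeoJSON Feature per row. Geometry and auswertezeit are
// placeholders here, since df4 only has strecke_id, geschwindigkeit
// and verkehrsstatus.
val features = df4.select(
  struct(
    struct(
      array(array(array(lit(0.0), lit(0.0)))).as("coordinates"), // placeholder
      lit("MultiLineString").as("type")                          // assumed geometry type
    ).as("geometry"),
    struct(
      lit("").as("auswertezeit"),                                // placeholder
      col("geschwindigkeit"),
      col("strecke_id"),
      col("verkehrsstatus")
    ).as("properties"),
    lit("Feature").as("type")
  ).as("feature")
)

// Collect all features into a single FeatureCollection and
// serialize it to one JSON string.
val geojson = features
  .agg(collect_list(col("feature")).as("features"))
  .withColumn("type", lit("FeatureCollection"))
  .select(to_json(struct(col("features"), col("type"))).as("geojson"))
```

The resulting single-column DataFrame holds the GeoJSON string, which can then be written to the database with the appropriate connector (e.g. a JDBC writer).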