I need to use com.databricks.spark.xml in a Google Cloud notebook.
I tried:
import os

# Ask PySpark to pull in the spark-xml package (the 2.11 build is left commented out).
# os.environ['PYSPARK_SUBMIT_ARGS'] = '--packages com.databricks:spark-xml_2.11:0.6.0 pyspark-shell'
os.environ['PYSPARK_SUBMIT_ARGS'] = '--packages com.databricks:spark-xml_2.10:0.4.1 pyspark-shell'

# Read the XML file from GCS with the spark-xml data source.
articles_df = (spark.read.format('xml')
               .options(rootTag='articles', rowTag='article')
               .load('gs://....-20180831.xml', schema=articles_schema))
But I get:
java.lang.ClassNotFoundException: Failed to find data source: xml. Please find packages at http://spark.apache.org/third-party-projects.xml
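For reference, the only other wiring I know of is passing the package through spark.jars.packages on the session builder. Below is a minimal, untested sketch of what I mean; the app name and the 2.11 package coordinates are just placeholders, and as far as I understand it only takes effect if no session is already running in the kernel:

from pyspark.sql import SparkSession

# Untested sketch: request the spark-xml package via spark.jars.packages
# instead of PYSPARK_SUBMIT_ARGS. Note that getOrCreate() returns any
# already-running session, so this would need a fresh kernel to take effect.
spark = (SparkSession.builder
         .appName('spark-xml-test')  # placeholder app name
         .config('spark.jars.packages', 'com.databricks:spark-xml_2.11:0.6.0')
         .getOrCreate())

articles_df = (spark.read.format('xml')
               .options(rootTag='articles', rowTag='article')
               .load('gs://....-20180831.xml'))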