Converting a JavaObject `scala.collection.Map<Object, RDD<?>>` to a Python dictionary

Date: 2019-05-03 16:57:56

Tags: java python scala apache-spark pyspark

In PySpark, calling the Java SparkContext's getPersistentRDDs() method returns a py4j JavaObject instance wrapping a scala.collection.Map<Object, RDD<?>>.

from pyspark.sql import SparkSession
from pyspark import StorageLevel

spark = SparkSession.builder.master('yarn').getOrCreate()
sc = spark.sparkContext

# Cache a DataFrame and materialize part of it so it is registered
# as a persistent RDD on the JVM side
df = spark.range(0, 25000000, 1)
df.persist(StorageLevel.MEMORY_ONLY)
df.limit(1).count()

sc._jsc.sc().getPersistentRDDs()

This returns a py4j JavaObject (e.g. JavaObject id=o477), not a Python mapping.

How can a JavaObject wrapping a scala.collection.Map<Object, RDD<?>> be converted into a Python dictionary?

1 answer:

Answer 0: (score: 2)

from pyspark import RDD

scala_map = sc._jsc.sc().getPersistentRDDs()

# Convert the Scala Map to a Scala List of (id, RDD) tuples once,
# then index into it with apply(i). Each tuple's _1() is the RDD id
# and _2() is the Scala RDD, which is wrapped back into a PySpark RDD.
scala_list = scala_map.toList()
py_dict = {
    scala_list.apply(i)._1(): RDD(scala_list.apply(i)._2().toJavaRDD(), sc)
    for i in range(scala_map.size())
}