IndexError: list index out of range when using the reduceByKey operation in the pyspark shell

Asked: 2019-04-15 03:36:18

Tags: apache-spark pyspark

Goal: find the top video categories in a YouTube dataset

Using: the PySpark shell

Expected: the categories and the number of times each one occurs

Actual: using reduceByKey fails with IndexError: list index out of range

I tried the following code:

data="/Users/sk/Documents/GitRepository/Udemy_BigData_spark/1.txt"
input = sc.textFile(data)
results = input.map(lambda x: (x.split(‘\t')[3].encode("utf-8").replace('"', '').replace("'", '')))results.take(20)

This produces the following result:

['Comedy', 'Comedy', 'Entertainment', 'People & Blogs', 'People & Blogs', 'Music', 'Comedy', 'People & Blogs', 'Entertainment', 'Entertainment', 'Entertainment', 'Entertainment', 'Entertainment', 'Entertainment', 'Entertainment', 'Entertainment', 'Entertainment', 'Entertainment', 'Entertainment', 'Entertainment']


results = results.map(lambda x: (x, 1))

This produces the following result:

[('Comedy', 1), ('Comedy', 1), ('Entertainment', 1), ('People & Blogs', 1), ('People & Blogs', 1), ('Music', 1), ('Comedy', 1), ('People & Blogs', 1), ('Entertainment', 1), ('Entertainment', 1), ('Entertainment', 1), ('Entertainment', 1), ('Entertainment', 1), ('Entertainment', 1), ('Entertainment', 1), ('Entertainment', 1), ('Entertainment', 1), ('Entertainment', 1), ('Entertainment', 1), ('Entertainment', 1)]
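
Note that up to this point only take(20) has been executed, and take only materializes the first handful of records, so a malformed line further down the file goes unnoticed. The reduceByKey below forces a full scan of the dataset, which is the most likely point where a line with fewer than four tab-separated fields raises the IndexError. A quick check for such lines (a sketch reusing the input RDD from above):

# Count lines that would raise IndexError when indexed with [3]
bad = input.filter(lambda x: len(x.split('\t')) < 4)
bad.count()
bad.take(5)  # inspect a few of the offending lines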


results = results.reduceByKey(lambda x, y: x + y)
results.take(20)

And this throws a huge error :(

I expect it to show me a result like the following:

(179049,Music), (127674,Entertainment), (87818,Comedy), (73293,Film & Animation), (67329,Sports)
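
If short lines are indeed the cause, a minimal defensive version of the whole pipeline, skipping lines with fewer than four tab-separated fields before indexing into them, might look like this (the path and the field index are taken from the question; the rest is a sketch, not the poster's code):

data = "/Users/sk/Documents/GitRepository/Udemy_BigData_spark/1.txt"
lines = sc.textFile(data)

def category(line):
    # Only index into fields[3] when the line actually has four columns
    fields = line.split('\t')
    return fields[3].replace('"', '').replace("'", '') if len(fields) > 3 else None

counts = (lines.map(category)
               .filter(lambda c: c is not None)
               .map(lambda c: (c, 1))
               .reduceByKey(lambda x, y: x + y))

# Swap to (count, category) and sort descending to get the top categories
counts.map(lambda kv: (kv[1], kv[0])).sortByKey(ascending=False).take(5)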

1 answer:

Answer 0 (score: 0)

The code I wrote is in Scala:

// Run in spark-shell, where spark.implicits._ is already in scope.
// Build a Dataset of (value, 1) pairs from a sample sequence
val ds = Seq("A", "B", "C", "A", "B", "C", "D",
             "E", "F", "G", "A")
  .toDF.as[String].map(x => (x, 1))
// Group by the value and sum the counts within each group
ds.groupByKey(x => x._1)
  .reduceGroups((l, r) => (l._1, l._2 + r._2))
  .show

Output:

+-----+------------------------------+
|value|ReduceAggregator(scala.Tuple2)|
+-----+------------------------------+
|    F|                        [F, 1]|
|    E|                        [E, 1]|
|    B|                        [B, 2]|
|    D|                        [D, 1]|
|    C|                        [C, 2]|
|    A|                        [A, 3]|
|    G|                        [G, 1]|
+-----+------------------------------+
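
Since the question uses the RDD API from the PySpark shell, a rough PySpark equivalent of the grouping above (a sketch, not part of the original answer) would be:

ds = sc.parallelize(["A", "B", "C", "A", "B", "C", "D", "E", "F", "G", "A"])
counts = ds.map(lambda x: (x, 1)).reduceByKey(lambda l, r: l + r)
counts.collect()  # e.g. [('A', 3), ('B', 2), ('C', 2), ('D', 1), ...] in arbitrary order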