In Scala, I am trying to pass a large global variable into a map operation, and Spark fails with the following error:
ERROR yarn.ApplicationMaster: User class threw exception: java.lang.StackOverflowError
at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509)
at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)
The code looks like this:
val data = sc.textFile(inputPath).cache()

// Must be a mutable Map: the immutable default does not support
// in-place updates like map(key) = value
val map = scala.collection.mutable.Map[String, Int]()
for (i <- 0 to 9) {
  map(i.toString) = i
}

data.map(sample => {
  if (map.contains(sample)) {
    "Found"
  } else {
    "Not found"
  }
})
Answer 0 (score: 0)
It is solved now; I changed the data type from "Array[Map[String, Int]]" to "Map[String, Int]".
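The lookup logic from the question can be sketched in plain Scala as below. This is a minimal standalone version (no Spark, so the serialization path that triggered the StackOverflowError is not exercised); the object and variable names are illustrative, not from the original post:

```scala
import scala.collection.mutable

object LookupDemo {
  def main(args: Array[String]): Unit = {
    // A mutable Map is required for in-place updates like lookup(key) = value;
    // the immutable scala.collection.Map has no update method
    val lookup = mutable.Map[String, Int]()
    for (i <- 0 to 9) {
      lookup(i.toString) = i
    }

    // Mirrors the per-record check done inside data.map(...)
    val samples = Seq("3", "42")
    val results = samples.map { sample =>
      if (lookup.contains(sample)) "Found" else "Not found"
    }
    println(results.mkString(" ")) // prints "Found Not found"
  }
}
```

As a side note, for genuinely large read-only data captured in a Spark closure, the usual approach is a broadcast variable (`sc.broadcast(...)`), which ships the data to each executor once instead of serializing it into every task.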