I want to do some work with RDD.map, and it runs fine on Spark on YARN. However, when I add a for loop around it, it produces an error. I'd like to know why, and how to fix it.
When I add for (walkCount ...), Spark produces the following error:
java.io.FileNotFoundException: /home/xxx/usr/hadoop-2.7.3/tmp/nm-local-dir/usercache/xxx/appcache/application_1554174196597_0019/blockmgr-ac0eb809-641a-437a-a2f0-223084771848/1f/temp_shuffle_303f490a-6e1b-46a2-ae98-3e3460218bbf (Too many open files)
...
19/04/02 19:41:53 ERROR TransportResponseHandler: Still have 3 requests outstanding when connection from node6/ip:40762 is closed
19/04/02 19:41:53 INFO RetryingBlockFetcher: Retrying fetch (1/3) for 1 outstanding blocks after ... ms
...
The code is below. It works without the for (walkCount ...) loop.
def randomWalk(): RDD[Array[Long]] = { // get a sequence of nodes (Long) from a multilayer graph
  var randomWalk = initialWalk.map { case (nodeId, clickNode) =>
    ...
    (nodeId, pathBuffer, layer) // nodeId: Long; pathBuffer: ArrayBuffer[Long]; layer: Int
  }.persist(persistLevel) // this part is no problem

  for (walkCount <- 0 until 60) { // without this for loop, it works
    randomWalk = randomWalk.map { case (nodeId, pathBuffer, layer) =>
      val prevNodeId = pathBuffer(pathBuffer.length - 2) // the second-to-last node
      val currentNodeId = pathBuffer.last                // the last node
      (s"$prevNodeId $currentNodeId", (nodeId, pathBuffer, layer))
    }.join(indexedEdges).map { case (edge, ((nodeId, pathBuffer, currentLayer), dstNeighbors)) =>
      // indexedEdges is RDD[(s"$prevNodeId $currentNodeId", dstNeighbors)], where
      // dstNeighbors are currentNodeId's neighbors:
      // Array[(neighborId: Long, layer: Int, weight: Double, tal: Double)]
      try {
        val lastNode = pathBuffer.last
        // Choose the next node and decide whether to change layer; returns
        // Array[(nextNodeId: Long, newLayer: Int)] of size 1. This function
        // involves math.random and a constant, Graphops.q.
        val nextNode = Graphops.produceNode(dstNeighbors, currentLayer, lastNode)
        require(nextNode.length == 1, "nextNode.length != 1")
        pathBuffer.append(nextNode(0)._1)
        (nodeId, pathBuffer, nextNode(0)._2)
      } catch {
        case e: Exception => throw new RuntimeException(e.getMessage)
      }
    }.persist(persistLevel)
  } // end of the for loop

  randomWalk.map(_._2.toArray)
}
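
For reference, here is a stripped-down, self-contained sketch of the same loop pattern with placeholder data (the object name LoopPatternSketch, the toy edges, and the walk contents are hypothetical, not my real graph code): a var RDD is re-assigned through map, join, and persist on every iteration of a driver-side for loop.

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.storage.StorageLevel
import scala.collection.mutable.ArrayBuffer

object LoopPatternSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("loop-pattern-sketch"))

    // Stands in for indexedEdges: "prev current" key -> neighbor ids
    val edges = sc.parallelize(Seq(("1 2", Array(3L)), ("2 3", Array(4L))))
    // Stands in for the initial randomWalk: (nodeId, path so far, layer)
    var walk = sc.parallelize(Seq((1L, ArrayBuffer(1L, 2L), 0)))

    for (walkCount <- 0 until 60) { // same driver-side loop as in randomWalk()
      walk = walk.map { case (nodeId, pathBuffer, layer) =>
        (s"${pathBuffer(pathBuffer.length - 2)} ${pathBuffer.last}", (nodeId, pathBuffer, layer))
      }.join(edges).map { case (_, ((nodeId, pathBuffer, layer), dstNeighbors)) =>
        pathBuffer.append(dstNeighbors.head) // walk one step forward
        (nodeId, pathBuffer, layer)
      }.persist(StorageLevel.MEMORY_AND_DISK)
    }

    println(walk.count()) // forces the 60 chained shuffles to run
    sc.stop()
  }
}

Each pass through the loop chains one more shuffle (the join) onto the previous iteration's RDD, so after 60 iterations the lineage contains 60 joins.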