I have a YARN cluster and another server used only to submit Spark jobs to the cluster.
But there are multiple versions of Scala (say 2.10.x and 2.11.x) and also multiple versions of Spark, such as 1.6.x, 2.0.x, and 2.1.x.
Is there a way to manage different versions on one server gracefully?
Answer 0: (score: 0)
When you submit something to YARN, it can figure out what to run as long as it knows where to fetch the bytecode. Just make sure that on both data nodes everything is installed in the same path, and that you submit from the correct version. I've never tried Docker, but the Docker users I've talked to at work don't seem to think it's a good idea.
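One common way to do "submit from the correct version" is to keep each Spark distribution unpacked in its own directory and point `SPARK_HOME` at the one you want before calling its `spark-submit`. The sketch below is a minimal illustration, not an official mechanism; the `/opt/spark/<version>` layout and the argument values are assumptions you would adapt to your server.

```shell
#!/bin/sh
# Sketch: select a Spark version per submission by switching SPARK_HOME.
# Assumed (hypothetical) layout: each version unpacked under /opt/spark/<version>,
# e.g. /opt/spark/1.6.3 and /opt/spark/2.1.1, each containing bin/spark-submit.

spark_submit_cmd() {
    version="$1"
    shift
    SPARK_HOME="/opt/spark/${version}"
    # Build (and here just print) the command that would be run; in a real
    # wrapper you would exec it instead of echoing it.
    echo "SPARK_HOME=${SPARK_HOME} ${SPARK_HOME}/bin/spark-submit --master yarn $*"
}

# Example: submit an app built against Scala 2.11 with Spark 2.1.x.
spark_submit_cmd 2.1.1 --class com.example.App app_2.11.jar
```

Because the jar name encodes the Scala version (`app_2.11.jar` is the usual sbt convention), the wrapper makes it harder to accidentally submit a 2.10 artifact through a 2.11 Spark build, or vice versa.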