Building spark 1.6.3 with hadoop 2.7 and scala 2.11

Date: 2017-05-16 09:15:57

Tags: hadoop apache-spark

Does anyone know if it is possible to build vanilla Spark v1.6.3 against Hadoop 2.7? I am asking because I need to deploy a version of Spark compiled with Scala 2.11 to a cluster that is already set up with Hadoop 2.7.

Any suggestions about how to proceed with building Spark are more than welcome, and I can of course share any details of my build process if needed.
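For reference, the recipe I was planning to try follows the Spark 1.6 "Building Spark" documentation. Note that Spark 1.6.x ships no `hadoop-2.7` Maven profile, so the usual workaround is to select the `hadoop-2.6` profile and override `hadoop.version`; the exact Hadoop release (2.7.3 below) is just a placeholder for whatever the cluster runs:

```shell
# Sketch only, based on the Spark 1.6 build docs -- adjust versions to match
# your cluster. Run from the root of the Spark 1.6.3 source tree.

# 1. Switch the build to Scala 2.11 (this script ships with the Spark source).
./dev/change-scala-version.sh 2.11

# 2. Build with the bundled Maven, using the hadoop-2.6 profile but
#    overriding the Hadoop version to 2.7.x (assumed 2.7.3 here).
./build/mvn -Pyarn -Phadoop-2.6 -Dhadoop.version=2.7.3 \
    -Dscala-2.11 -DskipTests clean package

# Alternatively, produce a deployable distribution tarball:
./make-distribution.sh --name custom-hadoop2.7 --tgz \
    -Pyarn -Phadoop-2.6 -Dhadoop.version=2.7.3 -Dscala-2.11
```

Whether the `hadoop-2.6` profile is actually compatible with Hadoop 2.7 client libraries is exactly what I am unsure about.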

Thanks a lot!

0 Answers:

No answers yet