Does anyone know if it is possible to build vanilla Spark v1.6.3 against Hadoop 2.7? I am asking because I need to deploy a version of Spark compiled with Scala 2.11 to a cluster that is already set up with Hadoop 2.7.
Any suggestions on how to proceed with the build are more than welcome, and I can of course share any details of my build process if needed.
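
For reference, this is roughly what I have been trying, based on the Spark 1.6 build docs. I am assuming here that the hadoop-2.6 profile also accepts a 2.7.x hadoop.version (the docs suggest it covers "Hadoop 2.6.x and later"), and 2.7.3 is just the example version I picked:

    # Switch the source tree to Scala 2.11 (script ships with Spark 1.6)
    ./dev/change-scala-version.sh 2.11

    # Build a distributable tarball against Hadoop 2.7
    # (assumption: -Phadoop-2.6 works with -Dhadoop.version=2.7.x)
    ./make-distribution.sh --name hadoop2.7-scala2.11 --tgz \
      -Phadoop-2.6 -Dhadoop.version=2.7.3 -Pyarn -Dscala-2.11 -DskipTests

The --name value is just my own label for the tarball. If there is a more appropriate profile or flag combination for Hadoop 2.7, I would be glad to hear it.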
Thanks a lot!