Spark 1.2.1 fails to build the assembly project

Date: 2015-02-20 01:21:22

Tags: hadoop apache-spark hive apache-spark-1.2

I just downloaded Spark 1.2.1 and it fails to build at the assembly project (spark-assembly) with the following error:

    The requested profile "hadoop-2.6" could not be activated because it does not exist.
    [ERROR] Failed to execute goal on project spark-assembly_2.10: Could not resolve dependencies for project org.apache.spark:spark-assembly_2.10:pom:1.2.1: Failure to find org.apache.spark:spark-hive-thriftserver_2.11:jar:1.2.1
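
The first line of the error suggests that the hadoop-2.6 profile simply is not defined in the 1.2.1 POMs. One way to check which profiles the downloaded source tree actually declares is Maven's standard help plugin (a quick sanity check, run from the Spark source root):

    # List every profile defined in this source tree and look for the hadoop-* ids.
    mvn help:all-profiles

    # Or grep the top-level pom.xml directly for the Hadoop profiles it declares.
    grep -n '<id>hadoop-' pom.xml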

Here is the environment:

  1. Hadoop 2.6.0 - installed via brew
  2. Hive 0.14.0 - installed via brew
  3. Spark 1.2.1 downloaded as a tgz, because brew complained that beeline is a conflicting binary
  4. Scala 2.11 - installed via brew
  5. sbt 0.13.7 - installed via brew
  6. I am building the Spark distribution with: mvn -Pyarn -Phadoop-2.6 -Dhadoop.version=2.6.0 -Phive -Phive-thriftserver -Dscala-2.11 -DskipTests clean package

    Reactor Summary:
    [INFO] 
    [INFO] Spark Project Parent POM .......................... SUCCESS [  3.525 s]
    [INFO] Spark Project Core ................................ SUCCESS [02:56 min]
    [INFO] Spark Project Bagel ............................... SUCCESS [ 17.102 s]
    [INFO] Spark Project GraphX .............................. SUCCESS [ 45.246 s]
    [INFO] Spark Project ML Library .......................... SUCCESS [01:22 min]
    [INFO] Spark Project Tools ............................... SUCCESS [ 11.457 s]
    [INFO] Spark Project Networking .......................... SUCCESS [  6.121 s]
    [INFO] Spark Project Shuffle Streaming Service ........... SUCCESS [  5.642 s]
    [INFO] Spark Project Streaming ........................... SUCCESS [01:19 min]
    [INFO] Spark Project Catalyst ............................ SUCCESS [01:27 min]
    [INFO] Spark Project SQL ................................. SUCCESS [01:19 min]
    [INFO] Spark Project Hive ................................ SUCCESS [01:20 min]
    [INFO] Spark Project Assembly ............................ FAILURE [  0.396 s]
    [INFO] Spark Project External Twitter .................... SKIPPED
    [INFO] Spark Project External Flume ...................... SKIPPED
    [INFO] Spark Project External Flume Sink ................. SKIPPED
    [INFO] Spark Project External MQTT ....................... SKIPPED
    [INFO] Spark Project External ZeroMQ ..................... SKIPPED
    [INFO] Spark Project Examples ............................ SKIPPED
    [INFO] Spark Project REPL ................................ SKIPPED
    [INFO] Spark Project YARN Parent POM ..................... SKIPPED
    [INFO] Spark Project YARN Stable API ..................... SKIPPED
    [INFO] Spark Project YARN Shuffle Service ................ SKIPPED
    [INFO] Spark Project Hive Thrift Server .................. SKIPPED
    

Am I missing something? I don't want to install apache-spark via brew, because I would have to unlink hive, which I also want to use.

Thanks!

1 Answer:

Answer 0 (score: 1)

Try the hadoop-2.4 profile, but keep the Hadoop version set to 2.6.0:

    mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.6.0 -Phive -Phive-thriftserver -Dscala-2.11 -DskipTests clean package

Source: https://mail-archives.apache.org/mod_mbox/spark-user/201412.mbox/%3CCAMAsSdLVO73-YRa8-k_SXN5kkDKML-nKpPhgnQ3TgxUVnCp=bg@mail.gmail.com%3E
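
A side note on the second half of the error (the missing spark-hive-thriftserver_2.11 artifact): the Spark 1.2 build documentation has you switch the POMs to Scala 2.11 with a bundled helper script before building, and it lists the JDBC/Hive Thrift server component as not yet supported in Scala 2.11 builds. A sketch of that documented flow, assuming the script ships in the 1.2.1 tarball (worth double-checking against docs/building-spark.md in your download):

    # From the Spark 1.2.1 source root: switch the build to Scala 2.11 first.
    dev/change-version-to-2.11.sh

    # Then build; -Phive-thriftserver is dropped here because the thrift server
    # module is not supported with Scala 2.11 in the 1.2.x line.
    mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.6.0 -Phive -Dscala-2.11 -DskipTests clean package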