I downloaded Apache Spark 1.4.1 from the official website. Hadoop is not installed on my machine.
Apache provides a build command, so I tried to build the project with the following command:
build/mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -DskipTests clean package
However, the build fails with the following error:
[INFO] Spark Project External Kafka Assembly ............. SKIPPED
[INFO] Spark Project YARN Shuffle Service ................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 7.840s
[INFO] Finished at: Wed Jul 29 10:43:04 IST 2015
[INFO] Final Memory: 15M/43M
[INFO] ------------------------------------------------------------------------
[ERROR] Plugin org.apache.maven.plugins:maven-enforcer-plugin:1.4 or one of its dependencies could not be resolved: Failed to read artifact descriptor for org.apache.maven.plugins:maven-enforcer-plugin:jar:1.4: Could not transfer artifact org.apache.maven.plugins:maven-enforcer-plugin:pom:1.4 from/to central (https://repo1.maven.org/maven2): peer not authenticated -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/PluginResolutionException
I am new to Apache Spark; please advise.
Answer 0 (score: 0)
That is because what you downloaded is a pre-built binary distribution. You do not need to compile it again with Maven. Just put it on your path and use it directly.
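
For example, a minimal sketch assuming you downloaded the pre-built package for Hadoop 2.4 (the archive name spark-1.4.1-bin-hadoop2.4.tgz is an assumption based on the usual naming of pre-built Spark releases):

# extract the pre-built archive (file name assumed)
tar -xzf spark-1.4.1-bin-hadoop2.4.tgz
cd spark-1.4.1-bin-hadoop2.4

# put the bundled scripts on your PATH for this session
export SPARK_HOME=$(pwd)
export PATH=$SPARK_HOME/bin:$PATH

# launch the interactive shell directly, no Maven build needed
spark-shell

If instead you downloaded the source package, then the Maven build is required and the "peer not authenticated" error points to an HTTPS/proxy problem reaching the central repository rather than a problem with Spark itself.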