Why doesn't a runtime dependency work in Gradle?

Time: 2015-05-21 03:31:42

Tags: gradle apache-spark

I have a simple Spark application and Gradle 2.3. The Spark guide says the Spark libraries do not need to be bundled, so I declared them as 'runtime' dependencies in build.gradle as follows:

apply plugin: 'java'
apply plugin: 'idea'
apply plugin: 'scala'
apply plugin: 'maven'


repositories {

    mavenCentral()
}

dependencies {
    compile 'org.scala-lang:scala-library:2.10.5'
    runtime 'org.apache.spark:spark-core_2.10:1.3.1'
    runtime 'org.apache.spark:spark-streaming_2.10:1.3.1'
    compile 'com.datastax.spark:spark-cassandra-connector_2.10:1.2.0-rc3'

    testCompile group: 'junit', name: 'junit', version: '4.11'
}

However, when I run the 'classes' task I get the errors below, which means the compiler cannot find the jars. I also tried 'provided' and 'providedCompile', but those just fail with "no method provided()/providedCompile() found".

[ant:scalac] /Users/grant/programming/ideaprojects/scalaTest/src/main/scala/com/grant/cassandra/DemoApp.scala:3: error: object Logging is not a member of package org.apache.spark
[ant:scalac] import org.apache.spark.{Logging, SparkContext, SparkConf}
[ant:scalac]        ^
[ant:scalac] /Users/grant/programming/ideaprojects/scalaTest/src/main/scala/com/grant/cassandra/DemoApp.scala:5: error: not found: type Logging
[ant:scalac] trait DemoApp extends App with Logging {
[ant:scalac]                                ^
[ant:scalac] /Users/grant/programming/ideaprojects/scalaTest/src/main/scala/com/grant/cassandra/DemoApp.scala:14: error: not found: type SparkConf
[ant:scalac]   val conf = new SparkConf(true)
[ant:scalac]                  ^
[ant:scalac] /Users/grant/programming/ideaprojects/scalaTest/src/main/scala/com/grant/cassandra/DemoApp.scala:21: error: not found: type SparkContext
[ant:scalac]   lazy val sc = new SparkContext(conf)
[ant:scalac]                     ^
[ant:scalac] /Users/grant/programming/ideaprojects/scalaTest/src/main/scala/com/grant/cassandra/WordCountDemo.scala:3: error: object SparkContext is not a member of package org.apache.spark
[ant:scalac] import org.apache.spark.SparkContext._
[ant:scalac]                         ^
[ant:scalac] error: bad symbolic reference. A signature in CassandraConnector.class refers to type Logging
[ant:scalac] in package org.apache.spark which is not available.
[ant:scalac] It may be completely missing from the current classpath, or the version on
[ant:scalac] the classpath might be incompatible with the version used when compiling CassandraConnector.class.
[ant:scalac] error: bad symbolic reference. A signature in CassandraConnector.class refers to type SparkConf
[ant:scalac] in package org.apache.spark which is not available.
[ant:scalac] It may be completely missing from the current classpath, or the version on
[ant:scalac] the classpath might be incompatible with the version used when compiling CassandraConnector.class.
[ant:scalac] /Users/grant/programming/ideaprojects/scalaTest/src/main/scala/com/grant/spark/PairRDDTest.scala:3: error: object SparkConf is not a member of package org.apache.spark
[ant:scalac] import org.apache.spark.SparkConf
[ant:scalac]        ^
[ant:scalac] /Users/grant/programming/ideaprojects/scalaTest/src/main/scala/com/grant/spark/PairRDDTest.scala:4: error: object SparkContext is not a member of package org.apache.spark
[ant:scalac] import org.apache.spark.SparkContext
[ant:scalac]        ^
[ant:scalac] /Users/grant/programming/ideaprojects/scalaTest/src/main/scala/com/grant/spark/PairRDDTest.scala:5: error: object rdd is not a member of package org.apache.spark
[ant:scalac] import org.apache.spark.rdd.PairRDDFunctions
[ant:scalac]                         ^
[ant:scalac] /Users/grant/programming/ideaprojects/scalaTest/src/main/scala/com/grant/spark/PairRDDTest.scala:10: error: not found: type SparkConf
[ant:scalac]         val conf = new SparkConf
[ant:scalac]                        ^
[ant:scalac] /Users/grant/programming/ideaprojects/scalaTest/src/main/scala/com/grant/spark/PairRDDTest.scala:14: error: not found: type SparkContext
[ant:scalac]         val sc = new SparkContext(conf)
[ant:scalac]                      ^
[ant:scalac] /Users/grant/programming/ideaprojects/scalaTest/src/main/scala/com/grant/spark/PairRDDTest.scala:18: error: not found: type PairRDDFunctions
[ant:scalac]         val func = new PairRDDFunctions(rdd)
[ant:scalac]                        ^
[ant:scalac] /Users/grant/programming/ideaprojects/scalaTest/src/main/scala/com/grant/spark/SparkMain.scala:3: error: object SparkConf is not a member of package org.apache.spark
[ant:scalac] import org.apache.spark.{SparkConf, SparkContext}
[ant:scalac]        ^
[ant:scalac] /Users/grant/programming/ideaprojects/scalaTest/src/main/scala/com/grant/spark/SparkMain.scala:7: error: not found: type SparkConf
[ant:scalac]       val conf = new SparkConf
[ant:scalac]                      ^
[ant:scalac] /Users/grant/programming/ideaprojects/scalaTest/src/main/scala/com/grant/spark/SparkMain.scala:11: error: not found: type SparkContext
[ant:scalac]       val sc = new SparkContext(conf)
[ant:scalac]                    ^
[ant:scalac] 16 errors found

2 answers:

Answer 0 (score: 1):

The provided/providedCompile configurations do not exist until some plugin or your build script creates them. You can use a plugin from nebula-plugins, or do it yourself like this:

configurations {
  // Create the 'provided' configuration ourselves; Gradle does not ship one.
  provided
}
sourceSets {
  main {
    // Make the provided jars visible at compile time without bundling them.
    compileClasspath += configurations.provided
  }
}
dependencies {
  provided 'org.apache.hadoop:hadoop-core:2.5.0-mr1-cdh5.3.0'
  compile ...
  testCompile 'org.apache.mrunit:mrunit:1.0.0'
}
jar {
  doFirst {
    // Copy the runtime dependencies into a lib/ folder inside the jar.
    into('lib') { from configurations.runtime }
  }
}
idea {
  module {
    // Let IntelliJ IDEA see the provided dependencies with PROVIDED scope.
    scopes.PROVIDED.plus += [configurations.provided]
  }
}

This example also copies the runtime libraries into a lib folder inside the JAR so the result can be run as a Hadoop job. For Spark, you probably want to build a shaded or "uber" JAR instead; that would bundle the compile dependencies (but not the provided ones).
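For reference, here is a minimal sketch of how that pattern might look when applied to the Spark dependencies from the question (assuming Gradle 2.x and the same versions as above; untested):

apply plugin: 'java'
apply plugin: 'scala'

// Define a 'provided' configuration, since Gradle 2.x has no built-in one.
configurations {
    provided
}

sourceSets {
    main {
        // Spark classes are visible to scalac but are not packaged into the artifact.
        compileClasspath += configurations.provided
    }
}

dependencies {
    compile 'org.scala-lang:scala-library:2.10.5'
    // The cluster supplies Spark at runtime, so mark it 'provided' here.
    provided 'org.apache.spark:spark-core_2.10:1.3.1'
    provided 'org.apache.spark:spark-streaming_2.10:1.3.1'
    compile 'com.datastax.spark:spark-cassandra-connector_2.10:1.2.0-rc3'

    testCompile group: 'junit', name: 'junit', version: '4.11'
}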

Answer 1 (score: 0):

Looking at the Gradle documentation (https://docs.gradle.org/current/userguide/artifact_dependencies_tutorial.html), I think 'runtime' means the dependency is only needed at runtime. I believe that if you switch it to 'compile' it will also be included at runtime. (The wording is a bit confusing, but if you look at the description of testCompile, it either means that everything included for tests is provided by the production code, which would be very non-standard, or it means that by default testCompile includes all of the compile elements, just as runtime does.)
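If bundling Spark into the artifact is acceptable, a minimal sketch of the change this answer suggests (using the same versions as in the question) would be:

dependencies {
    compile 'org.scala-lang:scala-library:2.10.5'
    // 'compile' dependencies are on both the compile and the runtime classpath,
    // so scalac can resolve org.apache.spark.* and the errors above go away.
    compile 'org.apache.spark:spark-core_2.10:1.3.1'
    compile 'org.apache.spark:spark-streaming_2.10:1.3.1'
    compile 'com.datastax.spark:spark-cassandra-connector_2.10:1.2.0-rc3'

    testCompile group: 'junit', name: 'junit', version: '4.11'
}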