Gradle: Setting up a Scala project with Apache Spark in Eclipse

Date: 2016-10-21 19:55:48

Tags: eclipse scala apache-spark gradle

I am unable to set up a Scala project with an Apache Spark dependency in Eclipse, using the Scala IDE plugin and the Gradle plugin. The project looks like this:

build.gradle

    apply plugin: 'scala'
    apply plugin: 'eclipse'

    repositories {
        mavenCentral()
        mavenLocal()
    }

    dependencies {
        compile 'org.slf4j:slf4j-api:1.7.5'
        compile 'org.scala-lang:scala-library:2.11.2'
        compile 'com.sparkjava:spark-core:2.3'
        testCompile 'junit:junit:4.11'
    }

    task run(type: JavaExec, dependsOn: classes) {
        main = 'Main'
        classpath sourceSets.main.runtimeClasspath
        classpath configurations.runtime
    }

Under Referenced Libraries in Eclipse I can see spark-core-2.3.jar, but I cannot import any Spark libraries into my Scala classes. I did try running the gradle eclipse task to regenerate the project files, but no luck.
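For illustration, here is a minimal Scala class (hypothetical file and names) showing the kind of import that fails to resolve with the build script above, since com.sparkjava:spark-core does not contain the org.apache.spark packages:

    // Main.scala -- hypothetical minimal example.
    // With the build.gradle above, this import cannot be resolved,
    // because com.sparkjava:spark-core is not Apache Spark.
    import org.apache.spark.{SparkConf, SparkContext}

    object Main {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf().setAppName("example").setMaster("local[*]")
        val sc = new SparkContext(conf)
        println(sc.parallelize(1 to 10).sum()) // quick smoke test: prints 55.0
        sc.stop()
      }
    }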

1 Answer:

Answer 0 (score: 3)

You are referencing the wrong dependency: com.sparkjava:spark-core:2.3 belongs to a different project (the Spark web framework). For Apache Spark you should include instead:

compile 'org.apache.spark:spark-core_2.11:2.0.1'

This uses the latest stable release at the time of writing (2.0.1). Note that the _2.11 suffix is Spark's Scala binary version and must match the scala-library version in the build (2.11.x here).
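As a sketch, the dependencies block of the build.gradle above would then look like this (same versions as in the question and this answer; adjust the _2.11 suffix and Spark version to your setup):

    dependencies {
        compile 'org.slf4j:slf4j-api:1.7.5'
        compile 'org.scala-lang:scala-library:2.11.2'
        // Apache Spark core, cross-built for Scala 2.11; replaces com.sparkjava:spark-core
        compile 'org.apache.spark:spark-core_2.11:2.0.1'
        testCompile 'junit:junit:4.11'
    }

After changing the dependency, regenerate the Eclipse project metadata (for example with the gradle eclipse task, since the eclipse plugin is applied) and refresh the project so that Referenced Libraries picks up the new jar.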