Apache Spark - JavaSparkContext cannot be converted to SparkContext error

Asked: 2015-09-21 11:12:15

Tags: java apache-spark

I am having considerable difficulty converting the Spark examples into runnable code (as my previous question here demonstrates).

The answer provided there helped me understand that particular example, but now I am trying the Multilayer Perceptron example and ran into an error right away.

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.*;
import org.apache.spark.api.java.function.Function;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.ml.classification.MultilayerPerceptronClassificationModel;
import org.apache.spark.ml.classification.MultilayerPerceptronClassifier;
import org.apache.spark.ml.evaluation.MulticlassClassificationEvaluator;
import org.apache.spark.ml.param.ParamMap;
import org.apache.spark.mllib.regression.LabeledPoint;
import org.apache.spark.mllib.util.MLUtils;
import org.apache.spark.mllib.linalg.Vectors;
import org.apache.spark.sql.DataFrame;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SQLContext;

public class SimpleANN {
  public static void main(String[] args) {
    // Load training data
    String path = "file:/usr/local/share/spark-1.5.0/data/mllib/sample_multiclass_classification_data.txt";
    SparkConf conf = new SparkConf().setAppName("Simple ANN");
    JavaSparkContext sc = new JavaSparkContext(conf);
    JavaRDD<LabeledPoint> data = MLUtils.loadLibSVMFile(sc, path).toJavaRDD();
  ...
  ...
  }
}

I get the following error:

[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.1:compile (default-compile) on project simple-ann: Compilation failure
[ERROR] /Users/robo/study/spark/ann/src/main/java/SimpleANN.java:[23,61] incompatible types: org.apache.spark.api.java.JavaSparkContext cannot be converted to org.apache.spark.SparkContext

1 Answer:

Answer 0 (score: 4)

If you need a SparkContext from your JavaSparkContext, you can use the static method:

JavaSparkContext.toSparkContext(yourJavaSparkContext)

So you have to modify your code from:
JavaSparkContext sc = new JavaSparkContext(conf);
JavaRDD<LabeledPoint> data = MLUtils.loadLibSVMFile(sc, path).toJavaRDD();

to:

JavaSparkContext sc = new JavaSparkContext(conf);
JavaRDD<LabeledPoint> data = MLUtils.loadLibSVMFile(
    JavaSparkContext.toSparkContext(sc),
    path).toJavaRDD();
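
Putting the fix into the full program, a minimal corrected sketch might look like the following. It assumes the Spark 1.5 jars are on the classpath and that the sample data file exists at the path used in the question; the file path and app name are taken from the original code, and the `count()` printout at the end is only illustrative. Note that `jsc.sc()` is an equivalent instance-method way to obtain the underlying `SparkContext`.

```java
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.mllib.regression.LabeledPoint;
import org.apache.spark.mllib.util.MLUtils;

public class SimpleANN {
  public static void main(String[] args) {
    // Path to the bundled sample data; adjust for your installation.
    String path = "file:/usr/local/share/spark-1.5.0/data/mllib/sample_multiclass_classification_data.txt";
    SparkConf conf = new SparkConf().setAppName("Simple ANN");
    JavaSparkContext jsc = new JavaSparkContext(conf);

    // MLUtils.loadLibSVMFile expects the Scala SparkContext, so unwrap the
    // Java wrapper first. JavaSparkContext.toSparkContext(jsc) and jsc.sc()
    // both return the underlying SparkContext.
    JavaRDD<LabeledPoint> data =
        MLUtils.loadLibSVMFile(JavaSparkContext.toSparkContext(jsc), path).toJavaRDD();

    System.out.println("Loaded " + data.count() + " labeled points");
    jsc.stop();
  }
}
```

This compile error is common when mixing the Java API (`org.apache.spark.api.java`) with MLlib utilities that were written against the Scala core types: the Java wrapper classes are not subclasses of their Scala counterparts, so an explicit unwrap is required.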