How can I use a Scala code snippet inside a Java method? (Spark example)

Time: 2016-09-01 15:25:58

Tags: java scala

I would like to call Scala code from Java. However, I can only modify a specific part of the code (the body of the apply method in the example below). In addition, I can add JARs to the classpath.

Example

Pure Java code (works):

// system imports
import org.apache.spark.SparkContext;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.*;
import org.apache.spark.api.java.function.*;
import org.apache.spark.sql.types.*;
import org.apache.spark.sql.*;
import com.knime.bigdata.spark.core.exception.*;
import com.knime.bigdata.spark1_6.api.RowBuilder;
import com.knime.bigdata.spark1_6.jobs.scripting.java.AbstractSparkJavaSnippet;
import com.knime.bigdata.spark1_6.jobs.scripting.java.AbstractSparkJavaSnippetSource;
import com.knime.bigdata.spark1_6.jobs.scripting.java.AbstractSparkJavaSnippetSink;

// Your custom imports:

// system variables
public class SparkJavaSnippet extends AbstractSparkJavaSnippet {
    private static final long serialVersionUID = 1L;


// Your custom variables:

// expression start
    public JavaRDD<Row> apply(final JavaSparkContext sc, final JavaRDD<Row> rowRDD1, final JavaRDD<Row> rowRDD2) throws Exception {

    //*************************************************
    //Specify the fraction of data to sample and if
    //the sampling should be performed with replacement
    //*************************************************
    final double fraction = 0.5;
    final boolean withReplacement = false;  
    //final boolean withReplacement = true;


    //Returns a sample of the incoming RDD
    return rowRDD1.sample(withReplacement, fraction);

// expression end
    }
}

Partial Scala code (does not work):

// system imports
import org.apache.spark.SparkContext;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.*;
import org.apache.spark.api.java.function.*;
import org.apache.spark.sql.types.*;
import org.apache.spark.sql.*;
import com.knime.bigdata.spark.core.exception.*;
import com.knime.bigdata.spark1_6.api.RowBuilder;
import com.knime.bigdata.spark1_6.jobs.scripting.java.AbstractSparkJavaSnippet;
import com.knime.bigdata.spark1_6.jobs.scripting.java.AbstractSparkJavaSnippetSource;
import com.knime.bigdata.spark1_6.jobs.scripting.java.AbstractSparkJavaSnippetSink;

// Your custom imports:

// system variables
public class SparkJavaSnippet extends AbstractSparkJavaSnippet {
    private static final long serialVersionUID = 1L;

// Your custom variables:

// expression start
    public JavaRDD<Row> apply(final JavaSparkContext sc, final JavaRDD<Row> rowRDD1, final JavaRDD<Row> rowRDD2) throws Exception {

    //Scala code begins here
    val fraction = 0.5
    val withReplacement = false
    rowRDD1.sample(withReplacement, fraction)
    //Scala code ends here

// expression end
    }
}

Question

What do I have to write between //Scala code begins here and //Scala code ends here so that I can use Scala code there, embedded in the Java code?

I cannot change the code outside of these comments! But I can add JARs to the classpath!
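The only direction I can imagine under these constraints is a sketch like the following (the object and method names such as ScalaSampleHelper and sample are hypothetical): move the Scala logic into a separate Scala object, compile it into a JAR, and add that JAR to the classpath.

// ScalaSampleHelper.scala -- compiled separately into a JAR that is then
// added to the classpath (object and method names are hypothetical)
import org.apache.spark.api.java.JavaRDD
import org.apache.spark.sql.Row

object ScalaSampleHelper {

  // Returns a sample of the incoming RDD, mirroring the Java example above
  def sample(rowRDD: JavaRDD[Row]): JavaRDD[Row] = {
    val fraction = 0.5
    val withReplacement = false
    rowRDD.sample(withReplacement, fraction)
  }
}

Inside the apply method I would then only need plain Java, e.g. return ScalaSampleHelper.sample(rowRDD1); (a Scala object exposes its methods as static forwarders, so this compiles as ordinary Java). Is something like this the intended approach, or can Scala source be embedded directly between the comments?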

0 Answers:

No answers yet