Spark-submit does not run the code the way IntelliJ does

Asked: 2019-06-20 05:48:57

Tags: windows apache-spark spark-submit

The code below runs fine in IntelliJ and prints the output. When I try to run it with the spark-submit command:

spark-submit --class com.sohail.popular_movies_pkg C:\spark\bin\popular_movies_pkg.jar

it just terminates after a warning, and nothing is printed to the console. Am I doing something wrong, or do I need to include something else?

C:\spark\bin>spark-submit --class com.sohail.popular_movies_pkg  popular_movies_pkg.jar
19/06/20 01:42:55 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
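As a diagnostic step (my own suggestion, not something from the post), spark-submit can be re-run with its `--verbose` flag and an explicit `--master`, so the launcher echoes its parsed arguments and any class-loading problems instead of exiting silently:

```shell
REM Hypothetical re-run from the same C:\spark\bin prompt.
REM --verbose makes spark-submit print its resolved configuration;
REM an explicit --master rules out a missing default master setting.
spark-submit --verbose --master "local[*]" ^
  --class com.sohail.popular_movies_pkg ^
  popular_movies_pkg.jar
```

If the class name is wrong or missing from the jar, this mode surfaces the error on the console rather than terminating quietly.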
package com.sohail

/** Find the movies with the most ratings. */

import org.apache.spark._
import org.apache.spark.SparkContext._
import org.apache.log4j._

object popular_movies_pkg {

  def main(args: Array[String]): Unit = {

    System.setProperty("hadoop.home.dir", "C:\\winutils\\")

    // Set the log level to only print errors
    Logger.getLogger("org").setLevel(Level.ERROR)

    // Create a SparkContext using every core of the local machine
    val sc = new SparkContext("local[*]", "popular_movies_pkg")

    // Read in each rating line
    val lines = sc.textFile("C:\\spark\\bin\\u.data")

    // Data format: user id \t movie id \t rating \t timestamp
    val movie_rating_map = lines.map(x => (x.split("\t")(1).toInt, 1))

    // Count the number of ratings per movie id
    val movie_rating_count = movie_rating_map.reduceByKey((x, y) => x + y)

    // Flip to (count, movie id) so we can sort by count
    val flip = movie_rating_count.map(x => (x._2, x._1))

    // Sort descending by rating count and print
    flip.sortByKey(false).collect().foreach(println)


  }

}
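To separate "the logic is wrong" from "spark-submit is not running the job", the same counting-and-sorting pipeline can be sketched in plain Scala with no Spark dependency (a sanity check of my own; the sample rows below are hypothetical `u.data` lines, and `PopularMoviesSketch` is not a name from the question):

```scala
// A minimal, Spark-free sketch of the same transformation pipeline:
// extract movie id, count occurrences, flip to (count, movieId), sort descending.
object PopularMoviesSketch {
  def countAndSort(lines: Seq[String]): Seq[(Int, Int)] =
    lines
      .map(_.split("\t")(1).toInt)                    // movie id is column 1
      .groupBy(identity)                              // movieId -> all occurrences
      .toSeq
      .map { case (movie, xs) => (xs.size, movie) }   // flip to (count, movieId)
      .sortBy(-_._1)                                  // descending by count

  def main(args: Array[String]): Unit = {
    // Hypothetical u.data rows: user \t movie \t rating \t timestamp
    val sample = Seq(
      "196\t242\t3\t881250949",
      "186\t302\t3\t891717742",
      "22\t242\t1\t878887116"
    )
    countAndSort(sample).foreach(println) // prints (2,242) then (1,302)
  }
}
```

If this prints the expected pairs but the spark-submit run stays silent, the problem is in the submission environment (jar contents, class name, input path) rather than in the transformations themselves.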

0 Answers:

There are no answers.