Py4JJavaError when running pyspark on my Windows system

Asked: 2019-04-04 04:19:22

Tags: windows apache-spark pyspark

我在win10系统上运行pyspark,使用counts.saveAsTextFile("wc")时出错了
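The post does not show how counts was built, but it is presumably the result of the standard word-count exercise. A minimal sketch of that pipeline, assuming an input file named input.txt (the file name and master setting are assumptions, not from the original question):

    from pyspark import SparkContext

    # Assumed word-count pipeline; "input.txt" is a placeholder input file.
    sc = SparkContext("local[*]", "wordcount")
    lines = sc.textFile("input.txt")
    counts = (lines.flatMap(lambda line: line.split())   # split each line into words
                   .map(lambda word: (word, 1))          # pair each word with a count of 1
                   .reduceByKey(lambda a, b: a + b))     # sum the counts per word
    counts.saveAsTextFile("wc")                          # the call that raises the error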

I have already set the environment variables for JAVA, Hadoop, Scala, and winutils on my laptop.
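On Windows, saveAsTextFile typically requires winutils.exe to be reachable through HADOOP_HOME. A minimal sketch of how those variables might be set from Python before the SparkContext is created (all paths below are hypothetical and must be adjusted to the actual install locations):

    import os

    # Hypothetical paths; winutils.exe is expected under %HADOOP_HOME%\bin.
    os.environ["HADOOP_HOME"] = r"C:\hadoop"
    os.environ["JAVA_HOME"] = r"C:\Program Files\Java\jdk1.8.0_201"
    os.environ["PATH"] = os.environ["HADOOP_HOME"] + r"\bin;" + os.environ["PATH"]

These assignments only take effect for Spark if they run before the SparkContext is constructed; setting them system-wide in the Windows environment settings works as well.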

counts.saveAsTextFile("wc")
Py4JJavaError: An error occurred while calling o125.saveAsTextFile.
: org.apache.spark.SparkException: Job aborted.
at org.apache.spark.internal.io.SparkHadoopWriter$.write(SparkHadoopWriter.scala:100)
at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopDataset$1.apply$mcV$sp(PairRDDFunctions.scala:1096)
at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopDataset$1.apply(PairRDDFunctions.scala:1094)
at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopDataset$1.apply(PairRDDFunctions.scala:1094)

0 Answers:

No answers yet.