Java error after installing Apache Spark on Windows 10

Date: 2018-12-16 17:51:58

Tags: apache-spark pyspark installation

I have installed Spark on Windows 10 and am trying to run the following code:

from pyspark import SparkContext
import numpy as np

sc = SparkContext.getOrCreate()

TOTAL = 100
# TOTAL random points in the square [-1, 1) x [-1, 1)
dots = sc.parallelize([2.0 * np.random.random(2) - 1.0
                       for _ in range(TOTAL)]).cache()
print("Number of random points:", dots.count())

When I run the code above, I get the error below. I have already tried granting access and Full Control permissions on all the directories involved.

 ERROR Executor:91 - Exception in task 4.0 in stage 0.0 (TID 4)
 java.io.IOException: Cannot run program "C:\spark\spark-2.4.0-bin-hadoop2.7\python\pyspark": CreateProcess error=5, Access is denied
         at java.lang.ProcessBuilder.start(ProcessBuilder.java:1048)
         at org.apache.spark.api.python.PythonWorkerFactory.createSimpleWorker(PythonWorkerFactory.scala:155)
         at org.apache.spark.api.python.PythonWorkerFactory.create(PythonWorkerFactory.scala:97)
         at org.apache.spark.SparkEnv.createPythonWorker(SparkEnv.scala:117)
         at org.apache.spark.api.python.BasePythonRunner.compute(PythonRunner.scala:108)
         at org.apache.spark.api.python.PythonRDD.compute(PythonRDD.scala:65)
         at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
         at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
         at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
         at org.apache.spark.scheduler.Task.run(Task.scala:121)
         at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:402)
         at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
         at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:408)
         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
         at java.lang.Thread.run(Thread.java:748)
 Caused by: java.io.IOException: CreateProcess error=5, Access is denied
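
For context: a CreateProcess error=5 pointing at the pyspark *directory* (rather than at a python.exe) usually means Spark could not resolve a Python interpreter for its worker processes, so changing directory permissions alone may not help. A minimal sketch of pinning the interpreter explicitly via Spark's PYSPARK_PYTHON / PYSPARK_DRIVER_PYTHON environment variables follows; it assumes "python" resolves to python.exe on PATH, which may need adjusting for a given install.

import os

# Tell Spark which Python interpreter to launch for the driver and workers.
# Assumption: "python" resolves to python.exe on PATH; otherwise substitute
# a full path such as r"C:\Python37\python.exe".
os.environ["PYSPARK_PYTHON"] = "python"
os.environ["PYSPARK_DRIVER_PYTHON"] = "python"

from pyspark import SparkContext

sc = SparkContext.getOrCreate()

These variables need to be set before the SparkContext is created, since Spark reads them when it launches the Python worker.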

0 Answers