java.lang.IllegalArgumentException: port out of range:458964785

Time: 2016-03-20 18:53:30

Tags: java python pyspark

I'm just trying to run something simple from the command line, like:

lines = sc.textFile('shakespeare.txt')
lines.count()
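
For reference, here is the same thing as a standalone script with an explicit SparkContext; a minimal sketch, where the script name is made up and local[2] just mirrors the (0 + 2) / 2 tasks visible in the log below:

    # count_lines.py - minimal standalone version of the shell session above
    from pyspark import SparkContext

    sc = SparkContext('local[2]', 'LineCount')  # explicit context instead of the shell's sc
    lines = sc.textFile('shakespeare.txt')      # build an RDD of lines from the text file
    print(lines.count())                        # the action that launches the Python workers
    sc.stop()

Run it with spark-submit count_lines.py.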

I got this traceback:

[Stage 0:>                                                          (0 + 2) / 2]16/03/20 12:51:38 ERROR Executor: Exception in task 0.0 in stage 0.0 (TID 0)
java.lang.IllegalArgumentException: port out of range:458964785
    at java.net.InetSocketAddress.checkPort(InetSocketAddress.java:143)
    at java.net.InetSocketAddress.<init>(InetSocketAddress.java:188)
    at java.net.Socket.<init>(Socket.java:244)
    at org.apache.spark.api.python.PythonWorkerFactory.createSocket$1(PythonWorkerFactory.scala:76)
    at org.apache.spark.api.python.PythonWorkerFactory.liftedTree1$1(PythonWorkerFactory.scala:91)
    at org.apache.spark.api.python.PythonWorkerFactory.createThroughDaemon(PythonWorkerFactory.scala:90)
    at org.apache.spark.api.python.PythonWorkerFactory.create(PythonWorkerFactory.scala:63)
    at org.apache.spark.SparkEnv.createPythonWorker(SparkEnv.scala:134)
    at org.apache.spark.api.python.PythonRunner.compute(PythonRDD.scala:101)
    at org.apache.spark.api.python.PythonRDD.compute(PythonRDD.scala:70)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
    at org.apache.spark.scheduler.Task.run(Task.scala:89)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:213)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
16/03/20 12:51:38 ERROR Executor: Exception in task 1.0 in stage 0.0 (TID 1)
java.lang.IllegalArgumentException: port out of range:458964785
    at java.net.InetSocketAddress.checkPort(InetSocketAddress.java:143)
    at java.net.InetSocketAddress.<init>(InetSocketAddress.java:188)
    at java.net.Socket.<init>(Socket.java:244)
    at org.apache.spark.api.python.PythonWorkerFactory.createSocket$1(PythonWorkerFactory.scala:76)
    at org.apache.spark.api.python.PythonWorkerFactory.liftedTree1$1(PythonWorkerFactory.scala:91)
    at org.apache.spark.api.python.PythonWorkerFactory.createThroughDaemon(PythonWorkerFactory.scala:90)
    at org.apache.spark.api.python.PythonWorkerFactory.create(PythonWorkerFactory.scala:63)
    at org.apache.spark.SparkEnv.createPythonWorker(SparkEnv.scala:134)
    at org.apache.spark.api.python.PythonRunner.compute(PythonRDD.scala:101)
    at org.apache.spark.api.python.PythonRDD.compute(PythonRDD.scala:70)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
    at org.apache.spark.scheduler.Task.run(Task.scala:89)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:213)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
034h16/03/20 12:51:38 WARN TaskSetManager: Lost task 1.0 in stage 0.0 (TID 1, localhost): java.lang.IllegalArgumentException: port out of range:458964785
    at java.net.InetSocketAddress.checkPort(InetSocketAddress.java:143)
    at java.net.InetSocketAddress.<init>(InetSocketAddress.java:188)
    at java.net.Socket.<init>(Socket.java:244)
    at org.apache.spark.api.python.PythonWorkerFactory.createSocket$1(PythonWorkerFactory.scala:76)
    at org.apache.spark.api.python.PythonWorkerFactory.liftedTree1$1(PythonWorkerFactory.scala:91)
    at org.apache.spark.api.python.PythonWorkerFactory.createThroughDaemon(PythonWorkerFactory.scala:90)
    at org.apache.spark.api.python.PythonWorkerFactory.create(PythonWorkerFactory.scala:63)
    at org.apache.spark.SparkEnv.createPythonWorker(SparkEnv.scala:134)
    at org.apache.spark.api.python.PythonRunner.compute(PythonRDD.scala:101)
    at org.apache.spark.api.python.PythonRDD.compute(PythonRDD.scala:70)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
    at org.apache.spark.scheduler.Task.run(Task.scala:89)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:213)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)

16/03/20 12:51:38 ERROR TaskSetManager: Task 1 in stage 0.0 failed 1 times; aborting job
Traceback (most recent call last):
  Fi

It was running fine, and then I suddenly started getting this error in both IPython and plain PySpark. Would a fresh install of Spark fix this, or what do I need to do?
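
One clue is the bogus port value itself. The failing frame, PythonWorkerFactory.createThroughDaemon, is (as far as I can tell from the Spark source) where the JVM starts the pyspark daemon and reads the daemon's listening port as the first four bytes of its stdout. Decoding 458964785 as those four bytes, with nothing but the standard library:

    # decode_port.py - interpret the bogus port as the 4 raw bytes the JVM actually read
    import struct

    port = 458964785
    raw = struct.pack('>i', port)  # big-endian signed 32-bit int, as Java's readInt() sees it
    print(repr(raw))               # '\x1b[?1' - ESC [ ? 1

That is the start of the ANSI escape sequence \x1b[?1034h, and the stray 034h glued onto the WARN line in the log above looks like its tail. If that reading is right, something in my Python startup (readline/ncurses is known to emit \x1b[?1034h under some TERM settings, e.g. when a PYTHONSTARTUP file or sitecustomize imports readline) is writing to stdout before the daemon reports its port, so a fresh Spark install alone might not change anything.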

0 Answers:

There are no answers