Exception: ERROR SparkContext - Error initializing local SparkContext. java.net.BindException

Date: 2016-07-24 18:22:18

Tags: scala testing intellij-idea apache-spark

I am trying to write a test for my Spark application, but I get the following exception when I try to run this test:

    import org.apache.spark.{SparkConf, SparkContext}
    import org.junit.Test

    class BasicIT {

      val sparkConf: SparkConf = new SparkConf().setAppName("basic.phase.it").setMaster("local[1]")
      var context: SparkContext = new SparkContext(sparkConf)

      @Test
      def myTest(): Unit = {
        print("test")
      }
    }

It fails with this exception:

2016-07-24 21:04:39,956 [main,95] ERROR SparkContext - Error initializing SparkContext.
java.net.BindException: Can't assign requested address: Service 'sparkDriver' failed after 16 retries!
    at sun.nio.ch.Net.bind0(Native Method)
    at sun.nio.ch.Net.bind(Net.java:433)
    at sun.nio.ch.Net.bind(Net.java:425)
    at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
    at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
    at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:125)
    at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:485)
    at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1089)
    at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:430)
    at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:415)
    at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:903)
    at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:198)
    at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:348)
    at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:357)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357)
    at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
    at java.lang.Thread.run(Thread.java:745)


I am currently using IntelliJ on OS X Yosemite.

What am I doing wrong? This same code used to work.

3 Answers:

Answer 0 (score: 3):

Try adding export SPARK_LOCAL_IP="127.0.0.1" to load-spark-env.sh, or just set SPARK_LOCAL_IP="127.0.0.1" before running your Spark application. It worked for me.
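This BindException typically means the driver tried to bind to whatever address the machine's hostname resolves to, and that address is not currently assignable (common after switching networks on a laptop). As a quick diagnostic, here is a minimal sketch, using only the standard java.net API, that prints what the JVM resolves as the local host:

    import java.net.InetAddress

    object HostCheck {
      def main(args: Array[String]): Unit = {
        val local = InetAddress.getLocalHost
        // If this prints an address you can no longer bind to (for example a
        // stale address from a previous network), forcing
        // SPARK_LOCAL_IP=127.0.0.1 works around the BindException.
        println(s"hostname = ${local.getHostName}, address = ${local.getHostAddress}")
      }
    }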

Answer 1 (score: 2):

Try setting spark.driver.host to localhost in your SparkConf:

    SparkConf conf = new SparkConf().setMaster("local[2]")
                                    .setAppName("AnyName")
                                    .set("spark.driver.host", "localhost");
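Since the question's test is in Scala, here is the same fix sketched against the BasicIT class from the question (assuming the JUnit setup shown above):

    import org.apache.spark.{SparkConf, SparkContext}
    import org.junit.Test

    class BasicIT {

      // Pinning the driver host to localhost bypasses the hostname
      // resolution that causes the 'sparkDriver' bind failure.
      val sparkConf: SparkConf = new SparkConf()
        .setAppName("basic.phase.it")
        .setMaster("local[1]")
        .set("spark.driver.host", "localhost")

      val context: SparkContext = new SparkContext(sparkConf)

      @Test
      def myTest(): Unit = {
        print("test")
      }
    }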

Answer 2 (score: 0):

You may also have log output telling you that the UI port specified by your configuration is already in use. If that is the case, explicitly set spark.ui.port to a value you know will be a free port on the machine running the driver. Note that when a given port is unavailable, Spark retries with incrementally higher port numbers.

Example:

    val sparkConf = new SparkConf().setAppName("basic.phase.it")
                                   .setMaster("local[1]")
                                   .set("spark.ui.port", "4080")