Getting java.net.ConnectException when running the word count program in Spark Streaming

Date: 2018-03-09 18:55:55

Tags: scala apache-spark spark-streaming

I am trying to run a word count program in Spark Streaming, but I am getting the error shown below. I am using nc -lk 9999 as the source.

    import org.apache.spark._
    import org.apache.spark.streaming._

    object wordcount {

      def main(args: Array[String]): Unit = {

        // Run locally with two threads: one for the receiver, one for processing
        val conf = new SparkConf().setMaster("local[2]").setAppName("WordCount")
        val ssc = new StreamingContext(conf, Seconds(5))

        // Read lines from the netcat server listening on localhost:9999
        val lines = ssc.socketTextStream("localhost", 9999)

        val words = lines.flatMap(_.split(" "))
        val pairs = words.map(word => (word, 1))
        val wordcount = pairs.reduceByKey(_ + _)

        wordcount.print()

        ssc.start()
        ssc.awaitTermination()
      }
    }


    WARN ReceiverSupervisorImpl: Restarting receiver with delay 2000 ms: Error connecting to localhost:9999
    java.net.ConnectException: Connection refused: connect
        at java.net.DualStackPlainSocketImpl.connect0(Native Method)
        at java.net.DualStackPlainSocketImpl.socketConnect(DualStackPlainSocketImpl.java:79)
        at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
        at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
        at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
        at java.net.PlainSocketImpl.connect(PlainSocketImpl.java:172)
        at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
        at java.net.Socket.connect(Socket.java:589)
        at java.net.Socket.connect(Socket.java:538)
        at java.net.Socket.<init>(Socket.java:434)
        at java.net.Socket.<init>(Socket.java:211)
        at org.apache.spark.streaming.dstream.SocketReceiver.receive(SocketInputDStream.scala:73)
        at org.apache.spark.streaming.dstream.SocketReceiver$$anon$2.run(SocketInputDStream.scala:59)

1 Answer:

Answer 0 (score: 0):

I have run into this problem before. This exception occurs when Spark tries to connect to port 9999 but the netcat server has not been started yet. So before connecting Spark to the port, make sure your netcat server is running on 9999.

You can also check this answer for the solution.
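
A minimal way to make the root cause obvious, assuming the same localhost:9999 socket source as in the question, is to probe the port with a plain java.net.Socket before creating the StreamingContext. This is only a sketch: the `portIsOpen` helper is illustrative and not part of Spark's API. It fails fast with a clear message instead of letting the receiver restart every 2000 ms:

    import java.net.{InetSocketAddress, Socket}

    // Hypothetical pre-flight check: returns true if something accepts
    // TCP connections on host:port within the given timeout.
    def portIsOpen(host: String, port: Int, timeoutMs: Int = 1000): Boolean = {
      val socket = new Socket()
      try {
        socket.connect(new InetSocketAddress(host, port), timeoutMs)
        true
      } catch {
        case _: java.io.IOException => false
      } finally {
        socket.close()
      }
    }

    // Call this before building the StreamingContext / calling ssc.start():
    if (!portIsOpen("localhost", 9999))
      sys.error("Nothing is listening on localhost:9999 -- start `nc -lk 9999` first")

In other words: open one terminal and run `nc -lk 9999`, and only then start the Spark Streaming application in a second terminal.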