Spark: why can't streaming connect to a Java socket client

Asked: 2014-04-17 08:42:44

Tags: apache-spark

I am working on Spark Streaming to process real-time data. I built the Spark Streaming word-count example, and I can run it like this:

    ./bin/run-example org.apache.spark.streaming.examples.JavaNetworkWordCount local[2] localhost 9999

I run "nc -L -p 9999" in another terminal, then I can type letters into that terminal, and the example receives them and prints the correct counts.

But I then developed a Java socket client to send content to port 9999, and the example receives nothing. I thought the example simply watches port 9999 and receives whatever arrives on that port, so why can't it receive from my client?
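For reference, the receiving side of the example boils down to roughly the following (my paraphrase of JavaNetworkWordCount; the app name and batch interval here are illustrative, not copied from the example):

    import org.apache.spark.streaming.Duration;
    import org.apache.spark.streaming.api.java.JavaDStream;
    import org.apache.spark.streaming.api.java.JavaStreamingContext;

    // Illustrative sketch of the receiving side (0.9-era streaming API).
    JavaStreamingContext ssc = new JavaStreamingContext(
            "local[2]", "JavaNetworkWordCount", new Duration(1000));

    // reads text lines from localhost:9999
    JavaDStream<String> lines = ssc.socketTextStream("localhost", 9999);

    lines.print();
    ssc.start();
    ssc.awaitTermination();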

Here is my Java client code:

    // Declarations (omitted in the original post) so the snippet compiles:
    File file = new File("D:\\OutputJson.dat");
    long l = file.length();
    long sumL = 0;
    boolean bool = false;
    Socket socket = new Socket();
    boolean connected = false;
    while (!connected) {
        // keep retrying until the connection succeeds
        try {
            socket.connect(new InetSocketAddress("localhost", 9999));
            connected = true;
            System.out.println("connected success!");
        } catch (Exception e) {
            e.printStackTrace();
            System.out.println("connected failed!");
            Thread.sleep(5000);
        }
    }
    DataOutputStream dos = new DataOutputStream(socket.getOutputStream());
    FileInputStream fis = new FileInputStream(file);
    byte[] sendBytes = new byte[1024];
    int length;
    while ((length = fis.read(sendBytes, 0, sendBytes.length)) > 0) {
        sumL += length;
        // multiply before dividing: (sumL / l) * 100 in long arithmetic
        // truncates to 0% until the final chunk
        System.out.println("sent:" + ((sumL * 100) / l) + "%");
        dos.write(sendBytes, 0, length);
        dos.flush();
    }
    if (sumL == l) {
        bool = true; // the whole file was sent
    }

This Java code always fails with the error: java.net.SocketException: Socket closed

I developed another Java class to receive data from this sending socket, and it works fine. Why can't Spark receive the data?

1 answer:

Answer 0 (score: -1)

From memory, I believe I used a ServerSocket. The code was similar to:

    // Runs a server on the given port and writes msg to the first client
    // that connects -- here, Spark's socket receiver is that client.
    // (port is a parameter here; the original snippet left it undeclared.)
    public void sendMsg(String msg, int port) throws IOException {
        ServerSocket serverSocket = null;
        Socket clientSocket = null;
        try {
            serverSocket = new ServerSocket(port);
            // block until the Spark receiver connects
            clientSocket = serverSocket.accept();
            PrintWriter out = new PrintWriter(clientSocket.getOutputStream(), true);
            out.write(msg);
            out.flush();
            out.close();
        } finally {
            // null-check before closing: accept() may never have returned
            if (clientSocket != null) {
                clientSocket.close();
            }
            if (serverSocket != null) {
                serverSocket.close();
            }
        }
    }
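The point is that Spark's socketTextStream acts as a TCP client: it connects out to the given host and port, which is why the example works with "nc -L -p 9999" (nc listens as a server) but not with a second client, since two clients cannot connect to each other. To match the question's use case (streaming a whole file rather than a single message), a sketch along the same lines might look like this; the port and file path are taken from the question, everything else is illustrative, and Java 7+ try-with-resources is assumed. Start this server first, then launch the streaming example:

    import java.io.BufferedOutputStream;
    import java.io.FileInputStream;
    import java.io.IOException;
    import java.io.OutputStream;
    import java.net.ServerSocket;
    import java.net.Socket;

    // Hypothetical server that feeds a file to Spark's socket receiver.
    public class FileServer {
        public static void main(String[] args) throws IOException {
            // listen on 9999 and block until the Spark receiver connects
            try (ServerSocket serverSocket = new ServerSocket(9999);
                 Socket clientSocket = serverSocket.accept();
                 OutputStream out = new BufferedOutputStream(clientSocket.getOutputStream());
                 FileInputStream fis = new FileInputStream("D:\\OutputJson.dat")) {
                byte[] buffer = new byte[1024];
                int length;
                while ((length = fis.read(buffer)) > 0) {
                    out.write(buffer, 0, length);
                }
                out.flush();
            }
        }
    }

Note that socketTextStream splits the incoming bytes into newline-delimited UTF-8 lines, so the records in the file should be line-oriented for the word count to see them as separate inputs.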