Trying to make the DataStax Spark Master listen on a public IP

Asked: 2015-01-06 09:13:10

Tags: apache-spark datastax

I want to submit Spark jobs from a Java client running on a different host than the Spark master. I tried setting SPARK_MASTER_IP and SPARK_LOCAL_IP in spark-env.sh to the public IP, but the Spark master still binds to 127.0.0.1. How can I bind the Spark master to my public IP?

Update 2015-01-14:
My spark-env.sh file is available on pastebin.
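For reference, a minimal sketch of the spark-env.sh settings being described (the address 10.76.28.117 is taken from the ifconfig output below; substitute your own host's address):

```shell
# spark-env.sh — illustrative sketch, not the poster's actual file.
# 10.76.28.117 is the ens32 address from the ifconfig output below.
export SPARK_MASTER_IP=10.76.28.117   # address the master advertises and binds to
export SPARK_LOCAL_IP=10.76.28.117    # address this node's Spark daemons bind to
```

If the master still binds to 127.0.0.1 with these set, a common culprit is the hostname resolving to a loopback address in /etc/hosts.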

Output of ifconfig on the Spark master:

ens32: flags=4163<UP,BROADCAST,RUNNING,MULTICAST>  mtu 1500
    inet 10.76.28.117  netmask 255.255.254.0  broadcast 10.76.29.255
    inet6 fe80::250:56ff:fe87:d94  prefixlen 64  scopeid 0x20<link>
    ether 00:50:56:87:0d:94  txqueuelen 1000  (Ethernet)
    RX packets 4392194  bytes 1030005387 (982.2 MiB)
    RX errors 0  dropped 2828  overruns 0  frame 0
    TX packets 556316  bytes 114056485 (108.7 MiB)
    TX errors 0  dropped 0 overruns 0  carrier 0  collisions 0
lo: flags=73<UP,LOOPBACK,RUNNING>  mtu 65536
    inet 127.0.0.1  netmask 255.0.0.0
    inet6 ::1  prefixlen 128  scopeid 0x10<host>
    loop  txqueuelen 0  (Local Loopback)
    RX packets 810654  bytes 602987986 (575.0 MiB)
    RX errors 0  dropped 0  overruns 0  frame 0
    TX packets 810654  bytes 602987986 (575.0 MiB)
    TX errors 0  dropped 0 overruns 0  carrier 0  collisions 0

1 answer:

Answer 0 (score: 0)

I had set rpc_address in cassandra.yaml to "0.0.0.0". After changing it to the node's real IP address, it worked.
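A sketch of that change in cassandra.yaml, using the ens32 address from the ifconfig output above as the illustrative node address:

```yaml
# cassandra.yaml — before: wildcard bind, which caused the problem here
# rpc_address: 0.0.0.0

# after: the node's real address (the ens32 address from ifconfig above)
rpc_address: 10.76.28.117
```

Note that Cassandra requires restarting the node for rpc_address changes to take effect.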