Spark shell unresponsive

Asked: 2020-06-11 17:20:00

Tags: docker apache-spark

I started a Spark cluster with docker-compose, using the following yml file:

spark-master:
  image: bde2020/spark-master:2.4.5-hadoop2.7
  container_name: spark-master
  ports:
    - "8080:8080"
    - "7077:7077"
  environment:
    - INIT_DAEMON_STEP=setup_spark
    - "constraint:node==<yourmasternode>"
spark-worker-1:
  image: bde2020/spark-worker:2.4.5-hadoop2.7
  container_name: spark-worker-1
  depends_on:
    - spark-master
  ports:
    - "8081:8081"
  environment:
    - "SPARK_MASTER=spark://spark-master:7077"
    - "constraint:node==<yourworkernode>"

Both spark-master and spark-worker-1 start correctly.

After that, I wanted to connect to the Spark shell. To do so, I typed the following command:

sudo docker exec -t spark-master ./spark/bin/spark-shell

The shell starts, but nothing happens when I type a command, for example:

scala> :help

I waited a long time, but pressing Enter does nothing.

What is the problem?
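For reference, one thing worth checking (an assumption, not a confirmed diagnosis): the command above passes only `-t` to `docker exec`, which allocates a pseudo-TTY but does not attach stdin, so keystrokes never reach the REPL. An interactive attach would normally use both `-i` and `-t`:

```shell
# Hedged sketch: attach both stdin (-i) and a TTY (-t) so the
# spark-shell REPL actually receives what you type.
# Container name and path are taken from the question above.
sudo docker exec -it spark-master ./spark/bin/spark-shell
```

With `-i` added, input typed at the `scala>` prompt is forwarded to the process inside the container; without it, the prompt renders but the shell never sees the keystrokes.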

0 Answers