When starting spark-shell, the following error occurs.
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
**18/04/25 07:18:41 WARN StandaloneAppClient$ClientEndpoint: Failed to connect to master 10.250.54.201:7077**
org.apache.spark.SparkException: Exception thrown in awaitResult:
at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:205)
at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75)
at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:101)
at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:109)
Answer 0 (score: 0)
Try the following steps to resolve the problem:

1. In a command prompt, go to the `%SPARK_HOME%\bin` folder.
2. Run `spark-class org.apache.spark.deploy.master.Master` to start the master. This prints a URL of the form `spark://ip:port`.
3. Run `spark-class org.apache.spark.deploy.worker.Worker spark://ip:port` to start a worker. Make sure to use the URL obtained in step 2.
4. Run `spark-shell --master spark://ip:port` to connect the application to the newly created cluster.
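The steps above can be sketched as a single script (a sketch only: the IP/port are placeholders taken from the error message, and `SPARK_HOME` is assumed to be set; on Windows, use `%SPARK_HOME%\bin` and drop the `./` prefixes):

```shell
#!/bin/sh
# Sketch of the manual standalone-cluster startup described above.
cd "$SPARK_HOME/bin" || exit 1

# Step 2: start the master in the background; its log prints the
# master URL, e.g. "Starting Spark master at spark://10.250.54.201:7077"
./spark-class org.apache.spark.deploy.master.Master &

# Replace with the URL actually printed by the master in step 2.
MASTER_URL="spark://10.250.54.201:7077"

# Step 3: start a worker that registers with that master.
./spark-class org.apache.spark.deploy.worker.Worker "$MASTER_URL" &

# Step 4: connect spark-shell to the newly created cluster.
./spark-shell --master "$MASTER_URL"
```

If the worker or shell still fails to connect, verify that the URL matches the one the master actually logged (hostname vs. IP mismatches are a common cause of `Failed to connect to master`).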