I am trying to launch a Spark cluster on EC2 with the following command line:
python spark_ec2.py --key-pair=alexistest2 --identity-file=C:\User\Alexis\Downloads\alexistest2.pem --region=us-west-2 --instance-type=t2.medium --spark-version=1.2.0 launch my-spark-cluster
After "Waiting for all instances in cluster to enter 'ssh-ready' state", the following error occurs.
Waiting for all instances in cluster to enter 'ssh-ready' state.
Traceback (most recent call last):
  File "spark_ec2.py", line 1083, in <module>
    main()
  File "spark_ec2.py", line 1075, in main
    real_main()
  File "spark_ec2.py", line 931, in real_main
    opts=opts
  File "spark_ec2.py", line 640, in wait_for_cluster_state
    is_cluster_ssh_available(cluster_instances, opts):
  File "spark_ec2.py", line 611, in is_cluster_ssh_available
    if not is_ssh_available(host=i.ip_address, opts=opts):
  File "spark_ec2.py", line 602, in is_ssh_available
    stderr=devnull
  File "C:\Users\Alexis\Anaconda\lib\subprocess.py", line 535, in check_call
    retcode = call(*popenargs, **kwargs)
  File "C:\Users\Alexis\Anaconda\lib\subprocess.py", line 522, in call
    return Popen(*popenargs, **kwargs).wait()
  File "C:\Users\Alexis\Anaconda\lib\subprocess.py", line 710, in __init__
    errread, errwrite)
  File "C:\Users\Alexis\Anaconda\lib\subprocess.py", line 958, in _execute_child
    startupinfo)
WindowsError: [Error 2] The system cannot find the file specified
I have checked my Python folder, and the subprocess.py file does exist in "C:\Users\Alexis\Anaconda\Lib".
I have also edited spark_ec2.py to add UserKnownHostsFile=/dev/null.
Still, I get the same error. Any ideas?
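(A note on interpreting the traceback: a WindowsError [Error 2] raised inside subprocess usually means the program subprocess was asked to launch, here the ssh client that spark_ec2.py shells out to, could not be found on PATH; it does not mean subprocess.py itself is missing. A minimal sketch to check for the ssh client, assuming Python 3's shutil.which:)

```python
import shutil

def find_ssh(name="ssh"):
    """Return the full path of the ssh client, or None if it is not on PATH.

    subprocess raises [Error 2] when the executable it tries to launch is
    missing, so this checks the launched program, not subprocess.py itself.
    """
    return shutil.which(name)

if find_ssh() is None:
    print("No ssh client on PATH; install OpenSSH or add its folder to PATH.")
else:
    print("ssh found at:", find_ssh())
```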
Answer 0: (score: 1)
I think this is related to your path: you wrote "C:\User", but perhaps it should be "C:\Users".
Answer 1: (score: 0)
Could it be a case-sensitivity issue?
The script is looking for "C:\Users\Alexis\Anaconda\lib\subprocess.py",
but you found it at "C:\Users\Alexis\Anaconda\Lib\subprocess.py".
Try changing the directory name to "lib" instead of "Lib".
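(For what it's worth, the error in the traceback above can be reproduced independently of spark_ec2.py by asking subprocess to launch a program that does not exist; the program name below is made up. On Windows this surfaces as "WindowsError: [Error 2] The system cannot find the file specified", and on all platforms the underlying errno is 2:)

```python
import subprocess

def run_missing_program():
    # Launching a nonexistent command raises OSError with errno 2,
    # which Windows reports as:
    #   WindowsError: [Error 2] The system cannot find the file specified
    try:
        subprocess.call(["definitely-not-a-real-program-xyz"])
    except OSError as exc:
        return exc.errno
    return None
```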