I have a W7 machine that I use for daily work. My company also has an air-gapped Hadoop cluster on a private cloud. I can only reach the cloud through PuTTY. When I want to use Spark on the cluster, I start PuTTY and do one of the following two things:
Is there a way to connect to Spark from my local W7 IPython Notebook?
Edit (with the error) after Daniel Darabos's comment
I installed Spark locally on my W7 machine following this tutorial. I then created a new pyspark profile and changed the startup file following this tutorial. At that point I can start IPython locally and successfully create a Spark context. But when I run:
sc.stop()
conf = SparkConf().setAppName('SPark Test').setMaster('localhost:7077')
sc = SparkContext(conf=conf)
I get this error:
---------------------------------------------------------------------------
Py4JJavaError Traceback (most recent call last)
<ipython-input-15-1e8f5b112924> in <module>()
1 sc.stop()
2 conf = SparkConf().setAppName('SPark Test').setMaster('localhost:7077')
----> 3 sc = SparkContext(conf=conf)
C:\Spark\python\pyspark\context.pyc in __init__(self, master, appName, sparkHome, pyFiles, environment, batchSize, serializer, conf, gateway, jsc, profiler_cls)
111 try:
112 self._do_init(master, appName, sparkHome, pyFiles, environment, batchSize, serializer,
--> 113 conf, jsc, profiler_cls)
114 except:
115 # If an error occurs, clean up in order to allow future SparkContext creation:
C:\Spark\python\pyspark\context.pyc in _do_init(self, master, appName, sparkHome, pyFiles, environment, batchSize, serializer, conf, jsc, profiler_cls)
168
169 # Create the Java SparkContext through Py4J
--> 170 self._jsc = jsc or self._initialize_context(self._conf._jconf)
171
172 # Create a single Accumulator in Java that we'll send all our updates through;
C:\Spark\python\pyspark\context.pyc in _initialize_context(self, jconf)
222 Initialize SparkContext in function to allow subclass specific initialization
223 """
--> 224 return self._jvm.JavaSparkContext(jconf)
225
226 @classmethod
C:\Spark\python\lib\py4j-0.8.2.1-src.zip\py4j\java_gateway.py in __call__(self, *args)
699 answer = self._gateway_client.send_command(command)
700 return_value = get_return_value(answer, self._gateway_client, None,
--> 701 self._fqn)
702
703 for temp_arg in temp_args:
C:\Spark\python\pyspark\sql\utils.pyc in deco(*a, **kw)
34 def deco(*a, **kw):
35 try:
---> 36 return f(*a, **kw)
37 except py4j.protocol.Py4JJavaError as e:
38 s = e.java_exception.toString()
C:\Spark\python\lib\py4j-0.8.2.1-src.zip\py4j\protocol.py in get_return_value(answer, gateway_client, target_id, name)
298 raise Py4JJavaError(
299 'An error occurred while calling {0}{1}{2}.\n'.
--> 300 format(target_id, '.', name), value)
301 else:
302 raise Py4JError(
Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
: org.apache.spark.SparkException: Could not parse Master URL: 'localhost:7077'
at org.apache.spark.SparkContext$.org$apache$spark$SparkContext$$createTaskScheduler(SparkContext.scala:2693)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:506)
at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:61)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source)
at java.lang.reflect.Constructor.newInstance(Unknown Source)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:234)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:379)
at py4j.Gateway.invoke(Gateway.java:214)
at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:79)
at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:68)
at py4j.GatewayConnection.run(GatewayConnection.java:207)
at java.lang.Thread.run(Unknown Source)
Answer 0 (score: 2)
Create an SSH tunnel with PuTTY, forwarding a local port (e.g. 7077) to the Spark master (e.g. spark-master:7077). Then, in your local IPython Notebook, use the local port (spark://localhost:7077) as the address of the Spark master.
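
A minimal sketch of the two steps, assuming plink (PuTTY's command-line client) is available and that user, gateway-host and spark-master are placeholder names for your environment:

# On the W7 machine: open the tunnel with plink (or in the PuTTY GUI under
# Connection > SSH > Tunnels: source port 7077, destination spark-master:7077).
# 'user', 'gateway-host' and 'spark-master' are placeholders, not real names.
#   plink -ssh -L 7077:spark-master:7077 user@gateway-host

from pyspark import SparkConf, SparkContext

# Note the spark:// scheme: a bare 'localhost:7077' cannot be parsed as a
# master URL, which is exactly the exception shown in the traceback above.
conf = SparkConf().setAppName('Spark Test').setMaster('spark://localhost:7077')
sc = SparkContext(conf=conf)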