Cannot run Jupyter Notebook with the Spark kernel

Date: 2019-11-15 16:41:10

Tags: apache-spark jupyter-notebook

I want to run a Jupyter notebook with a Spark kernel.

I have downloaded and installed the spylon-kernel package:

$ pip3 install spylon-kernel
$ python -m spylon_kernel install --user
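
As a sanity check, the kernel does appear when listing the registered kernelspecs (jupyter kernelspec list is the standard Jupyter command for this):

$ jupyter kernelspec list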

After starting the notebook, everything appears to be installed correctly, right up until I try to execute a cell. At that point I get the following error, which I don't fully understand.

[MetaKernelApp] ERROR | Exception in message handler:
Traceback (most recent call last):
  File "/home/default/.local/lib/python3.6/site-packages/ipykernel/kernelbase.py", line 268, in dispatch_shell
    yield gen.maybe_future(handler(stream, idents, msg))
  File "/home/default/.local/lib/python3.6/site-packages/tornado/gen.py", line 735, in run
    value = future.result()
  File "/home/default/.local/lib/python3.6/site-packages/tornado/gen.py", line 209, in wrapper
    yielded = next(result)
  File "/home/default/.local/lib/python3.6/site-packages/ipykernel/kernelbase.py", line 541, in execute_request
    user_expressions, allow_stdin,
  File "/home/default/.local/lib/python3.6/site-packages/metakernel/_metakernel.py", line 395, in do_execute
    retval = self.do_execute_direct(code)
  File "/home/default/.local/lib/python3.6/site-packages/spylon_kernel/scala_kernel.py", line 141, in do_execute_direct
    res = self._scalamagic.eval(code.strip(), raw=False)
  File "/home/default/.local/lib/python3.6/site-packages/spylon_kernel/scala_magic.py", line 157, in eval
    intp = self._get_scala_interpreter()
  File "/home/default/.local/lib/python3.6/site-packages/spylon_kernel/scala_magic.py", line 46, in _get_scala_interpreter
    self._interp = get_scala_interpreter()
  File "/home/default/.local/lib/python3.6/site-packages/spylon_kernel/scala_interpreter.py", line 568, in get_scala_interpreter
    scala_intp = initialize_scala_interpreter()
  File "/home/default/.local/lib/python3.6/site-packages/spylon_kernel/scala_interpreter.py", line 163, in initialize_scala_interpreter
    spark_session, spark_jvm_helpers, spark_jvm_proc = init_spark()
  File "/home/default/.local/lib/python3.6/site-packages/spylon_kernel/scala_interpreter.py", line 71, in init_spark
    conf._init_spark()
  File "/home/default/.local/lib/python3.6/site-packages/spylon/spark/launcher.py", line 479, in _init_spark
    findspark.init(spark_home=spark_home, edit_rc=False, edit_profile=False, python_path=python_path)
  File "/home/default/.local/lib/python3.6/site-packages/findspark.py", line 135, in init
    py4j = glob(os.path.join(spark_python, 'lib', 'py4j-*.zip'))[0]
IndexError: list index out of range
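
If I read the last frame correctly, findspark is globbing for a py4j-*.zip archive under $SPARK_HOME/python/lib, the glob returns an empty list, and indexing it with [0] raises the IndexError. I suspect SPARK_HOME is unset or wrong in the environment the notebook server was started from; a minimal check would be something like this (the /opt/spark path is only a hypothetical example):

$ echo $SPARK_HOME                          # empty output would explain the empty glob
$ ls "$SPARK_HOME"/python/lib/py4j-*.zip    # the archive findspark expects to find
$ export SPARK_HOME=/opt/spark              # hypothetical install path; set before launching Jupyter
$ jupyter notebook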

Can you suggest how I should deal with this?
