Using Jupyter on Windows with the Apache Toree Kernel for Spark Compatibility

Date: 2016-09-14 08:56:36

Tags: windows scala apache-spark jupyter-notebook apache-toree

I am trying to install the Apache Toree kernel for Spark compatibility, and I am running into a strange environment-related error. This is the process I followed:

  1. Installed the latest Anaconda release, which ships Jupyter 4.1.0
  2. Ran: pip install --pre toree
  3. Ran: jupyter toree install --interpreters=PySpark,SparkR,Scala,SQL
  4. I am only really interested in the Scala kernel, but I installed all the interpreters. The OS is Windows 7, and using a virtual machine or Linux is not an option.

    This is the kernel.json file, which I modified to execute the run.sh bash script through Cygwin:

    {
      "language": "scala", 
      "display_name": "Apache Toree - Scala", 
      "env": {
        "__TOREE_SPARK_OPTS__": "", 
        "SPARK_HOME": "C:\\CDH\\spark", 
        "__TOREE_OPTS__": "", 
        "DEFAULT_INTERPRETER": "Scala", 
        "PYTHONPATH": "C:\\CDH\\spark\\python:C:\\CDH\\spark\\python\\lib\\py4j-0.8.2.1-src.zip", 
        "PYTHON_EXEC": "python"
      }, 
      "argv": [
        "C:\\cygwin64\\bin\\mintty.exe","-h","always","/bin/bash","-l","-e","C:\\ProgramData\\jupyter\\kernels\\apache_toree_scala\\bin\\run.sh", 
        "--profile", 
        "{connection_file}"
      ]
    }
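
As a side note on how this file is consumed (a minimal Python sketch, not Jupyter's actual implementation): the notebook server parses kernel.json, substitutes `{connection_file}` inside `argv`, and layers the `env` block on top of its own environment before spawning the process. The trimmed spec and the connection-file name below are made up for illustration.

```python
import json

# Minimal sketch of how a kernel.json spec is consumed; the spec below is a
# trimmed copy of the one above, and "kernel-1234.json" is a made-up name.
kernel_json = r'''
{
  "language": "scala",
  "display_name": "Apache Toree - Scala",
  "env": {"SPARK_HOME": "C:\\CDH\\spark", "DEFAULT_INTERPRETER": "Scala"},
  "argv": ["run.sh", "--profile", "{connection_file}"]
}
'''
spec = json.loads(kernel_json)

# The server fills in the {connection_file} placeholder in argv...
argv = [arg.format(connection_file="kernel-1234.json") for arg in spec["argv"]]

# ...and merges spec["env"] over its own environment for the child process.
child_env = dict(PATH="C:\\Windows")  # stand-in for the real os.environ
child_env.update(spec["env"])
```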
    

    When running Jupyter, the kernel stops with the error:

    TypeError: environment can only contain strings
    

    Full log:

    [E 10:45:56.736 NotebookApp] Failed to run command:
        ['C:\\cygwin64\\bin\\mintty.exe', '-h', 'always', '/bin/bash', '-l', '-e', 'C:\\ProgramData\\jupyter\\kernels\\apache_toree_scala\\bin\\run.sh', '--profile', 'C:\\Users\\luis\\AppData\\Roaming\\jupyter\\runtime\\kernel-e02cac9b-15de-4c69-a8e5-e5b11919e1bc.json']
        with kwargs:
        {'stdin': -1, 'stdout': None, 'cwd': 'C:\\Users\\luis\\Documents', 'stderr': None, 'env': {'TMP': 'C:\\Users\\luis\\AppData\\Local\\Temp', 'COMPUTERNAME': 'laptop', 'USERDOMAIN': 'HOME', 'SPARK_HOME': u'C:\\CDH\\spark', 'DEFLOGDIR': 'C:\\ProgramData\\McAfee\\DesktopProtection', 'PSMODULEPATH': 'C:\\Windows\\system32\\WindowsPowerShell\\v1.0\\Modules\\', 'COMMONPROGRAMFILES': 'C:\\Program Files\\Common Files', 'PROCESSOR_IDENTIFIER':'Intel64 Family 6 Model 45 Stepping 7, GenuineIntel', u'DEFAULT_INTERPRETER': u'Scala', 'PROGRAMFILES': 'C:\\Program Files', 'PROCESSOR_REVISION': '2d07', 'SYSTEMROOT': 'C:\\Windows', 'PATH': 'C:\\Users\\luis\\AppData\\Local\\Continuum\\Anaconda2\\Library\\bin;C:\\Users\\luis\\AppData\\Local\\Continuum\\Anaconda2;C:\\Users\\luis\\AppData\\Local\\Continuum\\Anaconda2\\Scripts;C:\\Users\\luis\\AppData\\Local\\Continuum\\Anaconda2\\Library\\bin;C:\\Users\\luis\\AppData\\Local\\Continuum\\Anaconda2\\Library\\bin;C:\\Program Files\\Java\\jdk1.7.0_76\\jre\\bin;C:\\Windows\\system32;C:\\Windows;C:\\Windows\\System32\\Wbem;C:\\Windows\\System32\\WindowsPowerShell\\v1.0\\;C:\\Program Files (x86)\\sbt\\bin;C:\\Users\\luis\\AppData\\Local\\Continuum\\Anaconda2;C:\\Users\\luis\\AppData\\Local\\Continuum\\Anaconda2\\Scripts;C:\\Users\\luis\\AppData\\Local\\Continuum\\Anaconda2\\Library\\bin', 'PROGRAMFILES(X86)': 'C:\\Program Files (x86)', 'WINDOWS_TRACING_FLAGS': '3', 'TK_LIBRARY': 'C:\\Users\\luis\\AppData\\Local\\Continuum\\Anaconda2\\tcl\\tk8.5', u'__TOREE_SPARK_OPTS__': u'', 'TEMP': 'C:\\Users\\luis\\AppData\\Local\\Temp', 'COMMONPROGRAMFILES(X86)': 'C:\\Program Files (x86)\\Common Files', 'PROCESSOR_ARCHITECTURE': 'AMD64', 'TIX_LIBRARY': 'C:\\Users\\luis\\AppData\\Local\\Continuum\\Anaconda2\\tcl\\tix8.4.3', 'ALLUSERSPROFILE': 'C:\\ProgramData', 'LOCALAPPDATA': 'C:\\Users\\luis\\AppData\\Local', 'HOMEPATH': '\\Users\\luis', 'JAVA_HOME': 'C:\\Program Files\\java\\jdk1.7.0_76', 'JPY_INTERRUPT_EVENT': '1056', 'PROGRAMW6432': 'C:\\Program 
Files', 'USERNAME': 'luis', 'LOGONSERVER': '\\\\S8KROGR2', 'SBT_HOME': 'C:\\Program Files (x86)\\sbt\\', 'JPY_PARENT_PID': '1036', 'PROGRAMDATA': 'C:\\ProgramData', u'PYTHONPATH': u'C:\\CDH\\spark\\python:C:\\CDH\\spark\\python\\lib\\py4j-0.8.2.1-src.zip', 'TCL_LIBRARY': 'C:\\Users\\luis\\AppData\\Local\\Continuum\\Anaconda2\\tcl\\tcl8.5', 'VSEDEFLOGDIR': 'C:\\ProgramData\\McAfee\\DesktopProtection', 'USERDNSDOMAIN': 'HOME.ES', 'SESSIONNAME': 'RDP-Tcp#0', 'PATHEXT': '.COM;.EXE;.BAT;.CMD;.VBS;.VBE;.JS;.JSE;.WSF;.WSH;.MSC', u'PYTHON_EXEC': u'python', 'CLIENTNAME': 'laptop2', u'__TOREE_OPTS__': u'', 'FP_NO_HOST_CHECK': 'NO', 'WINDIR': 'C:\\Windows', 'WINDOWS_TRACING_LOGFILE': 'C:\\BVTBin\\Tests\\installpackage\\csilogfile.log', 'HOMEDRIVE': 'C:', 'SYSTEMDRIVE': 'C:', 'COMSPEC': 'C:\\Windows\\system32\\cmd.exe', 'NUMBER_OF_PROCESSORS': '2', 'APPDATA': 'C:\\Users\\luis\\AppData\\Roaming', 'PROCESSOR_LEVEL': '6', 'COMMONPROGRAMW6432':    'C:\\Program Files\\Common Files', 'OS': 'Windows_NT', 'PUBLIC': 'C:\\Users\\Public', 'IPY_INTERRUPT_EVENT': '1056', 'USERPROFILE': 'C:\\Users\\luis'}}
    
    [E 10:45:56.744 NotebookApp] Unhandled error in API request
        Traceback (most recent call last):
          File "C:\Users\luis\AppData\Local\Continuum\Anaconda2\lib\site-packages\notebook\base\handlers.py", line 457, in wrapper
            result = yield gen.maybe_future(method(self, *args, **kwargs))
          File "C:\Users\luis\AppData\Local\Continuum\Anaconda2\lib\site-packages\tornado\gen.py", line 1008, in run
            value = future.result()
          File "C:\Users\luis\AppData\Local\Continuum\Anaconda2\lib\site-packages\tornado\concurrent.py", line 232, in result
            raise_exc_info(self._exc_info)
          File "C:\Users\luis\AppData\Local\Continuum\Anaconda2\lib\site-packages\tornado\gen.py", line 1014, in run
            yielded = self.gen.throw(*exc_info)
          File "C:\Users\luis\AppData\Local\Continuum\Anaconda2\lib\site-packages\notebook\services\sessions\handlers.py", line 62, in post
            kernel_id=kernel_id))
          File "C:\Users\luis\AppData\Local\Continuum\Anaconda2\lib\site-packages\tornado\gen.py", line 1008, in run
            value = future.result()
          File "C:\Users\luis\AppData\Local\Continuum\Anaconda2\lib\site-packages\tornado\concurrent.py", line 232, in result
            raise_exc_info(self._exc_info)
          File "C:\Users\luis\AppData\Local\Continuum\Anaconda2\lib\site-packages\tornado\gen.py", line 1014, in run
            yielded = self.gen.throw(*exc_info)
          File "C:\Users\luis\AppData\Local\Continuum\Anaconda2\lib\site-packages\notebook\services\sessions\sessionmanager.py", line 79, in create_session
            kernel_name)
          File "C:\Users\luis\AppData\Local\Continuum\Anaconda2\lib\site-packages\tornado\gen.py", line 1008, in run
            value = future.result()
          File "C:\Users\luis\AppData\Local\Continuum\Anaconda2\lib\site-packages\tornado\concurrent.py", line 232, in result
            raise_exc_info(self._exc_info)
          File "C:\Users\luis\AppData\Local\Continuum\Anaconda2\lib\site-packages\tornado\gen.py", line 1014, in run
            yielded = self.gen.throw(*exc_info)
          File "C:\Users\luis\AppData\Local\Continuum\Anaconda2\lib\site-packages\notebook\services\sessions\sessionmanager.py", line 92, in start_kernel_for_session
            self.kernel_manager.start_kernel(path=kernel_path, kernel_name=kernel_name)
          File "C:\Users\luis\AppData\Local\Continuum\Anaconda2\lib\site-packages\tornado\gen.py", line 1008, in run
            value = future.result()
          File "C:\Users\luis\AppData\Local\Continuum\Anaconda2\lib\site-packages\tornado\concurrent.py", line 232, in result
            raise_exc_info(self._exc_info)
          File "C:\Users\luis\AppData\Local\Continuum\Anaconda2\lib\site-packages\tornado\gen.py", line 282, in wrapper
            yielded = next(result)
          File "C:\Users\luis\AppData\Local\Continuum\Anaconda2\lib\site-packages\notebook\services\kernels\kernelmanager.py", line 87, in start_kernel
            super(MappingKernelManager, self).start_kernel(**kwargs)
          File "C:\Users\luis\AppData\Local\Continuum\Anaconda2\lib\site-packages\jupyter_client\multikernelmanager.py", line 110, in start_kernel
            km.start_kernel(**kwargs)
          File "C:\Users\luis\AppData\Local\Continuum\Anaconda2\lib\site-packages\jupyter_client\manager.py", line 243, in start_kernel
            **kw)
          File "C:\Users\luis\AppData\Local\Continuum\Anaconda2\lib\site-packages\jupyter_client\manager.py", line 189, in _launch_kernel
            return launch_kernel(kernel_cmd, **kw)
          File "C:\Users\luis\AppData\Local\Continuum\Anaconda2\lib\site-packages\jupyter_client\launcher.py", line 123, in launch_kernel
            proc = Popen(cmd, **kwargs)
          File "C:\Users\luis\AppData\Local\Continuum\Anaconda2\lib\subprocess.py", line 711, in __init__
            errread, errwrite)
          File "C:\Users\luis\AppData\Local\Continuum\Anaconda2\lib\subprocess.py", line 959, in _execute_child
            startupinfo)
        TypeError: environment can only contain strings
    [E 10:45:56.766 NotebookApp] {
          "Origin": "http://localhost:8888",
          "Content-Length": "88",
          "Accept-Language": "es-ES,es;q=0.8",
          "Accept-Encoding": "gzip, deflate",
          "Host": "localhost:8888",
          "Accept": "application/json, text/javascript, */*; q=0.01",
          "User-Agent": "Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2272.101 Safari/537.36",
          "Connection": "keep-alive",
          "X-Requested-With": "XMLHttpRequest",
          "Referer": "http://localhost:8888/notebooks/Untitled3.ipynb?kernel_name=apache_toree_scala",
          "Content-Type": "application/json"
        }
    [E 10:45:56.796 NotebookApp] 500 POST /api/sessions (::1) 626.00ms referer=http://localhost:8888/notebooks/Untitled3.ipynb?kernel_name=apache_toree_scala
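
The env dump above hints at the cause: the values injected from kernel.json's "env" block arrive as unicode strings (the u'...' entries) while the rest of the inherited environment are byte strings, and `subprocess.Popen` on Python 2 under Windows rejects an environment containing unicode values. A sketch of one possible workaround (coercing everything to `str` before launch); the helper name is my own:

```python
def coerce_env(env):
    """Return a copy of env with every key and value forced to str.

    On Python 2/Windows, Popen raises 'environment can only contain
    strings' when unicode keys or values slip into the env mapping.
    """
    return dict((str(k), str(v)) for k, v in env.items())

# Mimic the mixed environment from the log above (u'' values from kernel.json).
mixed = {u'SPARK_HOME': u'C:\\CDH\\spark', 'TEMP': 'C:\\Temp'}
clean = coerce_env(mixed)
```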
    

    I ran the command in isolation:

    C:\\cygwin64\\bin\\mintty.exe -h always /bin/bash -l -e C:\\ProgramData\\jupyter\\kernels\\apache_toree_scala\\bin\\run.sh
    

    It works on its own; it only fails when executed by the Jupyter server.

    Has anyone managed to run this kernel on a Windows machine?

2 Answers:

Answer 0 (score: 2)

I wrote my own (hacky) run.cmd and managed to get it working with Spark 2.2.0 and toree-assembly-0.2.0.dev1-incubating-SNAPSHOT. I posted my solution on the TOREE-399 ticket.

The run.cmd is as follows:

@echo off

set PROG_HOME=%~dp0..

if not defined SPARK_HOME (
  echo SPARK_HOME must be set to the location of a Spark distribution!
  exit 1
)

REM disable randomized hash for string in Python 3.3+
set PYTHONHASHSEED=0

REM The SPARK_OPTS values during installation are stored in __TOREE_SPARK_OPTS__. This allows values to be specified during
REM install, but also during runtime. The runtime options take precedence over the install options.

if not defined SPARK_OPTS (
  set SPARK_OPTS=%__TOREE_SPARK_OPTS__%
) else (
  if "%SPARK_OPTS%" == "" (
    set SPARK_OPTS=%__TOREE_SPARK_OPTS__%
  )
)

if not defined TOREE_OPTS (
  set TOREE_OPTS=%__TOREE_OPTS__%
) else (
  if "%TOREE_OPTS%" == "" (
    set TOREE_OPTS=%__TOREE_OPTS__%
  )
)

echo Starting Spark Kernel with SPARK_HOME=%SPARK_HOME%

REM This doesn't work because the classpath doesn't get set properly, unless you hardcode it in SPARK_SUBMIT_OPTS using forward slashes or double backslashes, but then you can't use the SPARK_HOME and PROG_HOME variables.
REM set SPARK_SUBMIT_OPTS=-cp "%SPARK_HOME%\conf\;%SPARK_HOME%\jars\*;%PROG_HOME%\lib\toree-assembly-0.2.0.dev1-incubating-SNAPSHOT.jar" -Dscala.usejavacp=true
REM set TOREE_COMMAND="%SPARK_HOME%\bin\spark-submit.cmd" %SPARK_OPTS% --class org.apache.toree.Main %PROG_HOME%\lib\toree-assembly-0.2.0.dev1-incubating-SNAPSHOT.jar %TOREE_OPTS% %*

REM The two important things that we must do differently on Windows are that we must add toree-assembly-0.2.0.dev1-incubating-SNAPSHOT.jar to the classpath, and we must define the java property scala.usejavacp=true.
set TOREE_COMMAND="%JAVA_HOME%\bin\java" -cp "%SPARK_HOME%\conf\;%SPARK_HOME%\jars\*;%PROG_HOME%\lib\toree-assembly-0.2.0.dev1-incubating-SNAPSHOT.jar" -Dscala.usejavacp=true -Xmx1g org.apache.spark.deploy.SparkSubmit %SPARK_OPTS% --class org.apache.toree.Main %PROG_HOME%\lib\toree-assembly-0.2.0.dev1-incubating-SNAPSHOT.jar %TOREE_OPTS% %*

echo.
echo %TOREE_COMMAND%
echo.

%TOREE_COMMAND%
  • The run.cmd file should be placed in C:\ProgramData\jupyter\kernels\apache_toree_scala\bin\.
  • You also need to edit kernel.json in that folder to change run.sh to run.cmd.
  • If you want to allow installing additional interpreters in the Toree kernel, you should also edit toreeapp.py to change run.sh to run.cmd.
  • I have not tested whether the IF statements work correctly. I suspect they will mangle some arguments, since batch lacks robust IF handling.
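
For reference, the precedence the IF blocks above are trying to express (a runtime SPARK_OPTS/TOREE_OPTS wins; an unset or empty value falls back to the install-time __TOREE_SPARK_OPTS__/__TOREE_OPTS__) amounts to the following; a Python sketch with a made-up helper name:

```python
import os

def resolve_opts(runtime_var, install_var, environ=None):
    """Runtime variable takes precedence; unset or empty falls back to
    the install-time variable (empty string if neither is set)."""
    env = os.environ if environ is None else environ
    return env.get(runtime_var) or env.get(install_var, "")

# e.g. resolve_opts("SPARK_OPTS", "__TOREE_SPARK_OPTS__")
```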

Answer 1 (score: 0)

I had the same problem with this setup: Windows 10, Anaconda, Spark 2.0.2.

This is the workaround I came up with: use conda environment switching to set the environment variables, instead of having Jupyter set them.

  1. Clone a separate conda environment (http://conda.pydata.org/docs/using/envs.html#create-an-environment)
  2. Set the environment variables Toree needs (http://conda.pydata.org/docs/using/envs.html#windows)
  3. Then run jupyter notebook <root dir of notebooks>

    Optionally, you can set the Toree variables permanently in Windows and empty the "env" dictionary in kernel.json: "env": {}

    I wanted to keep my env variables separate for when I use PySpark.
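
A small pre-flight check along these lines can confirm that the activated environment actually carries the variables the kernel needs before starting the notebook (the variable names follow the kernel.json above; the helper is hypothetical):

```python
import os

def missing_vars(required, environ=None):
    """Return the names in `required` that are unset or empty."""
    env = os.environ if environ is None else environ
    return [name for name in required if not env.get(name)]

# Inside the activated conda env, before running `jupyter notebook`:
# missing_vars(["SPARK_HOME", "JAVA_HOME"]) should come back empty.
```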