[I 10:43:53.627 NotebookApp] Serving notebooks from local directory: /opt/soft/recommender/jupyter
[I 10:43:53.627 NotebookApp] The Jupyter Notebook is running at: http://10.48.204.120:8888/
[I 10:43:53.627 NotebookApp] Use Control-C to stop this server and shut down all kernels (twice to skip confirmation).
[W 10:43:53.628 NotebookApp] No web browser found: could not locate runnable browser.
[I 10:44:11.697 NotebookApp] Kernel started: 7ea0717b-b85b-44b1-bd10-7a2079b24d94
[I 10:44:11.708 NotebookApp] 302 GET /notebooks/doc/source/images/als-diagram.png (10.252.183.252) 6.63ms
[I 10:44:14.691 NotebookApp] KernelRestarter: restarting kernel (1/5), new random ports
[I 10:44:17.719 NotebookApp] KernelRestarter: restarting kernel (2/5), new random ports
[I 10:44:20.746 NotebookApp] KernelRestarter: restarting kernel (3/5), new random ports
[I 10:44:23.774 NotebookApp] KernelRestarter: restarting kernel (4/5), new random ports
[W 10:44:26.800 NotebookApp] KernelRestarter: restart failed
[W 10:44:26.801 NotebookApp] Kernel 7ea0717b-b85b-44b1-bd10-7a2079b24d94 died, removing from map.
[W 10:45:11.836 NotebookApp] Timeout waiting for kernel_info reply from 7ea0717b-b85b-44b1-bd10-7a2079b24d94
[E 10:45:11.839 NotebookApp] Error opening stream: HTTP 404: Not Found (Kernel does not exist: 7ea0717b-b85b-44b1-bd10-7a2079b24d94)
[I 10:46:11.948 NotebookApp] Saving file at /notebooks/elasticsearch-spark-recommender.ipynb
I am trying to launch a Jupyter notebook through pyspark with the following command:
PYSPARK_DRIVER_PYTHON="jupyter" PYSPARK_DRIVER_PYTHON_OPTS="notebook"
../spark-2.2.0-bin-hadoop2.7/bin/pyspark --driver-memory 4g --driver-class-path /opt/soft/recommender/spark/elasticsearch-hadoop-5.3.0/dist/elasticsearch-spark-20_2.11-5.3.0.jar
I can open Jupyter from a remote browser, but because the kernel keeps failing to restart, I cannot run the Python demo in Jupyter.
The Python version is 3.5.0. Jupyter and ipykernel are both up to date, and the prompt-toolkit version is 1.0.15.
If I upgrade prompt-toolkit to 2.0.4, starting pyspark throws a lot of error messages.
How can I fix this?
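One way to see why the kernel dies (a minimal diagnostic sketch, assuming the standard jupyter_client API is available; it is not part of the original setup) is to print each registered kernelspec's launch command and then run that command by hand, since the NotebookApp log above hides the kernel's own traceback:
from jupyter_client.kernelspec import KernelSpecManager

ksm = KernelSpecManager()
for name in ksm.find_kernel_specs():
    spec = ksm.get_kernel_spec(name)
    # spec.argv is the command NotebookApp runs; "{connection_file}" is filled in at launch time
    print(name, "->", spec.argv)
Running the printed command manually (for example python3 -m ipykernel_launcher -f /tmp/test-connection.json) usually prints the import error or missing-module traceback that causes the restart loop.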
Answer 0 (score: 0)
I have updated the packages: pip install --upgrade prompt-toolkit==2.0.4 (previously 1.0.15)
pip install --upgrade ipython==7.0.1 (previously 6.5.0)
pip install --upgrade jupyter-console==6.0.0 (previously 5.2.0)
However, this did not solve the problem; the log output above remains unchanged.
Answer 1 (score: 0)
I worked around it a different way: I reinstalled Python, then reinstalled pip and the other modules such as jupyter, tmdbsimple, elasticsearch, and so on.
However, I suspect the real cause is insufficient memory, because the same exception comes back whenever I load a large amount of data for analysis. I have also tried deploying Elasticsearch and Spark separately and am still looking for other solutions.
If you have a better way to diagnose or avoid this, please leave me a message.
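A quick way to test the out-of-memory theory (a minimal sketch of my own, assuming the elasticsearch-hadoop Spark SQL data source; the host and the index name "ratings" are placeholders for whatever the demo actually uses) is to check the effective driver memory and load only a small slice of the data while debugging:
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("memory-check").getOrCreate()
# how much memory the driver actually got (prints the fallback if unset)
print(spark.sparkContext.getConf().get("spark.driver.memory", "not set"))

sample = (spark.read.format("org.elasticsearch.spark.sql")
          .option("es.nodes", "10.48.204.120")   # placeholder host
          .option("es.port", "9200")
          .load("ratings")                        # placeholder index name
          .limit(1000))                           # keep the debug load small
print(sample.count())
If this small sample works but the full load crashes the kernel, raising --driver-memory (or sampling the data) is the more likely fix than reinstalling packages.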
Answer 2 (score: 0)
The problem is now completely solved. After some searching I found that the Python entries were missing when configuring the Spark 2.2.0 environment variables, which is what caused the kernel restart to fail. The environment variables should be configured as follows:
export PYTHONPATH=$SPARK_HOME/python/:$PYTHONPATH
export PYTHONPATH=$SPARK_HOME/python/lib/py4j-0.10.4-src.zip:$PYTHONPATH
You can also try reinstalling py4j ("sudo pip install py4j") and downgrading numpy ("sudo pip install --upgrade numpy==1.13.0"). After this fix the integrated system runs smoothly and the matrix analysis completes.
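A simple check that the PYTHONPATH fix took effect (a minimal sketch of my own, assuming a local-mode session is acceptable for testing): the same imports the notebook kernel needs should now resolve from a plain Python shell, and a trivial job should run:
import py4j
import pyspark
from pyspark.sql import SparkSession

print("pyspark", pyspark.__version__, "py4j", py4j.__version__)
spark = (SparkSession.builder
         .master("local[1]")
         .appName("pythonpath-check")
         .getOrCreate())
print(spark.range(10).count())   # trivial job to confirm the Python-JVM bridge works
spark.stop()
If the imports fail here, the Jupyter kernel will die in exactly the same way, so this is a faster loop for checking the environment variables than restarting the notebook server.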
Answer 4 (score: 0)
This post solved my problem: Jupyter-notebook reports an error KernelRestarter: restart failed or "kernel starting, please wait"
Basically, uninstall all the packages listed in that post and then reinstall jupyter and notebook:
pip uninstall -y ipykernel ipython jupyter_client jupyter_core traitlets ipython_genutils jupyter notebook tornado
pip install jupyter notebook
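After the reinstall, a quick sanity check (a minimal sketch of my own; the exact versions will differ on your machine) is to confirm that the interpreter and the reinstalled packages line up, since a mismatch between the Python running the notebook server and the one in the kernelspec is a common cause of the restart loop:
import sys
import ipykernel, notebook, tornado, traitlets

print("interpreter:", sys.executable)
print("ipykernel", ipykernel.__version__)
print("notebook", notebook.__version__)
print("tornado", tornado.version)
print("traitlets", traitlets.__version__)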