I installed spark-2.1.0-bin-hadoop2.7.tgz on Ubuntu.
I set up zeppelin-env.sh as follows:
export PYTHONPATH=/usr/bin/python
export PYSPARK_PYTHON=/home/jin/spark/python
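(For reference, the usual convention is the other way around: PYSPARK_PYTHON points at a Python executable and SPARK_HOME points at the Spark install directory. The lines below are only a sketch that reuses the same paths as above, /home/jin/spark and /usr/bin/python, and are not a confirmed fix.)
export SPARK_HOME=/home/jin/spark
export PYSPARK_PYTHON=/usr/bin/python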
Then I tried to use pyspark in a Zeppelin notebook:
%spark.pyspark
print(2+2)
An error occurs in the Zeppelin notebook:
java.lang.NullPointerException
at org.apache.zeppelin.spark.Utils.invokeMethod(Utils.java:38)
at org.apache.zeppelin.spark.Utils.invokeMethod(Utils.java:33)
at org.apache.zeppelin.spark.SparkInterpreter.createSparkContext_2(SparkInterpreter.java:380)
at org.apache.zeppelin.spark.SparkInterpreter.createSparkContext(SparkInterpreter.java:369)
at org.apache.zeppelin.spark.SparkInterpreter.getSparkContext(SparkInterpreter.java:144)
at org.apache.zeppelin.spark.SparkInterpreter.open(SparkInterpreter.java:817)
at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:69)
at org.apache.zeppelin.spark.PySparkInterpreter.getSparkInterpreter(PySparkInterpreter.java:546)
at org.apache.zeppelin.spark.PySparkInterpreter.createGatewayServerAndStartScript(PySparkInterpreter.java:206)
at org.apache.zeppelin.spark.PySparkInterpreter.open(PySparkInterpreter.java:160)
at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:69)
at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:482)
at org.apache.zeppelin.scheduler.Job.run(Job.java:175)
at org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:139)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
I don't know why this error occurs.
Can you give me some advice?
Answer 0 (score: 0)
Can you use Jupyter without the Spark environment?
Likewise, can you use Spark without the Jupyter environment?
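To narrow this down along those lines, a quick check is to start the bundled pyspark shell directly, outside of Zeppelin, and run a trivial job. This is only a sketch assuming Spark was unpacked to /home/jin/spark and /usr/bin/python is the intended interpreter:

# launch the pyspark shell that ships with the Spark install, outside of Zeppelin
PYSPARK_PYTHON=/usr/bin/python /home/jin/spark/bin/pyspark

# inside the pyspark shell, a SparkContext named sc is created automatically
sc.parallelize(range(10)).sum()   # should print 45 if Spark itself is healthy
print(2 + 2)                      # the same test as in the Zeppelin note

If Spark runs fine on its own, the problem is more likely in how the Zeppelin Spark interpreter is configured (for example SPARK_HOME and PYSPARK_PYTHON in zeppelin-env.sh) than in the Spark installation itself.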