When I run a simple SparkR script in Zeppelin:

%spark.r 2 + 2

I get the following error:
org.apache.zeppelin.interpreter.InterpreterException: sparkr is not responding
R version 3.0.2 (2013-09-25) -- "Frisbee Sailing"
Copyright (C) 2013 The R Foundation for Statistical Computing
Platform: x86_64-pc-linux-gnu (64-bit)
R is free software and comes with ABSOLUTELY NO WARRANTY.
You are welcome to redistribute it under certain conditions.
Type 'license()' or 'licence()' for distribution details.
Natural language support but running in an English locale
R is a collaborative project with many contributors.
Type 'contributors()' for more information and
'citation()' on how to cite R or R packages in publications.
Type 'demo()' for some demos, 'help()' for on-line help, or
'help.start()' for an HTML browser interface to help.
Type 'q()' to quit R.
at org.apache.zeppelin.spark.ZeppelinR.open(ZeppelinR.java:165)
at org.apache.zeppelin.spark.SparkRInterpreter.open(SparkRInterpreter.java:91)
at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:69)
at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:618)
at org.apache.zeppelin.scheduler.Job.run(Job.java:186)
at org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:139)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at
even though I have set SPARK_HOME and JAVA_HOME in zeppelin-env.sh.
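For reference, the relevant lines in my zeppelin-env.sh look roughly like this (the paths below are placeholders, not my exact installation directories):

# zeppelin-env.sh (excerpt) -- illustrative paths only
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
export SPARK_HOME=/opt/spark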
Answer 0 (score: 0)
Most likely the R process has died; check the Spark interpreter log for the underlying error.
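A minimal way to dig further, assuming a default Zeppelin layout (the exact log file name varies by user and host):

# Look at the Spark interpreter log for the real R-side failure
tail -n 100 $ZEPPELIN_HOME/logs/zeppelin-interpreter-spark-*.log

# Make sure the user running Zeppelin can actually launch R
which R
R --version

If R starts fine on its own, the interpreter log usually shows why the SparkR bridge process exited (for example a missing R package or a wrong SPARK_HOME).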