I use Zeppelin with crontab to run some Spark jobs (JDK 8, Spark 1.6.2, Scala 2.10). But over the last few days I have found that Zeppelin's Spark interpreter shuts down almost every day, sometimes more than once. Here is the log:
java.lang.StackOverflowError
at scala.reflect.internal.Types$TypeMap.mapOver(Types.scala:4183)
at scala.reflect.internal.Types$AsSeenFromMap.apply(Types.scala:4638)
at scala.reflect.internal.Types$TypeMap.mapOver(Types.scala:4197)
at scala.reflect.internal.Types$AsSeenFromMap.apply(Types.scala:4638)
at scala.reflect.internal.Types$Type.asSeenFrom(Types.scala:754)
at scala.reflect.internal.Types$Type.computeMemberType(Types.scala:788)
at scala.reflect.internal.Symbols$MethodSymbol.typeAsMemberOf(Symbols.scala:2655)
at scala.reflect.internal.Types$Type.memberType(Types.scala:779)
at scala.reflect.internal.Types$class.defineUnderlyingOfSingleType(Types.scala:1534)
at scala.reflect.internal.SymbolTable.defineUnderlyingOfSingleType(SymbolTable.scala:13)
at scala.reflect.internal.Types$SingleType.underlying(Types.scala:1486)
at scala.reflect.internal.Types$SingletonType.widen(Types.scala:1340)
at scala.reflect.internal.Types$AsSeenFromMap.toPrefix$1(Types.scala:4541)
at scala.reflect.internal.Types$AsSeenFromMap.apply(Types.scala:4556)
... (the same 14-frame cycle, mapOver / AsSeenFromMap.apply / asSeenFrom / memberType / underlying / widen / toPrefix$1, repeats until the stack overflows) ...
The log goes on like this for a very long time; it looks like an infinite recursion loop. I have already configured the environment like this:
export JAVA_HOME=/home/hadoop/jdk
export MASTER=spark://namenode:7077
export ZEPPELIN_PORT=10001
export SPARK_HOME=/home/hadoop/spark-1.6.2-bin-hadoop2.6
export SPARK_SUBMIT_OPTIONS="--driver-memory 2g --executor-memory 5g --driver-class-path /home/hadoop/spark-1.6.2-bin-hadoop2.6/extlib/oracle-driver.jar:/home/hadoop/spark-1.6.2-bin-hadoop2.6/extlib/phoenix-4.7.0-HBase-1.1-client-spark.jar:/home/hadoop/spark-1.6.2-bin-hadoop2.6/extlib/spark-csv_2.10-1.3.0.jar:/home/hadoop/zeppelin-0.6.0-bin-all/lib/*:/home/hadoop/zeppelin-0.6.0-bin-all/*:/home/hadoop/zeppelin-0.6.0-bin-all/conf:/home/hadoop/zeppelin-0.6.0-bin-all/interpreter/spark/*:/home/hadoop/zeppelin-0.6.0-bin-all/lib/zeppelin-interpreter-0.6.0.jar:/home/hadoop/zeppelin-0.6.0-bin-all/interpreter/spark/zeppelin-spark-0.6.0.jar"
export ZEPPELIN_MEM=-Xmx4096m
export ZEPPELIN_JAVA_OPTS="-Xmx4096m"
Answer 0 (score: 1)
I found the cause.
First, Zeppelin 0.6.0 bundles Scala 2.10.4, while Spark 1.6.2 is built against Scala 2.10.6. Remove the Scala libraries under ZEPPELIN_HOME/lib and put in the Scala 2.10.6 libraries instead.
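For reference, a minimal shell sketch of that swap, assuming the install paths shown above; the exact jar names and the Maven Central URLs are assumptions, so check what actually sits in ZEPPELIN_HOME/lib before removing anything:

# Hypothetical jar names; verify with `ls` before moving anything
cd /home/hadoop/zeppelin-0.6.0-bin-all/lib
mkdir -p ../scala-2.10.4-backup
mv scala-library-*.jar scala-compiler-*.jar scala-reflect-*.jar ../scala-2.10.4-backup/
# Fetch the 2.10.6 jars from Maven Central (standard repository layout assumed)
for a in scala-library scala-compiler scala-reflect; do
  wget "https://repo1.maven.org/maven2/org/scala-lang/$a/2.10.6/$a-2.10.6.jar"
done

Afterwards, restart the Zeppelin daemon so the interpreter picks up the new jars.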
Second, restart the Spark interpreter regularly; otherwise every job keeps running inside the same single Spark application, and over time that application shuts down because of memory problems. Just add a cron notebook and check the "auto-restart interpreter on cron execution" option.
That option shuts down the old Spark interpreter and starts a new one on each scheduled run.
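If you would rather drive the restart from the system crontab, the same thing can be done through Zeppelin's interpreter REST API, assuming the restart endpoint is available in your Zeppelin release; a hedged sketch, where the setting id is hypothetical and must be looked up from the first call:

# List interpreter settings and note the id of the spark setting
curl -s http://namenode:10001/api/interpreter/setting
# Restart that interpreter setting (replace the hypothetical id)
curl -s -X PUT http://namenode:10001/api/interpreter/setting/restart/2C4U48MY3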