Zeppelin running but process is dead

Date: 2016-11-13 13:21:42

Tags: maven hadoop apache-spark apache-zeppelin

Good afternoon.

I have recently been having trouble with Zeppelin. This is my first attempt at installing it, and I have been working on it for the past week without success. Any help or suggestions would be very welcome.

As background, my OS is CentOS 7, and on my cluster I run Spark 2.0.1 on top of Hadoop 2.7.2, Hive 2.1.0, and HBase 1.2.4. Also installed are Anaconda2 4.2.0, Scala 2.11.8, R 3.3.1, and Maven 3.3.9. My .bash_profile is as follows:

# added by Anaconda2 4.2.0 installer
export PATH="/opt/hadoop/anaconda2/bin:$PATH"

## JAVA env variables
export PATH=$PATH:$JAVA_HOME/bin
export CLASSPATH=.:$JAVA_HOME/jre/lib:$JAVA_HOME/lib:$JAVA_HOME/lib/tools.jar

## HADOOP env variables
export HADOOP_HOME=/opt/hadoop
export HADOOP_COMMON_HOME=$HADOOP_HOME
export HADOOP_HDFS_HOME=$HADOOP_HOME
export HADOOP_MAPRED_HOME=$HADOOP_HOME
export HADOOP_YARN_HOME=$HADOOP_HOME
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib/native"
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_CONF_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export YARN_HOME=$HADOOP_HOME
export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop
export YARN_CONF_DIR=$HADOOP_HOME/etc/hadoop
export PATH=$PATH:$HADOOP_HOME/sbin:$HADOOP_HOME/bin

## HBASE env variables
export HBASE_HOME=$HADOOP_HOME/hbase-current
export HBASE_PID_DIR=$HADOOP_HOME/hbase-current/pids
export PATH=$HBASE_HOME/bin:$PATH

## SPARK env variables
export SPARK_HOME=$HADOOP_HOME/spark-current
export PATH=$SPARK_HOME/bin:$SPARK_HOME/sbin:$PATH

## HIVE env variables
export HIVE_HOME=$HADOOP_HOME/hive-current
export PATH=$PATH:$HIVE_HOME/bin
export CLASSPATH=$CLASSPATH:$HADOOP_HOME/lib/native/*:.
export CLASSPATH=$CLASSPATH:$HIVE_HOME/lib/*:.

## SCALA env variables
export SCALA_HOME=/usr/bin/scala

## RHadoop env variables
export HADOOP_CMD=/opt/hadoop/bin/hadoop
export HADOOP_STREAMING=/opt/hadoop/share/hadoop/tools/lib/hadoop-streaming-2.7.2.jar
export PKG_CONFIG_PATH=$PKG_CONFIG_PATH:/opt/hadoop/thrift-0.8.0/lib/cpp/.libs:/usr/local/lib/pkgconfig

## Spark-Notebook env variables
export PATH=$PATH:$HADOOP_HOME/spark-notebook-current/bin

## ZEPPELIN env variables
export ZEPPELIN_HOME=$HADOOP_HOME/zeppelin
export PATH=$PATH:$ZEPPELIN_HOME/bin

## MAVEN env variables
export M2_HOME=/opt/hadoop/maven
export MAVEN_HOME=/opt/hadoop/maven
export MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=1024m"
export PATH=$PATH:$MAVEN_HOME/bin
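
For reference, one way to see the full CLASSPATH these exports produce (one entry per line) is the following; note that the Hive libraries and Hadoop native directory end up on it:

# print the effective classpath, one entry per line
echo "$CLASSPATH" | tr ':' '\n'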

Following my research, I installed Zeppelin as follows:

git clone https://github.com/apache/zeppelin.git
./dev/change_scala_version.sh 2.11
mvn clean package -Pspark-2.0 -Dspark.version=2.0.1 -Phadoop-2.7 -Dhadoop.version=2.7.2 -Pyarn -Ppyspark -Psparkr -Pr -Pscala-2.11 -DskipTests

which resulted in:

[INFO] Reactor Summary:
[INFO]
[INFO] Zeppelin ........................................... SUCCESS [  9.615 s]
[INFO] Zeppelin: Interpreter .............................. SUCCESS [ 20.536 s]
[INFO] Zeppelin: Zengine .................................. SUCCESS [ 20.312 s]
[INFO] Zeppelin: Display system apis ...................... SUCCESS [ 37.047 s]
[INFO] Zeppelin: Spark dependencies ....................... SUCCESS [01:51 min]
[INFO] Zeppelin: Spark .................................... SUCCESS [01:15 min]
[INFO] Zeppelin: Markdown interpreter ..................... SUCCESS [  1.860 s]
[INFO] Zeppelin: Angular interpreter ...................... SUCCESS [  0.646 s]
[INFO] Zeppelin: Shell interpreter ........................ SUCCESS [  0.621 s]
[INFO] Zeppelin: Livy interpreter ......................... SUCCESS [ 25.541 s]
[INFO] Zeppelin: HBase interpreter ........................ SUCCESS [ 11.993 s]
[INFO] Zeppelin: Apache Pig Interpreter ................... SUCCESS [ 10.638 s]
[INFO] Zeppelin: PostgreSQL interpreter ................... SUCCESS [  9.383 s]
[INFO] Zeppelin: JDBC interpreter ......................... SUCCESS [  4.049 s]
[INFO] Zeppelin: File System Interpreters ................. SUCCESS [  2.293 s]
[INFO] Zeppelin: Flink .................................... SUCCESS [ 19.473 s]
[INFO] Zeppelin: Apache Ignite interpreter ................ SUCCESS [  3.967 s]
[INFO] Zeppelin: Kylin interpreter ........................ SUCCESS [  1.507 s]
[INFO] Zeppelin: Python interpreter ....................... SUCCESS [  0.963 s]
[INFO] Zeppelin: Lens interpreter ......................... SUCCESS [  7.390 s]
[INFO] Zeppelin: Apache Cassandra interpreter ............. SUCCESS [01:31 min]
[INFO] Zeppelin: Elasticsearch interpreter ................ SUCCESS [  7.759 s]
[INFO] Zeppelin: BigQuery interpreter ..................... SUCCESS [  3.033 s]
[INFO] Zeppelin: Alluxio interpreter ...................... SUCCESS [  9.319 s]
[INFO] Zeppelin: web Application .......................... SUCCESS [08:56 min]
[INFO] Zeppelin: Server ................................... SUCCESS [ 50.740 s]
[INFO] Zeppelin: Packaging distribution ................... SUCCESS [  3.289 s]
[INFO] Zeppelin: R Interpreter ............................ SUCCESS [01:33 min]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 19:32 min
[INFO] Finished at: 2016-11-13T12:11:25+00:00
[INFO] Final Memory: 233M/921M
[INFO] ------------------------------------------------------------------------
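
A quick way to confirm the build actually produced the server artifacts (run from the cloned zeppelin directory; the 0.7.0-SNAPSHOT version shows up in the logs further down) is:

# list the server jars produced by the build
ls zeppelin-server/target/*.jar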

Next, I modified the zeppelin-env.sh file in conf/ to:

export MASTER=spark://master.Home:7077
export ZEPPELIN_PORT=9080
export ZEPPELIN_LOG_DIR=/opt/hadoop/zeppelin/logs
export ZEPPELIN_NOTEBOOK_DIR=/opt/hadoop/zeppelin/notebook              # Where notebook saved
export SPARK_HOME=/opt/hadoop/spark-current
export HADOOP_CONF_DIR=/opt/hadoop/etc/hadoop
export PYSPARK_PYTHON=$PYSPARK_PYTHON:/opt/hadoop/anaconda2/bin/python                  # path to the python command. must be the same path on the driver(Zeppelin) and all workers.
export PYTHONPATH=$PYTHONPATH:/opt/hadoop/anaconda2/lib/python2.7:/opt/hadoop/spark-current/python/lib/py4j-0.10.3-src.zip
export HBASE_HOME=/opt/hadoop/hbase_current
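
A quick way to rule out path typos is to check that every directory referenced above actually exists:

# each of these should print a real directory, not "No such file or directory"
ls -d /opt/hadoop/spark-current /opt/hadoop/etc/hadoop /opt/hadoop/anaconda2/bin /opt/hadoop/hbase_current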

Note that I had to change the port to 9080, because Zeppelin's default port, 8080, is also the port used by Spark 2.x.
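
One way to confirm the port conflict on CentOS 7 is:

# show which process is listening on 8080 (Spark, in my case)
ss -tlnp | grep ':8080'
# confirm 9080 is free before starting Zeppelin
ss -tlnp | grep ':9080'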

Now I start Zeppelin from bin/ with zeppelin-daemon.sh start and query its status:

Zeppelin running but process is dead                       [FAILED]

Going to http://localhost:9080 gives me a blank page.
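
A useful sanity check here (assuming the default PID directory $ZEPPELIN_HOME/run, i.e. ZEPPELIN_PID_DIR is unset) is to compare the daemon's PID file against the running JVMs:

# PID recorded by zeppelin-daemon.sh (assumes the default ZEPPELIN_PID_DIR)
cat $ZEPPELIN_HOME/run/zeppelin-*.pid
# check whether that PID is still alive
ps -p $(cat $ZEPPELIN_HOME/run/zeppelin-*.pid)
# list running JVMs; ZeppelinServer should show up here if the server started
jps -l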

My log file reads as follows:

Zeppelin is restarting
ZEPPELIN_CLASSPATH: :.:/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.111-1.b15.el7_2.x86_64/jre/lib:/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.111-1.b15.el7_2.x86_64/lib:/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.111-1.b15.el7_2.x86_64/lib/tools.jar:/opt/hadoop/lib/native/*:.:/opt/hadoop/hive-current/lib/*:.:/opt/hadoop/zeppelin/zeppelin-server/target/lib/*:/opt/hadoop/zeppelin/zeppelin-zengine/target/lib/*:/opt/hadoop/zeppelin/zeppelin-interpreter/target/lib/*:/opt/hadoop/zeppelin/*::/opt/hadoop/zeppelin/conf:/opt/hadoop/zeppelin/zeppelin-interpreter/target/classes:/opt/hadoop/zeppelin/zeppelin-zengine/target/classes:/opt/hadoop/zeppelin/zeppelin-server/target/classes
OpenJDK 64-Bit Server VM warning: ignoring option MaxPermSize=512m; support was removed in 8.0
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/hadoop/apache-hive-2.1.0-bin/lib/log4j-slf4j-impl-2.4.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/hadoop/zeppelin/zeppelin-server/target/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/hadoop/zeppelin/zeppelin-server/target/lib/zeppelin-interpreter-0.7.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/hadoop/zeppelin/zeppelin-zengine/target/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/hadoop/zeppelin/zeppelin-zengine/target/lib/zeppelin-interpreter-0.7.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/hadoop/zeppelin/zeppelin-interpreter/target/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
ERROR StatusLogger No log4j2 configuration file found. Using default configuration: logging only errors to the console.
Exception in thread "main" java.lang.IncompatibleClassChangeError: Implementing class
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
    at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
    at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
    at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at org.eclipse.jetty.server.ServerConnector.<init>(ServerConnector.java:96)
    at org.apache.zeppelin.server.ZeppelinServer.setupJettyServer(ZeppelinServer.java:207)
    at org.apache.zeppelin.server.ZeppelinServer.main(ZeppelinServer.java:128)
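
One thing that may be relevant: the ZEPPELIN_CLASSPATH line above includes /opt/hadoop/hive-current/lib/* (pulled in by the CLASSPATH exports in my .bash_profile), so a Jetty/servlet-api version clash seems possible, though that is only a guess on my part. The jars can be compared with:

# Jetty/servlet jars that Hive puts on the classpath
ls /opt/hadoop/hive-current/lib | grep -Ei 'jetty|servlet'
# Jetty/servlet jars shipped with the Zeppelin build
find $ZEPPELIN_HOME -name '*jetty*.jar' -o -name '*servlet*.jar' | sort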

Any idea why this is happening? Any help and/or suggestions are much appreciated. I forgot to mention that Hadoop and Spark are always running in the background.

Regards,

Christian

0 Answers:

No answers