I have the following tools: Hadoop 2.6.0, Hive 0.14.0, HBase 0.94.8, Sqoop 1.4.5, and Pig 0.14.0, installed in a pseudo-distributed setup on Ubuntu 14.04.
My goal is to use HCatalog as a common interface for Hive, Pig, and MapReduce applications.
Steps I performed:
1. I configured MySQL as the remote metastore and copied mysql-connector-java-5.1.37.jar into HIVE_HOME/lib. I created a hive-site.xml in HIVE_HOME/conf for a remote metastore, although the metastore runs on the same machine.
2. I have a hive-env.sh file in which HADOOP_HOME points to the Hadoop 2.6.0 home.
3. I run the remote metastore on port 9083.
4. In my .bashrc file I have the following environment variables set:
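For reference, the relevant part of my hive-site.xml looks roughly like this (a sketch from memory; the database name "metastore" and the connection details are placeholders, not my exact values):

```xml
<property>
  <name>hive.metastore.uris</name>
  <value>thrift://localhost:9083</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:mysql://localhost:3306/metastore</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.mysql.jdbc.Driver</value>
</property>
```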
#Hadoop variables start
export JAVA_HOME=/usr/lib/jvm/java-7-oracle
export HADOOP_HOME=/home/user/hadoop-2.6.0
export PATH=$PATH:$HADOOP_HOME/bin
export PATH=$PATH:$HADOOP_HOME/sbin
export HADOOP_MAPRED_HOME=$HADOOP_HOME
export HADOOP_COMMON_HOME=$HADOOP_HOME
export HADOOP_HDFS_HOME=$HADOOP_HOME
export YARN_HOME=$HADOOP_HOME
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"
#Hadoop variables end
export HADOOP_USER_CLASSPATH_FIRST=true
export PIG_USER_CLASSPATH_FIRST=true
#PIG ENV VARIABLE
export PIG_HOME=/home/user/pig-0.14.0
export PATH=$PATH:$PIG_HOME/bin
#Hive Env Variable
export HIVE_HOME=/home/user/hive-0.14.0/apache-hive-0.14.0-bin
export PATH=$PATH:$HIVE_HOME/bin
#HCatalog env
export HCAT_HOME=$HIVE_HOME/hcatalog
export HCAT_HOME
export PATH=$PATH:$HCAT_HOME/bin
HCATJAR=$HCAT_HOME/share/hcatalog/hive-hcatalog-core-0.14.0.jar
export HCATJAR
HCATPIGJAR=$HCAT_HOME/share/hcatalog/hive-hcatalog-pig-adapter-0.14.0.jar
export HCATPIGJAR
export HADOOP_CLASSPATH=$HCATJAR:$HCATPIGJAR:$HIVE_HOME/lib/hive-exec-0.14.0.jar\
:$HIVE_HOME/lib/hive-metastore-0.14.0.jar:$HIVE_HOME/lib/jdo-api-*.jar:$HIVE_HOME/lib/libfb303-*.jar\
:$HIVE_HOME/lib/libthrift-*.jar:$HIVE_HOME/conf:$HADOOP_HOME/etc/hadoop
#Pig hcatalog integration
export PIG_OPTS=-Dhive.metastore.uris=thrift://localhost:9083
export PIG_CLASSPATH=$HCAT_HOME/share/hcatalog/*:$HIVE_HOME/lib/*:$HCATPIGJAR:$HIVE_HOME/conf:$HADOOP_HOME/etc/hadoop
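One check I can run is whether a Hadoop 1.x client jar sneaks onto these classpaths through the wildcards (e.g. $HIVE_HOME/lib/*). A minimal sketch (the helper name list_hadoop1_jars is mine, and the jar-name patterns are an assumption about how Hadoop 1.x artifacts are named):

```shell
#!/usr/bin/env bash
# Sketch of a helper that lists Hadoop 1.x client jars hiding in a
# colon-separated classpath. Jars named like hadoop-core-1.x.jar carry
# the old RPC client (IPC version 4), which cannot talk to a
# Hadoop 2.x NameNode (IPC version 9).
list_hadoop1_jars() {
  local cp="$1" entry f
  IFS=':' read -r -a entries <<< "$cp"
  for entry in "${entries[@]}"; do
    for f in $entry; do          # unquoted on purpose: expand globs such as .../lib/*
      case "$(basename "$f")" in
        hadoop-core-*.jar|hadoop-0.20*-core.jar) echo "$f" ;;
      esac
    done
  done
}

# Example: scan the Pig classpath built above
list_hadoop1_jars "$PIG_CLASSPATH"
```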
I am trying to invoke the "hcat" command under the HIVE_HOME/hcatalog/bin path. Below is the error being generated:
Exception in thread "main" java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException: Server IPC version 9 cannot communicate with client version 4
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:444)
at org.apache.hive.hcatalog.cli.HCatCli.main(HCatCli.java:149)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
Caused by: org.apache.hadoop.ipc.RemoteException: Server IPC version 9 cannot communicate with client version 4
at org.apache.hadoop.ipc.Client.call(Client.java:1070)
at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:225)
at com.sun.proxy.$Proxy5.getProtocolVersion(Unknown Source)
at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:396)
at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:379)
at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:119)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:238)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:203)
at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:89)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1386)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1404)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:254)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:123)
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:427)
... 6 more
Observation: After some googling, if I understand correctly, "Server IPC version 9 cannot communicate with client version 4" indicates a Hadoop version mismatch: a Hadoop 1.x client trying to talk to a Hadoop 2.x server. So I added HADOOP_HOME in hive-env.sh to point to hadoop-2.6.0, but the error persists. I am not sure what I am missing. Any help on this would be much appreciated.
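Since the error comes from a Hadoop 1.x client class, the next check I plan to run is a scan of the installed lib directories for bundled hadoop-core 1.x jars (older HBase 0.94.x releases, for instance, ship one in their lib directory). A sketch, where the directory list reflects my layout above and the HBase path is an assumption since I do not export HBASE_HOME:

```shell
#!/usr/bin/env bash
# Sketch: find bundled Hadoop 1.x jars under the given lib directories.
# The jar-name patterns are an assumption about Hadoop 1.x artifact names.
scan_for_old_hadoop() {
  local d
  for d in "$@"; do
    [ -d "$d" ] && find "$d" -name 'hadoop-core-*.jar' -o -name 'hadoop-0.20*-core.jar'
  done
}

# Directories from my setup; the HBase path is a guess at my install location.
scan_for_old_hadoop "$HIVE_HOME/lib" "$HCAT_HOME/share/hcatalog" "/home/user/hbase-0.94.8/lib"
```

Any jar this prints would need to be kept off HADOOP_CLASSPATH and PIG_CLASSPATH.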