I wrote a program that scans an HBase table into Spark. In my initial environment, my build.sbt file was:
name := "hbasetest"
version := "0.1"
scalaVersion := "2.11.12"
libraryDependencies ++= Seq(
"org.apache.spark" % "spark-core_2.11" % "2.3.1",
"org.apache.spark" % "spark-sql_2.11" % "2.3.1",
"org.apache.spark" % "spark-streaming_2.11" % "2.3.1",
"org.apache.spark" % "spark-streaming-kafka-0-10_2.11" % "2.3.1",
"org.apache.spark" % "spark-yarn_2.11" % "2.3.1",
"org.apache.hadoop" % "hadoop-core" % "2.6.0-mr1-cdh5.15.1",
"org.apache.hadoop" % "hadoop-common" % "2.7.2",
"org.apache.hadoop" % "hadoop-mapred" % "0.22.0",
"org.apache.hadoop" % "hadoop-client" % "2.7.2",
"org.apache.hadoop" % "hadoop-nfs" % "2.7.2",
"org.apache.hadoop" % "hadoop-hdfs" % "2.7.2",
"org.apache.hadoop" % "hadoop-hdfs-nfs" % "2.7.2",
"org.apache.hadoop" % "hadoop-mapreduce-client-core" % "2.7.2",
"org.apache.hadoop" % "hadoop-mapreduce" % "2.7.2",
"org.apache.hadoop" % "hadoop-mapreduce-client" % "2.7.2",
"org.apache.hadoop" % "hadoop-mapreduce-client-common" % "2.7.2",
"org.apache.hbase" % "hbase" % "2.1.0",
"org.apache.hbase" % "hbase-server" % "2.1.0",
"org.apache.hbase" % "hbase-common" % "2.1.0",
"org.apache.hbase" % "hbase-client" % "2.1.0",
"org.apache.hbase" % "hbase-protocol" % "2.1.0",
"org.apache.hbase" % "hbase-metrics" % "2.1.0",
"org.apache.hbase" % "hbase-metrics-api" % "2.1.0",
"org.apache.hbase" % "hbase-mapreduce" % "2.1.0",
"org.apache.hbase" % "hbase-zookeeper" % "2.1.0",
"org.apache.hbase" % "hbase-hadoop-compat" % "2.1.0",
"org.apache.hbase" % "hbase-hadoop2-compat" % "2.1.0",
"org.apache.hbase" % "hbase-spark" % "2.1.0-cdh6.1.0",
"com.fasterxml.jackson.core" % "jackson-core" % "2.9.2",
"com.fasterxml.jackson.module" % "jackson-module-scala_2.11" % "2.9.2"
)
resolvers += "Cloudera Repository" at "https://repository.cloudera.com/artifactory/cloudera-repos/"
resolvers += "Hortonworks Repository" at "http://repo.hortonworks.com/content/repositories/releases/"
resolvers += "clojars" at "https://clojars.org/repo"
resolvers += "conjars" at "http://conjars.org/repo"
resolvers += "Apache HBase" at "https://repository.apache.org/content/repositories/releases"
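For context, the scan itself follows the standard `newAPIHadoopRDD` + `TableInputFormat` pattern, roughly as sketched below (the table name is a placeholder; my real job does more with the rows):

```scala
import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.client.Result
import org.apache.hadoop.hbase.io.ImmutableBytesWritable
import org.apache.hadoop.hbase.mapreduce.TableInputFormat
import org.apache.spark.sql.SparkSession

object HBaseScan {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("hbasetest").getOrCreate()

    // TableInputFormat reads the HBase table as (row key, Result) pairs.
    val conf = HBaseConfiguration.create()
    conf.set(TableInputFormat.INPUT_TABLE, "my_table") // placeholder table name

    val rdd = spark.sparkContext.newAPIHadoopRDD(
      conf,
      classOf[TableInputFormat],
      classOf[ImmutableBytesWritable],
      classOf[Result])

    println(rdd.count())
    spark.stop()
  }
}
```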
But now I need to migrate from HBase 2.1.0 to HBase 1.2.6, and I get this exception:
Exception in thread "main" org.apache.hadoop.hbase.DoNotRetryIOException: /10.251.12.73:16020 is unable to read call parameter from client 10.74.155.52; java.lang.UnsupportedOperationException: GetRegionLoad
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.hadoop.hbase.ipc.RemoteWithExtrasException.instantiateException(RemoteWithExtrasException.java:100)
at org.apache.hadoop.hbase.ipc.RemoteWithExtrasException.unwrapRemoteException(RemoteWithExtrasException.java:90)
I found someone with the same problem here, and following that answer I changed
"org.apache.hbase" % "hbase-spark" % "2.1.0-cdh6.1.0"
to
"org.apache.hbase" % "hbase-spark" % "1.2.0-cdh5.16.0"
but I still get the same exception. And if I modify my build.sbt like this:
name := "hbasetest"
version := "0.1"
scalaVersion := "2.11.12"
libraryDependencies ++= Seq(
"org.apache.spark" % "spark-core_2.11" % "2.3.1",
"org.apache.spark" % "spark-sql_2.11" % "2.3.1",
"org.apache.spark" % "spark-streaming_2.11" % "2.3.1",
"org.apache.spark" % "spark-streaming-kafka-0-10_2.11" % "2.3.1",
"org.apache.spark" % "spark-yarn_2.11" % "2.3.1",
"org.apache.hadoop" % "hadoop-core" % "2.6.0-mr1-cdh5.15.1",
"org.apache.hadoop" % "hadoop-common" % "2.7.2",
"org.apache.hadoop" % "hadoop-mapred" % "0.22.0",
"org.apache.hadoop" % "hadoop-client" % "2.7.2",
"org.apache.hadoop" % "hadoop-nfs" % "2.7.2",
"org.apache.hadoop" % "hadoop-hdfs" % "2.7.2",
"org.apache.hadoop" % "hadoop-hdfs-nfs" % "2.7.2",
"org.apache.hadoop" % "hadoop-mapreduce-client-core" % "2.7.2",
"org.apache.hadoop" % "hadoop-mapreduce" % "2.7.2",
"org.apache.hadoop" % "hadoop-mapreduce-client" % "2.7.2",
"org.apache.hadoop" % "hadoop-mapreduce-client-common" % "2.7.2",
"org.apache.hbase" % "hbase" % "1.2.6",
"org.apache.hbase" % "hbase-server" % "1.2.6",
"org.apache.hbase" % "hbase-common" % "1.2.6",
"org.apache.hbase" % "hbase-client" % "1.2.6",
"org.apache.hbase" % "hbase-protocol" % "1.2.6",
"org.apache.hbase" % "hbase-metrics" % "2.1.0",
"org.apache.hbase" % "hbase-metrics-api" % "2.1.0",
"org.apache.hbase" % "hbase-mapreduce" % "2.1.0",
"org.apache.hbase" % "hbase-zookeeper" % "2.1.0",
"org.apache.hbase" % "hbase-hadoop-compat" % "1.2.6",
"org.apache.hbase" % "hbase-hadoop2-compat" % "1.2.6",
"org.apache.hbase" % "hbase-spark" % "1.2.0-cdh5.16.0",
"com.fasterxml.jackson.core" % "jackson-core" % "2.9.2",
"com.fasterxml.jackson.module" % "jackson-module-scala_2.11" % "2.9.2"
)
resolvers += "Cloudera Repository" at "https://repository.cloudera.com/artifactory/cloudera-repos/"
resolvers += "Hortonworks Repository" at "http://repo.hortonworks.com/content/repositories/releases/"
resolvers += "clojars" at "https://clojars.org/repo"
resolvers += "conjars" at "http://conjars.org/repo"
resolvers += "Apache HBase" at "https://repository.apache.org/content/repositories/releases"
I get even more errors and the build fails to compile. The errors are:
Class org.apache.hadoop.hbase.exceptions.DeserializationException not found - continuing with a stub.
Class org.apache.hadoop.hbase.filter.ByteArrayComparable not found - continuing with a stub.
Can you help me? Thanks!