I am trying to resolve the following error:
13/05/05 19:49:04 INFO handler.OpenRegionHandler: Opening of region {NAME => '-ROOT-,,0', STARTKEY => '', ENDKEY => '', ENCODED => 70236052,} failed, marking as FAILED_OPEN in ZK
13/05/05 19:49:04 INFO regionserver.HRegionServer: Received request to open region: -ROOT-,,0.70236052
13/05/05 19:49:04 INFO regionserver.HRegion: Setting up tabledescriptor config now ...
13/05/05 19:49:04 ERROR handler.OpenRegionHandler: Failed open of region=-ROOT-,,0.70236052, starting to roll back the global memstore size.
java.lang.IllegalStateException: Could not instantiate a region instance.
at org.apache.hadoop.hbase.regionserver.HRegion.newHRegion(HRegion.java:3747)
at org.apache.hadoop.hbase.regionserver.HRegion.openHRegion(HRegion.java:3927)
at org.apache.hadoop.hbase.regionserver.handler.OpenRegionHandler.openRegion(OpenRegionHandler.java:332)
at org.apache.hadoop.hbase.regionserver.handler.OpenRegionHandler.process(OpenRegionHandler.java:108)
at org.apache.hadoop.hbase.executor.EventHandler.run(EventHandler.java:175)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:895)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918)
at java.lang.Thread.run(Thread.java:680)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.GeneratedConstructorAccessor17.newInstance(Unknown Source)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
at org.apache.hadoop.hbase.regionserver.HRegion.newHRegion(HRegion.java:3744)
... 7 more
Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost
at org.apache.hadoop.hbase.regionserver.HRegion.<init>(HRegion.java:421)
... 11 more
I have the following Maven dependencies:
<properties>
<hadoopCDHMRVersion>2.0.0-mr1-cdh4.2.0</hadoopCDHMRVersion>
<hadoopCDHVersion>2.0.0-cdh4.2.0</hadoopCDHVersion>
<hbaseCDHVersion>0.94.2-cdh4.2.0</hbaseCDHVersion>
</properties>
<dependencyManagement>
<dependencies>
<!-- Apache -->
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-core</artifactId>
<version>${hadoopCDHMRVersion}</version>
<exclusions>
<exclusion>
<groupId>tomcat</groupId>
<artifactId>jasper-compiler</artifactId>
</exclusion>
<exclusion>
<groupId>tomcat</groupId>
<artifactId>jasper-runtime</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-common</artifactId>
<version>${hadoopCDHVersion}</version>
<exclusions>
<exclusion>
<groupId>org.mockito</groupId>
<artifactId>mockito-all</artifactId>
</exclusion>
<exclusion>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
</exclusion>
<exclusion>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
</exclusion>
<exclusion>
<groupId>tomcat</groupId>
<artifactId>jasper-compiler</artifactId>
</exclusion>
<exclusion>
<groupId>tomcat</groupId>
<artifactId>jasper-runtime</artifactId>
</exclusion>
<exclusion>
<groupId>org.mortbay.jetty</groupId>
<artifactId>jetty</artifactId>
</exclusion>
<exclusion>
<groupId>org.mortbay.jetty</groupId>
<artifactId>jetty-util</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-hdfs</artifactId>
<version>${hadoopCDHVersion}</version>
</dependency>
<!-- Test -->
<dependency>
<groupId>org.apache.hbase</groupId>
<artifactId>hbase</artifactId>
<scope>test</scope>
<classifier>tests</classifier>
<version>${hbaseCDHVersion}</version>
</dependency>
<dependency>
<groupId>org.apache.hbase</groupId>
<artifactId>hbase</artifactId>
<scope>provided</scope>
<version>${hbaseCDHVersion}</version>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-test</artifactId>
<version>${hadoopCDHMRVersion}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-minicluster</artifactId>
<version>${hadoopCDHMRVersion}</version>
<scope>test</scope>
</dependency>
</dependencies>
</dependencyManagement>
I inherit the dependencies from the parent pom into the child pom. The code I am testing:
//Started a mini cluster to perform unit test
final Configuration startingConf = HBaseConfiguration.create();
startingConf.setLong("hbase.client.keyvalue.maxsize", 65536);
startingConf.setStrings(HConstants.ZOOKEEPER_QUORUM, "localhost");
startingConf.setStrings("mapreduce.jobtracker.address", "local");
startingConf.setLong(HConstants.HBASE_CLIENT_PAUSE, 50);
startingConf.setInt(HConstants.HBASE_CLIENT_RETRIES_NUMBER, 200);
testUtil = new HBaseTestingUtility(startingConf);
//point of failure
testUtil.startMiniCluster();
I get the error after the call to startMiniCluster(). It gets through most of the work of instantiating the environment, but fails partway through with the error above.
I have tried several things without success. Any pointers would be greatly appreciated.
Answer 0 (score: 1):
The problem was with the commons-configuration jar. The parent pom was pulling in version 1.9, which conflicted with the 1.6 version that the hadoop common jars ship with. The only way I could find the problem was to keep the dependencies in the parent pom to a minimum and then un-comment them one by one to narrow it down. Once the culprit was identified, the fix was simply to exclude the conflicting artifact within the hadoop commons dependency. Hope this helps someone.

The hadoop jars really ought to upgrade their commons-configuration, which is by now five years old. Alternatively, you can roll the newer jar back from 1.9 to 1.6.
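As an aside, a quicker way to surface this kind of conflict than un-commenting dependencies one by one is Maven's standard dependency:tree goal, filtered to the suspect artifact (this command is not from the original answer, just a general diagnostic):

mvn dependency:tree -Dincludes=commons-configuration

The output prints every path through which commons-configuration enters the build, so you can see at a glance that both 1.9 and 1.6 are being requested and which dependency wins.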
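And a minimal sketch of the 1.9-to-1.6 rollback mentioned above, done by pinning the version in the parent pom's dependencyManagement. Since dependencyManagement also governs transitive dependencies, the pin overrides whichever library drags in 1.9 (the 1.6 version number is taken from the answer; verify it matches what your hadoop-common actually expects):

<dependencyManagement>
  <dependencies>
    <!-- Force the commons-configuration version that hadoop-common was built against,
         so the 1.9 jar pulled in elsewhere never reaches the classpath. -->
    <dependency>
      <groupId>commons-configuration</groupId>
      <artifactId>commons-configuration</artifactId>
      <version>1.6</version>
    </dependency>
  </dependencies>
</dependencyManagement>

The exclusion route works too: add an <exclusion> for commons-configuration to whichever dependency brings in 1.9, in the same way the pom above already excludes jasper and jetty from the hadoop artifacts.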