java.lang.ClassNotFoundException: org.apache.hive.service.cli.HiveSQLException

Date: 2017-06-14 14:09:17

Tags: java maven hadoop hive

To connect to Hive from a Spring application, I added the following dependencies to my pom.xml.

<dependency>
    <groupId>org.apache.hive</groupId>
    <artifactId>hive-jdbc</artifactId>
    <version>2.1.1</version>
</dependency>
<dependency>
    <groupId>org.apache.thrift</groupId>
    <artifactId>libfb303</artifactId>
    <version>0.9.3</version>
    <exclusions>
        <exclusion>
            <groupId>org.apache.hive</groupId>
            <artifactId>hive-common</artifactId>
        </exclusion>
    </exclusions>
</dependency>
<dependency>
    <groupId>org.apache.hive</groupId>
    <artifactId>hive-service-rpc</artifactId>
    <version>2.1.1</version>
</dependency>
  

But I am still facing the following exception:

Caused by: java.lang.NoClassDefFoundError: org/apache/hive/service/cli/HiveSQLException
    at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:132)
    at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:107)
    at java.sql.DriverManager.getConnection(DriverManager.java:664)
    at java.sql.DriverManager.getConnection(DriverManager.java:208)
    at org.springframework.jdbc.datasource.DriverManagerDataSource.getConnectionFromDriverManager(DriverManagerDataSource.java:173)
    at org.springframework.jdbc.datasource.DriverManagerDataSource.getConnectionFromDriver(DriverManagerDataSource.java:164)
    at org.springframework.jdbc.datasource.AbstractDriverBasedDataSource.getConnectionFromDriver(AbstractDriverBasedDataSource.java:149)
    at org.springframework.jdbc.datasource.AbstractDriverBasedDataSource.getConnection(AbstractDriverBasedDataSource.java:119)
    at org.hibernate.ejb.connection.InjectedDataSourceConnectionProvider.getConnection(InjectedDataSourceConnectionProvider.java:71)
    at org.hibernate.cfg.SettingsFactory.buildSettings(SettingsFactory.java:113)
    at org.hibernate.cfg.Configuration.buildSettingsInternal(Configuration.java:2836)
    at org.hibernate.cfg.Configuration.buildSettings(Configuration.java:2832)
    at org.hibernate.cfg.Configuration.buildSessionFactory(Configuration.java:1843)
    at org.hibernate.ejb.Ejb3Configuration.buildEntityManagerFactory(Ejb3Configuration.java:906)
    at org.hibernate.ejb.HibernatePersistence.createContainerEntityManagerFactory(HibernatePersistence.java:74)
    at org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean.createNativeEntityManagerFactory(LocalContainerEntityManagerFactoryBean.java:268)
    at org.springframework.orm.jpa.AbstractEntityManagerFactoryBean.afterPropertiesSet(AbstractEntityManagerFactoryBean.java:310)
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1514)
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1452)
    ... 97 more
Caused by: java.lang.ClassNotFoundException: org.apache.hive.service.cli.HiveSQLException
    at org.apache.catalina.loader.WebappClassLoader.loadClass(WebappClassLoader.java:1713)
    at org.apache.catalina.loader.WebappClassLoader.loadClass(WebappClassLoader.java:1558)
    ... 116 more

Can anyone tell me whether I am missing any jars, or what the problem is? Using the same URL, I am able to connect through beeline:

 $HIVE_HOME/bin/beeline -u jdbc:hive2://localhost:10000/TEST_DB
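
The connection from the Spring application goes through DriverManager, roughly like this (a minimal sketch of what fails; the URL and the TEST_DB database match the beeline command above, while the empty user/password and the SHOW TABLES query are just placeholders):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveJdbcSketch {
    public static void main(String[] args) throws Exception {
        // Register the Hive JDBC driver; its HiveConnection constructor is
        // where the NoClassDefFoundError above is thrown.
        Class.forName("org.apache.hive.jdbc.HiveDriver");

        // Same URL that works from beeline.
        String url = "jdbc:hive2://localhost:10000/TEST_DB";
        try (Connection conn = DriverManager.getConnection(url, "", "");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SHOW TABLES")) {
            while (rs.next()) {
                System.out.println(rs.getString(1));
            }
        }
    }
}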

I also referred to the question below, but it did not answer my problem:

org.apache.hive.service.cli.HiveSQLException: java.lang.NoClassDefFoundError: org/apache/hadoop/ipc/CallerContext$Builder

2 Answers:

Answer 0 (score: 1)

I think you need hive-service.jar. Add it and verify whether it works.
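
A quick way to verify that the jar actually landed on the runtime classpath is to look up the missing class by name (a minimal sketch; org.apache.hive.service.cli.HiveSQLException is the class named in the stack trace, which ships in hive-service rather than hive-service-rpc):

public class ClasspathCheck {
    public static void main(String[] args) {
        try {
            // The class the ClassNotFoundException complains about.
            Class.forName("org.apache.hive.service.cli.HiveSQLException");
            System.out.println("hive-service classes are on the classpath");
        } catch (ClassNotFoundException e) {
            System.out.println("hive-service jar is still missing: " + e);
        }
    }
}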

Answer 1 (score: 1)

I was finally able to connect to Hive from the Spring application. The kind of exception I asked about is caused by compatibility problems between the jars, i.e. some versions are not compatible with the ones Spring uses. I am attaching my pom.xml for further clarification.

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <artifactId>spring-hadoop-samples-hive</artifactId>

    <name>Spring Hadoop Samples - Hive</name>

    <parent>
        <groupId>org.springframework.samples</groupId>
        <artifactId>spring-hadoop-samples</artifactId>
        <version>1.0.0.BUILD-SNAPSHOT</version>
        <relativePath>../parent/pom.xml</relativePath>
    </parent>

    <properties>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
        <spring.hadoop.version>2.3.0.M1</spring.hadoop.version>
        <hadoop.version>2.7.1</hadoop.version>
        <hive.version>1.2.1</hive.version>
        <!-- <hive.version>2.1.1</hive.version> -->
    </properties>

    <dependencies>

        <dependency>
            <groupId>org.springframework.data</groupId>
            <artifactId>spring-data-hadoop</artifactId>
            <version>${spring.hadoop.version}</version>
            <exclusions>
                <exclusion>
                    <groupId>org.springframework</groupId>
                    <artifactId>spring-context-support</artifactId>
                </exclusion>
            </exclusions>
        </dependency>

        <dependency>
            <groupId>org.springframework</groupId>
            <artifactId>spring-jdbc</artifactId>
            <version>${spring.version}</version>
        </dependency>

        <dependency>
            <groupId>org.springframework</groupId>
            <artifactId>spring-test</artifactId>
            <version>${spring.version}</version>
        </dependency>

        <dependency>
            <groupId>org.springframework</groupId>
            <artifactId>spring-tx</artifactId>
            <version>${spring.version}</version>
        </dependency>

        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-common</artifactId>
            <version>${hadoop.version}</version>
            <scope>compile</scope>
        </dependency>

        <dependency>
            <groupId>org.apache.hive</groupId>
            <artifactId>hive-metastore</artifactId>
            <version>${hive.version}</version>
        </dependency>

        <dependency>
            <groupId>org.apache.hive</groupId>
            <artifactId>hive-service</artifactId>
            <version>${hive.version}</version>
        </dependency>

        <dependency>
            <groupId>org.apache.thrift</groupId>
            <artifactId>libfb303</artifactId>
            <version>0.9.1</version>
        </dependency>

        <!-- runtime Hive deps start -->
        <dependency>
            <groupId>org.apache.hive</groupId>
            <artifactId>hive-common</artifactId>
            <version>${hive.version}</version>
            <scope>runtime</scope>
        </dependency>


        <dependency>
            <groupId>org.apache.hive</groupId>
            <artifactId>hive-jdbc</artifactId>
            <version>${hive.version}</version>
            <scope>runtime</scope>
        </dependency>

        <dependency>
            <groupId>org.apache.hive</groupId>
            <artifactId>hive-shims</artifactId>
            <version>${hive.version}</version>
            <scope>runtime</scope>
        </dependency>

        <dependency>
            <groupId>org.apache.hive</groupId>
            <artifactId>hive-serde</artifactId>
            <version>${hive.version}</version>
            <scope>runtime</scope>
        </dependency>

        <dependency>
            <groupId>org.apache.hive</groupId>
            <artifactId>hive-contrib</artifactId>
            <version>${hive.version}</version>
            <scope>runtime</scope>
        </dependency>

        <!-- runtime Hive deps end -->

        <dependency>
            <groupId>org.codehaus.groovy</groupId>
            <artifactId>groovy</artifactId>
            <version>1.8.5</version>
            <scope>runtime</scope>
        </dependency>

    </dependencies>

    <repositories>
        <repository>
            <id>spring-milestone</id>
            <url>http://repo.spring.io/libs-milestone</url>
        </repository>
    </repositories>

    <build>
        <plugins>
            <plugin>
                <groupId>org.codehaus.mojo</groupId>
                <artifactId>appassembler-maven-plugin</artifactId>
                <version>1.2.2</version>
                <configuration>
                    <repositoryLayout>flat</repositoryLayout>
                    <configurationSourceDirectory>src/main/config</configurationSourceDirectory>
                    <copyConfigurationDirectory>true</copyConfigurationDirectory>
                    <!-- Extra JVM arguments that will be included in the bin scripts -->
                    <extraJvmArguments>-Xms512m -Xmx1024m -Dhive.version=${hive.version}</extraJvmArguments>
                    <programs>
                        <program>
                            <mainClass>org.springframework.samples.hadoop.hive.HiveApp</mainClass>
                            <name>hiveApp</name>
                        </program>
                        <program>
                            <mainClass>org.springframework.samples.hadoop.hive.HiveClientApp</mainClass>
                            <name>hiveClientApp</name>
                        </program>
                        <program>
                            <mainClass>org.springframework.samples.hadoop.hive.HiveAppWithApacheLogs</mainClass>
                            <name>hiveAppWithApacheLogs</name>
                        </program>
                    </programs>
                </configuration>
                <executions>
                    <execution>
                        <id>package</id>
                        <goals>
                            <goal>assemble</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-antrun-plugin</artifactId>
                <executions>
                    <execution>
                        <id>config</id>
                        <phase>package</phase>
                        <configuration>
                            <tasks>
                                <copy todir="target/appassembler/data">
                                    <fileset dir="data"/>
                                </copy>
                            </tasks>
                        </configuration>
                        <goals>
                            <goal>run</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>

</project>

After that, you may run into permission problems on HDFS while running the application. Execute the following commands:

hadoop fs -mkdir /tmp
hadoop fs -chmod a+w /tmp
hadoop fs -mkdir -p /user/hive/warehouse
hadoop fs -chmod a+w /user/hive/warehouse

If the folder structure does not already exist, these commands create it and grant read and write permissions.