Apache Spark: creating a HiveContext - NoSuchMethodException

Date: 2016-03-10 15:24:04

Tags: java exception apache-spark hive

I have the following problem. My main method is:

import org.apache.spark.SparkConf;
import org.apache.spark.SparkContext;
import org.apache.spark.sql.hive.HiveContext;

public static void main(String[] args) {
    SparkConf conf = new SparkConf().setAppName("TestHive");
    SparkContext sc = new SparkContext(conf);
    HiveContext hiveContext = new HiveContext(sc);
}

I build it with mvn package and then submit my code, but I get the following exception and I cannot tell what is wrong:

sh spark-submit --class "TestHive" --master local[4] ~/target/test-1.0-SNAPSHOT-jar-with-dependencies.jar 

Exception in thread "main" java.lang.NoSuchMethodException: org.apache.hadoop.hive.conf.HiveConf.getTimeVar(org.apache.hadoop.hive.conf.HiveConf$ConfVars, java.util.concurrent.TimeUnit)

Please tell me where I went wrong.

PS: I built my Spark with Hive and the Thrift server:

Spark 1.5.2 built for Hadoop 2.4.0
Build flags: -Psparkr -Phadoop-2.4 -Phive -Phive-thriftserver -Pyarn
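
For reference, a build with those flags would correspond to roughly the following invocation of the standard Spark 1.x Maven build (a sketch; the exact command used is not shown here):

# Reconstructed from the build flags above; assumes a standard Spark 1.x
# source tree and Maven build.
mvn -Psparkr -Phadoop-2.4 -Phive -Phive-thriftserver -Pyarn -DskipTests clean package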

1 Answer:

Answer 0 (score 0):

This looks like a version conflict between the Spark components (spark-core, spark-sql, and spark-hive).

To avoid this kind of conflict, all of these components should be the same version, ideally matching the Spark installation you submit to. You can enforce that in pom.xml by defining a property named spark.version and referencing it from every Spark dependency, for example:

<properties>
    <spark.version>1.6.0</spark.version>
</properties>
<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-hive_2.10</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.10</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming_2.10</artifactId>
        <version>${spark.version}</version>
    </dependency>
</dependencies>
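
To verify that Maven actually resolves a single consistent set of Spark artifacts, you can inspect the dependency tree (a sketch, assuming a standard Maven setup):

# Lists the Spark artifacts Maven resolves for this build; any version
# mismatch shown here is a likely cause of the NoSuchMethodException.
mvn dependency:tree -Dincludes=org.apache.spark

When submitting with spark-submit against an existing Spark installation, these dependencies are also commonly given provided scope, so the jar-with-dependencies does not bundle a second, possibly different, copy of Spark.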