Hive table not visible in Spark 2.1.1

Asked: 2017-06-01 21:29:23

Tags: java hadoop apache-spark hive

I created a database and a table from the Hive terminal and loaded data into it. However, when I try to access that table from Spark, I get the following exception:

Exception in thread "main" org.apache.spark.sql.AnalysisException: Table or view not found: employee;

As suggested in several other SO answers, I have placed my hive-site.xml in the spark/conf directory.

Here is what I added to hive-site.xml:

<property>
  <name>hive.exec.local.scratchdir</name>
  <value>/usr/local/hive/iotmp</value>
  <description>Local scratch space for Hive jobs</description>
</property>

<property>
  <name>hive.querylog.location</name>
  <value>/usr/local/hive/iotmp</value>
  <description>Location of Hive run time structured log file</description>
</property>

<property>
  <name>hive.downloaded.resources.dir</name>
  <value>/usr/local/hive/iotmp</value>
  <description>Temporary local directory for added resources in the remote file system.</description>
</property>

<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:derby://localhost:1527/metastore_db;create=true </value>
  <description>JDBC connect string for a JDBC metastore </description>
</property>
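One thing worth double-checking in the config above: the `javax.jdo.option.ConnectionURL` uses Derby's network form (`jdbc:derby://localhost:1527/...`), which only works if the Derby Network Server is actually running on port 1527 and the client driver is on the classpath; there is also trailing whitespace inside the `<value>` element, which can end up as part of the URL. For comparison, Hive's out-of-the-box default is the embedded form — this is a sketch of that stock default, not necessarily the right fix for this setup:

```xml
<!-- Hive's stock default: embedded Derby, no network server required.
     Note there is no host:port and no stray whitespace inside <value>. -->
<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:derby:;databaseName=metastore_db;create=true</value>
  <description>JDBC connect string for a JDBC metastore</description>
</property>
```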

Here is my Java code:

import org.apache.spark.SparkConf;
import org.apache.spark.sql.SparkSession;

public class SparkHiveExample {
    public static void main(String[] args) {
        SparkSession session = SparkSession.builder()
            .master("local")
            .appName("spark session example")
            .enableHiveSupport()
            .getOrCreate();
        System.out.println("show tables ---> " + session.sql("select * from employee").count());
    }
}

I run it with the following command:

spark-submit --class SparkHiveExample --master local[4] /Users/akanchan/Documents/workspace/testone/target/simple-project-1.0.jar
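If Spark cannot actually read that hive-site.xml, `enableHiveSupport()` silently falls back to a fresh local embedded metastore, which would explain "Table or view not found" even though the table exists in Hive. A quick sanity check is sketched below — the paths are assumptions based on the question (`/usr/local/hive` for the Hive install), so adjust `SPARK_HOME` and the Hive conf directory to the actual layout:

```shell
# Assumption: these paths match the installs described in the question.
cp /usr/local/hive/conf/hive-site.xml "$SPARK_HOME/conf/"

# If an earlier run fell back to a local embedded metastore, Spark leaves
# these behind in whatever directory spark-submit was launched from:
ls -d metastore_db derby.log
```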

Please help me resolve this issue.

0 Answers