Error while compiling statement: FAILED: ParseException line 1:14 cannot recognize input near 'select' '*' 'from' in join source

Date: 2018-03-08 19:38:34

Tags: scala apache-spark jdbc hive apache-spark-sql

I have set up Scala and am trying to use Spark to load data into a Hive table. While connecting to the Hive table I run into the following error.

Code

import java.io.File
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().master("local[2]").appName("interfacing spark sql to hive metastore without configuration file")
      .config("hive.metastore.warehouse.dir", "C:\\xxxx\\xxxx\\xxxx\\")
      .enableHiveSupport() // don't forget to enable hive support
      .getOrCreate()

    val sc = spark.sparkContext
    val sqlContext = spark.sqlContext
    val driverName = "org.apache.hive.jdbc.HiveDriver"

    System.setProperty("javax.net.ssl.trustStore", "C:\\xxxx\\xxxx\\xxxx\\xxxx\\xxxx\\security\\jssecacerts")
    System.setProperty("java.security.krb5.debug","true")
    System.setProperty("java.security.krb5.conf",new File("C:\\xxxx\\xxxx\\krb5.conf").getAbsolutePath)
    System.setProperty("javax.security.auth.useSubjectCredsOnly", "false")
    System.setProperty("java.security.auth.login.config", new File("C:\\xxxx\\xxxx\\jaas.conf").getAbsolutePath)
    val hiveurl="jdbc:hive2://xxxxx.octorp.com:10000/devl_dkp;user=pcpdosr;password=Kopdevp1;ssl=true;AuthMech=3"
    //;mapred.job.queue.name=dkl"
    val connectionProperties = new java.util.Properties()

    sc.setLocalProperty("spark.scheduler.pool", "dkl")
    val hiveQuery = "select * from devl_dkp.employee"

    val hiveResult = spark.read.option("driver",driverName).jdbc(hiveurl, hiveQuery, connectionProperties).collect()

Error

exception caught: org.apache.hive.service.cli.HiveSQLException: Error while compiling statement: FAILED: ParseException line 1:14 cannot recognize input near 'select' '*' 'from' in join source
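For context, one likely cause: `DataFrameReader.jdbc` expects its second argument to be a table name, or a parenthesized subquery with an alias, because Spark wraps that argument as `SELECT * FROM <table>` when it builds the query it sends over JDBC. Passing a bare `select ...` string produces exactly this kind of ParseException from Hive. A minimal sketch of the aliased form, where the alias name `emp_alias` is purely illustrative:

```scala
// Sketch: wrap the query in parentheses and give it an alias so that
// Spark's generated "SELECT * FROM <dbtable>" remains valid SQL.
val hiveQuery = "(select * from devl_dkp.employee) emp_alias"

val hiveResult = spark.read
  .option("driver", driverName)
  .jdbc(hiveurl, hiveQuery, connectionProperties)
  .collect()
```

This follows the documented contract of the `dbtable` option: anything valid in a SQL `FROM` clause can be used, and a subquery used there must carry an alias.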

Any help is appreciated.

0 Answers:
