I set up the Hive metastore in MySQL, and I can access it through Hive and create databases and tables there. If I access Hive tables through spark-shell, the correct table information is fetched from the MySQL-backed metastore. But when I run the same code from Eclipse, it does not read from MySQL.
Welcome to
____ __
/ __/__ ___ _____/ /__
_\ \/ _ \/ _ `/ __/ '_/
/___/ .__/\_,_/_/ /_/\_\ version 2.0.0
/_/
Using Scala version 2.11.8 (OpenJDK 64-Bit Server VM, Java 1.8.0_151)
Type in expressions to have them evaluated.
Type :help for more information.
scala> val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)
warning: there was one deprecation warning; re-run with -deprecation for details
sqlContext: org.apache.spark.sql.hive.HiveContext = org.apache.spark.sql.hive.HiveContext@2b9e69fb
scala> sqlContext.sql("show databases");
res0: org.apache.spark.sql.DataFrame = [databaseName: string]
But if I try to access it from Eclipse, it does not point to MySQL; it uses Derby instead. Please see the log and hive-site.xml below for reference.
Note: the hive-site.xml files under hive/conf and spark/conf are identical.
package events

import org.apache.spark.{SparkConf, SparkContext}

object DataContent {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
    conf.setAppName("Word Count2").setMaster("local")
    val sc = new SparkContext(conf)
    println("Hello to Spark World")

    // HiveContext should pick up hive-site.xml from the classpath,
    // but here it falls back to an embedded Derby metastore
    val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)
    val query = sqlContext.sql("show databases")
    query.collect()
    println("Bye to Spark example2")
  }
}
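When a job is launched from Eclipse rather than spark-shell, hive-site.xml is typically not on the application classpath, so Spark falls back to a local Derby metastore (visible in the `underlying DB is DERBY` log line below). One workaround, sketched here under the assumption that a metastore service is running at `thrift://localhost:9083` as configured in hive-site.xml, is to pass the metastore URI explicitly when building the session:

```scala
import org.apache.spark.sql.SparkSession

// Sketch only: assumes a Hive metastore service is listening on
// thrift://localhost:9083 (e.g. started with `hive --service metastore`).
// Spark 2.x replaces HiveContext with SparkSession.enableHiveSupport().
object MetastoreCheck {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("Word Count2")
      .master("local")
      .config("hive.metastore.uris", "thrift://localhost:9083")
      .enableHiveSupport()
      .getOrCreate()

    // With the URI set, this should list the databases from the
    // MySQL-backed metastore rather than Derby's lone "default".
    spark.sql("show databases").show()
    spark.stop()
  }
}
```

Alternatively, copying hive-site.xml into the project's resources directory (so it lands on the runtime classpath) lets the plain HiveContext code above find the same configuration without hard-coding the URI.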
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
18/02/17 12:03:15 INFO SparkContext: Running Spark version 2.0.0
18/02/17 12:03:17 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
18/02/17 12:03:18 WARN Utils: Your hostname, ubuntu resolves to a loopback address: 127.0.1.1; using 192.168.189.136 instead (on interface ens33)
18/02/17 12:03:18 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
18/02/17 12:03:18 INFO SecurityManager: Changing view acls to: vm4learning
18/02/17 12:03:18 INFO SecurityManager: Changing modify acls to: vm4learning
18/02/17 12:03:18 INFO SecurityManager: Changing view acls groups to:
18/02/17 12:03:18 INFO SecurityManager: Changing modify acls groups to:
18/02/17 12:03:18 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(vm4learning); groups with view permissions: Set(); users with modify permissions: Set(vm4learning); groups with modify permissions: Set()
18/02/17 12:03:20 INFO Utils: Successfully started service 'sparkDriver' on port 45637.
18/02/17 12:03:20 INFO SparkEnv: Registering MapOutputTracker
18/02/17 12:03:20 INFO SparkEnv: Registering BlockManagerMaster
18/02/17 12:03:20 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-37db59bd-c12a-4603-ba9f-8e8fec88cc29
18/02/17 12:03:20 INFO MemoryStore: MemoryStore started with capacity 881.4 MB
18/02/17 12:03:21 INFO SparkEnv: Registering OutputCommitCoordinator
18/02/17 12:03:23 INFO Utils: Successfully started service 'SparkUI' on port 4040.
18/02/17 12:03:23 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://192.168.189.136:4040
18/02/17 12:03:23 INFO Executor: Starting executor ID driver on host localhost
18/02/17 12:03:23 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 38313.
18/02/17 12:03:23 INFO NettyBlockTransferService: Server created on 192.168.189.136:38313
18/02/17 12:03:23 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 192.168.189.136, 38313)
18/02/17 12:03:23 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.189.136:38313 with 881.4 MB RAM, BlockManagerId(driver, 192.168.189.136, 38313)
18/02/17 12:03:23 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 192.168.189.136, 38313)
Hello to Spark World
18/02/17 12:03:30 INFO HiveSharedState: Warehouse path is 'file:/home/vm4learning/workspace/Acumen/spark-warehouse'.
18/02/17 12:03:30 INFO SparkSqlParser: Parsing command: show databases
18/02/17 12:03:32 INFO HiveUtils: Initializing HiveMetastoreConnection version 1.2.1 using Spark classes.
18/02/17 12:03:34 INFO HiveMetaStore: 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
18/02/17 12:03:34 INFO ObjectStore: ObjectStore, initialize called
18/02/17 12:03:36 INFO Persistence: Property hive.metastore.integral.jdo.pushdown unknown - will be ignored
18/02/17 12:03:36 INFO Persistence: Property datanucleus.cache.level2 unknown - will be ignored
18/02/17 12:03:41 INFO ObjectStore: Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
18/02/17 12:03:47 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
18/02/17 12:03:47 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
18/02/17 12:03:49 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
18/02/17 12:03:49 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
18/02/17 12:03:49 INFO Query: Reading in results for query "org.datanucleus.store.rdbms.query.SQLQuery@0" since the connection used is closing
18/02/17 12:03:49 INFO MetaStoreDirectSql: Using direct SQL, underlying DB is DERBY
18/02/17 12:03:49 INFO ObjectStore: Initialized ObjectStore
18/02/17 12:03:50 INFO HiveMetaStore: Added admin role in metastore
18/02/17 12:03:50 INFO HiveMetaStore: Added public role in metastore
18/02/17 12:03:50 INFO HiveMetaStore: No user is added in admin role, since config is empty
18/02/17 12:03:51 INFO HiveMetaStore: 0: get_all_databases
18/02/17 12:03:51 INFO audit: ugi=vm4learning ip=unknown-ip-addr cmd=get_all_databases
18/02/17 12:03:51 INFO HiveMetaStore: 0: get_functions: db=default pat=*
18/02/17 12:03:51 INFO audit: ugi=vm4learning ip=unknown-ip-addr cmd=get_functions: db=default pat=*
18/02/17 12:03:51 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MResourceUri" is tagged as "embedded-only" so does not have its own datastore table.
18/02/17 12:03:52 INFO SessionState: Created local directory: /tmp/5c825a73-be72-4bd1-8bfc-966d8a095919_resources
18/02/17 12:03:52 INFO SessionState: Created HDFS directory: /tmp/hive/vm4learning/5c825a73-be72-4bd1-8bfc-966d8a095919
18/02/17 12:03:52 INFO SessionState: Created local directory: /tmp/vm4learning/5c825a73-be72-4bd1-8bfc-966d8a095919
18/02/17 12:03:52 INFO SessionState: Created HDFS directory: /tmp/hive/vm4learning/5c825a73-be72-4bd1-8bfc-966d8a095919/_tmp_space.db
18/02/17 12:03:52 INFO HiveClientImpl: Warehouse location for Hive client (version 1.2.1) is file:/home/vm4learning/workspace/Acumen/spark-warehouse
18/02/17 12:03:52 INFO SessionState: Created local directory: /tmp/32b99842-2ac2-491e-934d-9726a6213c37_resources
18/02/17 12:03:52 INFO SessionState: Created HDFS directory: /tmp/hive/vm4learning/32b99842-2ac2-491e-934d-9726a6213c37
18/02/17 12:03:52 INFO SessionState: Created local directory: /tmp/vm4learning/32b99842-2ac2-491e-934d-9726a6213c37
18/02/17 12:03:52 INFO SessionState: Created HDFS directory: /tmp/hive/vm4learning/32b99842-2ac2-491e-934d-9726a6213c37/_tmp_space.db
18/02/17 12:03:52 INFO HiveClientImpl: Warehouse location for Hive client (version 1.2.1) is file:/home/vm4learning/workspace/Acumen/spark-warehouse
18/02/17 12:03:53 INFO HiveMetaStore: 0: create_database: Database(name:default, description:default database, locationUri:file:/home/vm4learning/workspace/Acumen/spark-warehouse, parameters:{})
18/02/17 12:03:53 INFO audit: ugi=vm4learning ip=unknown-ip-addr cmd=create_database: Database(name:default, description:default database, locationUri:file:/home/vm4learning/workspace/Acumen/spark-warehouse, parameters:{})
18/02/17 12:03:55 INFO HiveMetaStore: 0: get_databases: *
18/02/17 12:03:55 INFO audit: ugi=vm4learning ip=unknown-ip-addr cmd=get_databases: *
18/02/17 12:03:56 INFO CodeGenerator: Code generated in 1037.20509 ms
Bye to Spark example2
18/02/17 12:03:56 INFO SparkContext: Invoking stop() from shutdown hook
18/02/17 12:03:57 INFO SparkUI: Stopped Spark web UI at http://192.168.189.136:4040
18/02/17 12:03:57 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
18/02/17 12:03:57 INFO MemoryStore: MemoryStore cleared
18/02/17 12:03:57 INFO BlockManager: BlockManager stopped
18/02/17 12:03:57 INFO BlockManagerMaster: BlockManagerMaster stopped
18/02/17 12:03:57 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
18/02/17 12:03:57 INFO SparkContext: Successfully stopped SparkContext
18/02/17 12:03:57 INFO ShutdownHookManager: Shutdown hook called
18/02/17 12:03:57 INFO ShutdownHookManager: Deleting directory /tmp/spark-6addf0da-f076-4dd1-a5eb-38dca93a2ad6
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!--
Licensed to the Apache Software Foundation (ASF) under one or more
contributor license agreements. See the NOTICE file distributed with
this work for additional information regarding copyright ownership.
The ASF licenses this file to You under the Apache License, Version 2.0
(the "License"); you may not use this file except in compliance with
the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
-->
<configuration>
  <property>
    <name>hive.metastore.uris</name>
    <value>thrift://localhost:9083</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://localhost/hcatalog?createDatabaseIfNotExist=true</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>root</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>vm4learning</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
  </property>
  <property>
    <name>hive.hwi.listen.host</name>
    <value>localhost</value>
  </property>
  <property>
    <name>hive.hwi.listen.port</name>
    <value>9999</value>
  </property>
  <property>
    <name>hive.hwi.war.file</name>
    <value>lib/hive-hwi-0.11.0.war</value>
  </property>
</configuration>