Connecting to a Hive context gives me a NullPointerException

Date: 2016-06-14 09:43:18

Tags: apache-spark

I am using a simple Hive context with Spark in standalone mode:

import org.apache.spark.{SparkConf, SparkContext}

object App {
  def main(args: Array[String]) {
    val logFile = "src/main/resources/kv1.txt" // Should be some file on your system
    val conf = new SparkConf().setAppName("Simple Application").setMaster("local[4]")
    val sc = new SparkContext(conf)
    val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)

    sqlContext.sql("CREATE TABLE IF NOT EXISTS src (key INT, value STRING) ROW FORMAT DELIMITED FIELDS TERMINATED BY ',' LINES TERMINATED BY '\\n'")
    sqlContext.sql(s"LOAD DATA LOCAL INPATH '$logFile' INTO TABLE src")

    // Queries are expressed in HiveQL
    sqlContext.sql("FROM src SELECT key, value").collect().foreach(println)
  }
}
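To rule out the input file, here is a quick sanity check I can run first; it assumes the relative path resolves against the project root, which is the default working directory when running from IntelliJ:

import java.io.File

object PathCheck {
  def main(args: Array[String]) {
    // Sanity check only: the relative path resolves against the process
    // working directory (the project root when launched from IntelliJ).
    val kv1 = new File("src/main/resources/kv1.txt")
    println(s"exists=${kv1.exists} readable=${kv1.canRead} path=${kv1.getAbsolutePath}")
  }
}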

I have posted the relevant part of the log file below:

16/06/14 19:34:53 INFO ClientWrapper: Inspected Hadoop version: 2.2.0
16/06/14 19:34:53 INFO ClientWrapper: Loaded org.apache.hadoop.hive.shims.Hadoop23Shims for Hadoop version 2.2.0
16/06/14 19:34:53 INFO deprecation: mapred.max.split.size is deprecated. Instead, use mapreduce.input.fileinputformat.split.maxsize
16/06/14 19:34:53 INFO deprecation: mapred.reduce.tasks.speculative.execution is deprecated. Instead, use mapreduce.reduce.speculative
16/06/14 19:34:53 INFO deprecation: mapred.committer.job.setup.cleanup.needed is deprecated. Instead, use mapreduce.job.committer.setup.cleanup.needed
16/06/14 19:34:53 INFO deprecation: mapred.min.split.size.per.rack is deprecated. Instead, use mapreduce.input.fileinputformat.split.minsize.per.rack
16/06/14 19:34:53 INFO deprecation: mapred.min.split.size is deprecated. Instead, use mapreduce.input.fileinputformat.split.minsize
16/06/14 19:34:53 INFO deprecation: mapred.min.split.size.per.node is deprecated. Instead, use mapreduce.input.fileinputformat.split.minsize.per.node
16/06/14 19:34:53 INFO deprecation: mapred.reduce.tasks is deprecated. Instead, use mapreduce.job.reduces
16/06/14 19:34:53 INFO deprecation: mapred.input.dir.recursive is deprecated. Instead, use mapreduce.input.fileinputformat.input.dir.recursive
16/06/14 19:34:53 INFO HiveMetaStore: 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
16/06/14 19:34:53 INFO ObjectStore: ObjectStore, initialize called
16/06/14 19:34:54 INFO Persistence: Property hive.metastore.integral.jdo.pushdown unknown - will be ignored
16/06/14 19:34:54 INFO Persistence: Property datanucleus.cache.level2 unknown - will be ignored
16/06/14 19:34:55 INFO ObjectStore: Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
16/06/14 19:34:56 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
16/06/14 19:34:56 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
16/06/14 19:34:56 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
16/06/14 19:34:56 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
16/06/14 19:34:57 INFO MetaStoreDirectSql: Using direct SQL, underlying DB is DERBY
16/06/14 19:34:57 INFO ObjectStore: Initialized ObjectStore
16/06/14 19:34:57 WARN ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 1.2.0
16/06/14 19:34:57 WARN ObjectStore: Failed to get database default, returning NoSuchObjectException
16/06/14 19:34:57 WARN : Your hostname, SSA7201E-169 resolves to a loopback/non-reachable address: 140.159.218.207, but we couldn't find any external IP address!
16/06/14 19:34:57 INFO HiveMetaStore: Added admin role in metastore
16/06/14 19:34:57 INFO HiveMetaStore: Added public role in metastore
16/06/14 19:34:58 INFO HiveMetaStore: No user is added in admin role, since config is empty
16/06/14 19:34:58 INFO HiveMetaStore: 0: get_all_databases
16/06/14 19:34:58 INFO audit: ugi=s3911541  ip=unknown-ip-addr  cmd=get_all_databases   
16/06/14 19:34:58 INFO HiveMetaStore: 0: get_functions: db=default pat=*
16/06/14 19:34:58 INFO audit: ugi=s3911541  ip=unknown-ip-addr  cmd=get_functions: db=default pat=* 
16/06/14 19:34:58 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MResourceUri" is tagged as "embedded-only" so does not have its own datastore table.
Exception in thread "main" java.lang.RuntimeException: java.lang.NullPointerException
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
    at org.apache.spark.sql.hive.client.ClientWrapper.<init>(ClientWrapper.scala:204)
    at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:238)
    at org.apache.spark.sql.hive.HiveContext.executionHive$lzycompute(HiveContext.scala:218)
    at org.apache.spark.sql.hive.HiveContext.executionHive(HiveContext.scala:208)
    at org.apache.spark.sql.hive.HiveContext.functionRegistry$lzycompute(HiveContext.scala:462)
    at org.apache.spark.sql.hive.HiveContext.functionRegistry(HiveContext.scala:461)
    at org.apache.spark.sql.UDFRegistration.<init>(UDFRegistration.scala:40)
    at org.apache.spark.sql.SQLContext.<init>(SQLContext.scala:330)
    at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:90)
    at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:101)
    at App$.main(App.scala:27)
    at App.main(App.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at com.intellij.rt.execution.application.AppMain.main(AppMain.java:144)
Caused by: java.lang.NullPointerException
    at java.lang.ProcessBuilder.start(ProcessBuilder.java:1012)
    at org.apache.hadoop.util.Shell.runCommand(Shell.java:404)
    at org.apache.hadoop.util.Shell.run(Shell.java:379)
    at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:589)
    at org.apache.hadoop.util.Shell.execCommand(Shell.java:678)
    at org.apache.hadoop.util.Shell.execCommand(Shell.java:661)
    at org.apache.hadoop.fs.FileUtil.execCommand(FileUtil.java:1097)
    at org.apache.hadoop.fs.RawLocalFileSystem$RawLocalFileStatus.loadPermissionInfo(RawLocalFileSystem.java:567)
    at org.apache.hadoop.fs.RawLocalFileSystem$RawLocalFileStatus.getPermission(RawLocalFileSystem.java:542)
    at org.apache.hadoop.hive.ql.session.SessionState.createRootHDFSDir(SessionState.java:599)
    at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:554)
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:508)
    ... 17 more
16/06/14 19:34:58 INFO SparkContext: Invoking stop() from shutdown hook
16/06/14 19:34:58 INFO SparkUI: Stopped Spark web UI at http://140.159.218.207:4040
16/06/14 19:34:58 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
16/06/14 19:34:58 INFO MemoryStore: MemoryStore cleared
16/06/14 19:34:58 INFO BlockManager: BlockManager stopped
16/06/14 19:34:58 INFO BlockManagerMaster: BlockManagerMaster stopped
16/06/14 19:34:58 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
16/06/14 19:34:58 INFO SparkContext: Successfully stopped SparkContext
16/06/14 19:34:58 INFO ShutdownHookManager: Shutdown hook called
16/06/14 19:34:58 INFO ShutdownHookManager: Deleting directory C:\Users\s3911541.AD.000\AppData\Local\Temp\spark-56c72d84-b79b-4cc9-aaa6-b1818303dded
16/06/14 19:34:58 INFO ShutdownHookManager: Deleting directory C:\Users\s3911541.AD.000\AppData\Local\Temp\spark-94c1e22c-1d4c-4c40-93ea-309171c93b99
16/06/14 19:34:58 INFO RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
16/06/14 19:34:58 INFO RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
16/06/14 19:34:58 INFO RemoteActorRefProvider$RemotingTerminator: Remoting shut down.
16/06/14 19:34:58 ERROR ShutdownHookManager: Exception while deleting Spark temp dir: C:\Users\s3911541.AD.000\AppData\Local\Temp\spark-94c1e22c-1d4c-4c40-93ea-309171c93b99
java.io.IOException: Failed to delete: C:\Users\s3911541.AD.000\AppData\Local\Temp\spark-94c1e22c-1d4c-4c40-93ea-309171c93b99
    at org.apache.spark.util.Utils$.deleteRecursively(Utils.scala:928)
    at org.apache.spark.util.ShutdownHookManager$$anonfun$1$$anonfun$apply$mcV$sp$3.apply(ShutdownHookManager.scala:65)
    at org.apache.spark.util.ShutdownHookManager$$anonfun$1$$anonfun$apply$mcV$sp$3.apply(ShutdownHookManager.scala:62)
    at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
    at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:108)
    at org.apache.spark.util.ShutdownHookManager$$anonfun$1.apply$mcV$sp(ShutdownHookManager.scala:62)
    at org.apache.spark.util.SparkShutdownHook.run(ShutdownHookManager.scala:267)
    at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ShutdownHookManager.scala:239)
    at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:239)
    at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:239)
    at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1765)
    at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply$mcV$sp(ShutdownHookManager.scala:239)
    at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:239)
    at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:239)
    at scala.util.Try$.apply(Try.scala:161)
    at org.apache.spark.util.SparkShutdownHookManager.runAll(ShutdownHookManager.scala:239)
    at org.apache.spark.util.SparkShutdownHookManager$$anon$2.run(ShutdownHookManager.scala:218)
    at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:54)
Process finished with exit code 1
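The innermost NullPointerException is thrown from ProcessBuilder.start inside Hadoop's Shell, and the temp directories in the log show I am on Windows. One cause I have seen reported for exactly this trace is a missing winutils.exe; below is a minimal sketch of what I could try before creating the HiveContext, assuming that is the problem (the C:\winutils path is only a placeholder):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

object AppWithHadoopHome {
  def main(args: Array[String]) {
    // Assumption: winutils.exe is installed under C:\winutils\bin (placeholder path).
    // On Windows, Hadoop's Shell resolves winutils.exe via the hadoop.home.dir
    // system property (or the HADOOP_HOME environment variable); without it the
    // command handed to ProcessBuilder can contain null, which would explain the
    // NullPointerException in ProcessBuilder.start seen in the trace above.
    System.setProperty("hadoop.home.dir", "C:\\winutils")

    val conf = new SparkConf().setAppName("Simple Application").setMaster("local[4]")
    val sc = new SparkContext(conf)
    val sqlContext = new HiveContext(sc)
    sqlContext.sql("SHOW TABLES").collect().foreach(println)
    sc.stop()
  }
}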

My sbt file is trivial; I am not building any jars:

name := "ScalaTest2"

version := "1.0"

scalaVersion := "2.10.4"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.1"

libraryDependencies += "org.apache.spark" % "spark-hive_2.10" % "1.6.1"

// http://mvnrepository.com/artifact/org.apache.spark/spark-sql_2.10
libraryDependencies += "org.apache.spark" % "spark-sql_2.10" % "1.6.1"

Can anyone point me to how to solve this problem? I am sure the file path and the text file itself are correct (see the sanity check above). If anyone has any clue, please share.

Thanks

0 answers:

No answers