spark-shell: object XXX is not a member of package YYY

Asked: 2016-01-14 15:24:23

Tags: apache-spark

I am using spark-shell to help me develop a standalone Spark program.

When I run my program via spark-submit, the geohex package is imported correctly:

package com.verve.parentchild

import org.apache.spark._
import org.apache.spark.SparkContext._
import org.elasticsearch.spark._
import net.geohex._
import org.apache.spark.sql.SQLContext
import org.apache.spark.sql.types.{FloatType, DateType, StructType, StructField, StringType, IntegerType};

object Sessions {
    def main(args: Array[String]) {
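
For context, the spark-submit path works because the dependency is resolved at build time. A minimal build.sbt along these lines would do it — note that the GeoHex artifact coordinates are not published under well-known Maven coordinates, so the commented unmanaged-jar approach is a common workaround; everything below is an illustrative sketch, not taken from the question:

```scala
// build.sbt -- minimal sketch for the standalone program above.
// Spark versions match the shell output later in the question.
name := "parentchild"
scalaVersion := "2.10.4"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.4.0" % "provided",
  "org.apache.spark" %% "spark-sql"  % "1.4.0" % "provided"
)

// GeoHex is typically not on Maven Central, so it is often added
// as an unmanaged jar dropped into lib/ (path is a placeholder):
// unmanagedJars in Compile += file("lib/geohex.jar")
```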

But when I try to import the GeoHex package in spark-shell, I get an error:

➜  spark git:(master) ✗ spark-shell       
16/01/14 07:11:29 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/01/14 07:11:29 INFO SecurityManager: Changing view acls to: jspooner
16/01/14 07:11:29 INFO SecurityManager: Changing modify acls to: jspooner
16/01/14 07:11:29 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(jspooner); users with modify permissions: Set(jspooner)
16/01/14 07:11:29 INFO HttpServer: Starting HTTP Server
16/01/14 07:11:29 INFO Utils: Successfully started service 'HTTP class server' on port 55790.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.4.0
      /_/

Using Scala version 2.10.4 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_05)
Type in expressions to have them evaluated.
Type :help for more information.
16/01/14 07:11:32 INFO SparkContext: Running Spark version 1.4.0
16/01/14 07:11:32 INFO SecurityManager: Changing view acls to: jspooner
16/01/14 07:11:32 INFO SecurityManager: Changing modify acls to: jspooner
16/01/14 07:11:32 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(jspooner); users with modify permissions: Set(jspooner)
16/01/14 07:11:32 INFO Slf4jLogger: Slf4jLogger started
16/01/14 07:11:32 INFO Remoting: Starting remoting
16/01/14 07:11:32 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriver@192.168.1.130:55793]
16/01/14 07:11:32 INFO Utils: Successfully started service 'sparkDriver' on port 55793.
16/01/14 07:11:32 INFO SparkEnv: Registering MapOutputTracker
16/01/14 07:11:32 INFO SparkEnv: Registering BlockManagerMaster
16/01/14 07:11:32 INFO DiskBlockManager: Created local directory at /private/var/folders/5d/q9yy3dwn2kv6q3xkqqb5dfv80000gp/T/spark-a2b2a7ea-54c8-465f-b9c5-3a7c07571879/blockmgr-61509001-188c-4a31-b417-f593b69753bb
16/01/14 07:11:32 INFO MemoryStore: MemoryStore started with capacity 265.1 MB
16/01/14 07:11:32 INFO HttpFileServer: HTTP File server directory is /private/var/folders/5d/q9yy3dwn2kv6q3xkqqb5dfv80000gp/T/spark-a2b2a7ea-54c8-465f-b9c5-3a7c07571879/httpd-235c0b35-10d6-4f47-848f-b8c43b52a2bd
16/01/14 07:11:32 INFO HttpServer: Starting HTTP Server
16/01/14 07:11:32 INFO Utils: Successfully started service 'HTTP file server' on port 55794.
16/01/14 07:11:32 INFO SparkEnv: Registering OutputCommitCoordinator
16/01/14 07:11:32 INFO Utils: Successfully started service 'SparkUI' on port 4040.
16/01/14 07:11:32 INFO SparkUI: Started SparkUI at http://192.168.1.130:4040
16/01/14 07:11:32 INFO Executor: Starting executor ID driver on host localhost
16/01/14 07:11:32 INFO Executor: Using REPL class URI: http://192.168.1.130:55790
16/01/14 07:11:32 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 55795.
16/01/14 07:11:32 INFO NettyBlockTransferService: Server created on 55795
16/01/14 07:11:32 INFO BlockManagerMaster: Trying to register BlockManager
16/01/14 07:11:32 INFO BlockManagerMasterEndpoint: Registering block manager localhost:55795 with 265.1 MB RAM, BlockManagerId(driver, localhost, 55795)
16/01/14 07:11:32 INFO BlockManagerMaster: Registered BlockManager
16/01/14 07:11:32 INFO SparkILoop: Created spark context..
Spark context available as sc.
16/01/14 07:11:33 INFO HiveContext: Initializing execution hive, version 0.13.1
16/01/14 07:11:33 INFO HiveMetaStore: 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
16/01/14 07:11:33 INFO ObjectStore: ObjectStore, initialize called
16/01/14 07:11:33 INFO Persistence: Property hive.metastore.integral.jdo.pushdown unknown - will be ignored
16/01/14 07:11:33 INFO Persistence: Property datanucleus.cache.level2 unknown - will be ignored
16/01/14 07:11:33 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
16/01/14 07:11:33 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
16/01/14 07:11:34 INFO ObjectStore: Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
16/01/14 07:11:34 INFO MetaStoreDirectSql: MySQL check failed, assuming we are not on mysql: Lexical error at line 1, column 5.  Encountered: "@" (64), after : "".
16/01/14 07:11:35 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
16/01/14 07:11:35 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
16/01/14 07:11:35 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
16/01/14 07:11:35 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
16/01/14 07:11:35 INFO ObjectStore: Initialized ObjectStore
16/01/14 07:11:35 WARN ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 0.13.1aa
16/01/14 07:11:35 INFO HiveMetaStore: Added admin role in metastore
16/01/14 07:11:35 INFO HiveMetaStore: Added public role in metastore
16/01/14 07:11:35 INFO HiveMetaStore: No user is added in admin role, since config is empty
16/01/14 07:11:36 INFO SessionState: No Tez session required at this point. hive.execution.engine=mr.
16/01/14 07:11:36 INFO SparkILoop: Created sql context (with Hive support)..
SQL context available as sqlContext.

scala> import net.geohex._
<console>:19: error: object geohex is not a member of package net
       import net.geohex._
                  ^

scala> 

Tathagata Das says in this article that they split these non-core features into separate subprojects so that their dependencies don't collide with or pollute those of core Spark. His example is someone trying to import Twitter streaming, but I'm just trying to import a simple GeoHex class.

1 Answer:

Answer 0 (score: 0)

Try passing the jar on the command line when executing the shell (for example with --jars), explained here.

You can get similar help with --driver-library-path.
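
A sketch of what that looks like when launching the shell — the jar path below is a placeholder, not from the question:

```
# Make the GeoHex classes visible to the spark-shell REPL.
# Point the path at your actual jar.
spark-shell --jars /path/to/geohex.jar

# --driver-class-path only prepends to the driver's classpath;
# --jars also ships the jar to executors, which matters on a cluster.
spark-shell --driver-class-path /path/to/geohex.jar
```

After launching with --jars, the `import net.geohex._` line in the REPL should resolve, since the REPL compiles against the driver's classpath.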