Illegal cyclic reference involving object scala

Date: 2017-06-20 01:50:01

Tags: apache-spark

I get Exception in thread "main" scala.reflect.internal.Symbols$CyclicReference: illegal cyclic reference involving object scala when trying to run an Apache Spark + Scala SQL example program. Spark version 2.1.1, Scala compiler version 2.11.
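For context, the sketch below shows a minimal sbt build matching the versions stated above (Spark 2.1.1, Scala 2.11). The question does not include the actual build file, so the layout, project name and exact Scala patch version are assumptions; it is included because a scala-reflect CyclicReference at runtime is commonly associated with mismatched Scala versions on the classpath, and pinning scalaVersion to the 2.11 line that Spark 2.1.1's default build targets is the usual way to rule that out.

    // build.sbt -- hypothetical sketch; the question does not show the real build definition.
    name := "SparkSQLExample"

    version := "0.1"

    // Spark 2.1.1's default build targets Scala 2.11, matching the compiler version in the question.
    scalaVersion := "2.11.8"

    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-core" % "2.1.1",
      "org.apache.spark" %% "spark-sql"  % "2.1.1"   // provides SQLContext, DataFrame and the implicits
    )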

    import org.apache.spark.SparkContext
    import org.apache.spark.SparkConf
    import org.apache.spark.sql.SQLContext

    object SparkSQL {

      def main(args: Array[String]) {

        val conf = new SparkConf().setAppName("SparkSQL").setMaster("local")

        // Create a Scala Spark context.
        val sc = new SparkContext(conf)
        val SQLContext = new SQLContext(sc)

        import SQLContext.implicits._

        val fruits = sc.textFile("fruits.txt")
          .map(_.split(","))
          .map(frt => Fruits(frt(0).trim.toInt, frt(1), frt(2).trim.toInt))
          .toDF()

        /**
         * Store the DataFrame data in a table
         */
        fruits.registerTempTable("fruits")

        /**
         * Select query on the DataFrame
         */
        val records = SQLContext.sql("SELECT * FROM fruits")

        /**
         * Show the result data of the records DataFrame
         */
        records.show()
      }
    }

    case class Fruits(id: Int, name: String, quantity: Int)
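For comparison, a minimal sketch of the same flow written against the Spark 2.x SparkSession entry point is shown below. This is not the original program, just an assumed rewrite using APIs available in Spark 2.1 (SparkSession, and createOrReplaceTempView in place of the deprecated registerTempTable); the file name, app name and the Fruits case class are taken from the question, and the object name is hypothetical.

    import org.apache.spark.sql.SparkSession

    // Sketch only: the example above rewritten with the Spark 2.x SparkSession entry point.
    object SparkSQLWithSession {

      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("SparkSQL")
          .master("local")
          .getOrCreate()

        import spark.implicits._   // enables .toDF() on RDDs of case classes

        // Reuses the Fruits case class defined above.
        val fruits = spark.sparkContext
          .textFile("fruits.txt")
          .map(_.split(","))
          .map(frt => Fruits(frt(0).trim.toInt, frt(1), frt(2).trim.toInt))
          .toDF()

        // createOrReplaceTempView replaces the deprecated registerTempTable.
        fruits.createOrReplaceTempView("fruits")

        val records = spark.sql("SELECT * FROM fruits")
        records.show()

        spark.stop()
      }
    }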

Contents of fruits.txt:
    1, Grapes, 25
    2, Guava, 28
    3, Gooseberry, 39
    4, Raisins, 23
    5, Naseberry, 23
Console output:

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
17/06/20 06:41:32 INFO SparkContext: Running Spark version 2.1.1
17/06/20 06:41:33 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/06/20 06:41:33 ERROR Shell: Failed to locate the winutils binary in the hadoop binary path
java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries.
    at org.apache.hadoop.util.Shell.getQualifiedBinPath(Shell.java:355)
    at org.apache.hadoop.util.Shell.getWinUtilsPath(Shell.java:370)
    at org.apache.hadoop.util.Shell.<clinit>(Shell.java:363)
    at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:79)
    at org.apache.hadoop.security.Groups.parseStaticMapping(Groups.java:116)
    at org.apache.hadoop.security.Groups.<init>(Groups.java:93)
    at org.apache.hadoop.security.Groups.<init>(Groups.java:73)
    at org.apache.hadoop.security.Groups.getUserToGroupsMappingService(Groups.java:293)
    at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:283)
    at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:260)
    at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:789)
    at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:774)
    at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:647)
    at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2391)
    at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2391)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.util.Utils$.getCurrentUserName(Utils.scala:2391)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:295)
    at SparkSQL$.main(SparkSQL.scala:11)
    at SparkSQL.main(SparkSQL.scala)
17/06/20 06:41:33 INFO SecurityManager: Changing view acls to: Sreeharsha
17/06/20 06:41:33 INFO SecurityManager: Changing modify acls to: Sreeharsha
17/06/20 06:41:33 INFO SecurityManager: Changing view acls groups to: 
17/06/20 06:41:33 INFO SecurityManager: Changing modify acls groups to: 
17/06/20 06:41:33 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(Sreeharsha); groups with view permissions: Set(); users  with modify permissions: Set(Sreeharsha); groups with modify permissions: Set()
17/06/20 06:41:34 INFO Utils: Successfully started service 'sparkDriver' on port 59741.
17/06/20 06:41:34 INFO SparkEnv: Registering MapOutputTracker
17/06/20 06:41:34 INFO SparkEnv: Registering BlockManagerMaster
17/06/20 06:41:34 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
17/06/20 06:41:34 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
17/06/20 06:41:34 INFO DiskBlockManager: Created local directory at C:\Users\Sreeharsha\AppData\Local\Temp\blockmgr-e0ce1a2f-8295-4d3d-b975-ab135d40dbde
17/06/20 06:41:34 INFO MemoryStore: MemoryStore started with capacity 896.4 MB
17/06/20 06:41:34 INFO SparkEnv: Registering OutputCommitCoordinator
17/06/20 06:41:34 INFO Utils: Successfully started service 'SparkUI' on port 4040.
17/06/20 06:41:34 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://192.168.1.8:4040
17/06/20 06:41:34 INFO Executor: Starting executor ID driver on host localhost
17/06/20 06:41:34 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 59750.
17/06/20 06:41:34 INFO NettyBlockTransferService: Server created on 192.168.1.8:59750
17/06/20 06:41:34 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
17/06/20 06:41:34 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 192.168.1.8, 59750, None)
17/06/20 06:41:34 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.1.8:59750 with 896.4 MB RAM, BlockManagerId(driver, 192.168.1.8, 59750, None)
17/06/20 06:41:34 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 192.168.1.8, 59750, None)
17/06/20 06:41:34 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, 192.168.1.8, 59750, None)
17/06/20 06:41:35 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 208.6 KB, free 896.2 MB)
17/06/20 06:41:35 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 20.1 KB, free 896.2 MB)
17/06/20 06:41:35 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on 192.168.1.8:59750 (size: 20.1 KB, free: 896.4 MB)
17/06/20 06:41:35 INFO SparkContext: Created broadcast 0 from textFile at SparkSQL.scala:24
Exception in thread "main" scala.reflect.internal.Symbols$CyclicReference: illegal cyclic reference involving object scala
    at scala.reflect.internal.Symbols$Symbol$$anonfun$info$3.apply(Symbols.scala:1502)
    at scala.reflect.internal.Symbols$Symbol$$anonfun$info$3.apply(Symbols.scala:1500)
    at scala.Function0$class.apply$mcV$sp(Function0.scala:34)
    at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
    at scala.reflect.internal.Symbols$Symbol.lock(Symbols.scala:546)
    at scala.reflect.internal.Symbols$Symbol.info(Symbols.scala:1500)
    at scala.reflect.runtime.SynchronizedSymbols$SynchronizedSymbol$$anon$2.scala$reflect$runtime$SynchronizedSymbols$SynchronizedSymbol$$super$info(SynchronizedSymbols.scala:171)
    at scala.reflect.runtime.SynchronizedSymbols$SynchronizedSymbol$$anonfun$info$1.apply(SynchronizedSymbols.scala:127)
    at scala.reflect.runtime.SynchronizedSymbols$SynchronizedSymbol$$anonfun$info$1.apply(SynchronizedSymbols.scala:127)
    at scala.reflect.runtime.Gil$class.gilSynchronized(Gil.scala:19)
    at scala.reflect.runtime.JavaUniverse.gilSynchronized(JavaUniverse.scala:16)
    at scala.reflect.runtime.SynchronizedSymbols$SynchronizedSymbol$class.gilSynchronizedIfNotThreadsafe(SynchronizedSymbols.scala:123)
    at scala.reflect.runtime.SynchronizedSymbols$SynchronizedSymbol$$anon$2.gilSynchronizedIfNotThreadsafe(SynchronizedSymbols.scala:171)
    at scala.reflect.runtime.SynchronizedSymbols$SynchronizedSymbol$class.info(SynchronizedSymbols.scala:127)
    at scala.reflect.runtime.SynchronizedSymbols$SynchronizedSymbol$$anon$2.info(SynchronizedSymbols.scala:171)
    at scala.reflect.internal.pickling.UnPickler$Scan.scala$reflect$internal$pickling$UnPickler$Scan$$fromName$1(UnPickler.scala:217)
    at scala.reflect.internal.pickling.UnPickler$Scan.readExtSymbol$1(UnPickler.scala:258)
    at scala.reflect.internal.pickling.UnPickler$Scan.readSymbol(UnPickler.scala:284)
    at scala.reflect.internal.pickling.UnPickler$Scan.readSymbolRef(UnPickler.scala:649)
    at scala.reflect.internal.pickling.UnPickler$Scan.readType(UnPickler.scala:417)
    at scala.reflect.internal.pickling.UnPickler$Scan$$anonfun$readTypeRef$1.apply(UnPickler.scala:658)
    at scala.reflect.internal.pickling.UnPickler$Scan$$anonfun$readTypeRef$1.apply(UnPickler.scala:658)
    at scala.reflect.internal.pickling.UnPickler$Scan.at(UnPickler.scala:179)
    at scala.reflect.internal.pickling.UnPickler$Scan.readTypeRef(UnPickler.scala:658)
    at scala.reflect.internal.pickling.UnPickler$Scan$$anonfun$readTypes$1$1.apply(UnPickler.scala:377)
    at scala.reflect.internal.pickling.UnPickler$Scan$$anonfun$readTypes$1$1.apply(UnPickler.scala:377)
    at scala.reflect.internal.pickling.PickleBuffer.until(PickleBuffer.scala:154)
    at scala.reflect.internal.pickling.UnPickler$Scan.readTypes$1(UnPickler.scala:377)
    at scala.reflect.internal.pickling.UnPickler$Scan.readType(UnPickler.scala:419)
    at scala.reflect.internal.pickling.UnPickler$Scan$LazyTypeRef$$anonfun$6.apply(UnPickler.scala:725)
    at scala.reflect.internal.pickling.UnPickler$Scan$LazyTypeRef$$anonfun$6.apply(UnPickler.scala:725)
    at scala.reflect.internal.pickling.UnPickler$Scan.at(UnPickler.scala:179)
    at scala.reflect.internal.pickling.UnPickler$Scan$LazyTypeRef.completeInternal(UnPickler.scala:725)
    at scala.reflect.internal.pickling.UnPickler$Scan$LazyTypeRef.complete(UnPickler.scala:749)
    at scala.reflect.internal.Symbols$Symbol.info(Symbols.scala:1514)
    at scala.reflect.runtime.SynchronizedSymbols$SynchronizedSymbol$$anon$2.scala$reflect$runtime$SynchronizedSymbols$SynchronizedSymbol$$super$info(SynchronizedSymbols.scala:171)
    at scala.reflect.runtime.SynchronizedSymbols$SynchronizedSymbol$$anonfun$info$1.apply(SynchronizedSymbols.scala:127)
    at scala.reflect.runtime.SynchronizedSymbols$SynchronizedSymbol$$anonfun$info$1.apply(SynchronizedSymbols.scala:127)
    at scala.reflect.runtime.Gil$class.gilSynchronized(Gil.scala:19)
    at scala.reflect.runtime.JavaUniverse.gilSynchronized(JavaUniverse.scala:16)
    at scala.reflect.runtime.SynchronizedSymbols$SynchronizedSymbol$class.gilSynchronizedIfNotThreadsafe(SynchronizedSymbols.scala:123)
    at scala.reflect.runtime.SynchronizedSymbols$SynchronizedSymbol$$anon$2.gilSynchronizedIfNotThreadsafe(SynchronizedSymbols.scala:171)
    at scala.reflect.runtime.SynchronizedSymbols$SynchronizedSymbol$class.info(SynchronizedSymbols.scala:127)
    at scala.reflect.runtime.SynchronizedSymbols$SynchronizedSymbol$$anon$2.info(SynchronizedSymbols.scala:171)
    at scala.reflect.internal.Types$TypeRef.thisInfo(Types.scala:2194)
    at scala.reflect.internal.Types$TypeRef.baseClasses(Types.scala:2199)
    at scala.reflect.internal.tpe.FindMembers$FindMemberBase.<init>(FindMembers.scala:17)
    at scala.reflect.internal.tpe.FindMembers$FindMember.<init>(FindMembers.scala:219)
    at scala.reflect.internal.Types$Type.scala$reflect$internal$Types$Type$$findMemberInternal$1(Types.scala:1014)
    at scala.reflect.internal.Types$Type.findMember(Types.scala:1016)
    at scala.reflect.internal.Types$Type.memberBasedOnName(Types.scala:631)
    at scala.reflect.internal.Types$Type.member(Types.scala:600)
    at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:48)
    at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:66)
    at scala.reflect.internal.Mirrors$RootsBase.getPackage(Mirrors.scala:173)
    at scala.reflect.internal.Definitions$DefinitionsClass.RuntimePackage$lzycompute(Definitions.scala:163)
    at scala.reflect.internal.Definitions$DefinitionsClass.RuntimePackage(Definitions.scala:163)
    at scala.reflect.internal.Definitions$DefinitionsClass.RuntimePackageClass$lzycompute(Definitions.scala:164)
    at scala.reflect.internal.Definitions$DefinitionsClass.RuntimePackageClass(Definitions.scala:164)
    at scala.reflect.internal.Definitions$DefinitionsClass.AnnotationDefaultAttr$lzycompute(Definitions.scala:1171)
    at scala.reflect.internal.Definitions$DefinitionsClass.AnnotationDefaultAttr(Definitions.scala:1170)
    at scala.reflect.internal.Definitions$DefinitionsClass.syntheticCoreClasses$lzycompute(Definitions.scala:1300)
    at scala.reflect.internal.Definitions$DefinitionsClass.syntheticCoreClasses(Definitions.scala:1299)
    at scala.reflect.internal.Mirrors$RootsBase.init(Mirrors.scala:263)
    at scala.reflect.runtime.JavaMirrors$class.scala$reflect$runtime$JavaMirrors$$createMirror(JavaMirrors.scala:32)
    at scala.reflect.runtime.JavaMirrors$$anonfun$runtimeMirror$1.apply(JavaMirrors.scala:49)
    at scala.reflect.runtime.JavaMirrors$$anonfun$runtimeMirror$1.apply(JavaMirrors.scala:47)
    at scala.reflect.runtime.Gil$class.gilSynchronized(Gil.scala:19)
    at scala.reflect.runtime.JavaUniverse.gilSynchronized(JavaUniverse.scala:16)
    at scala.reflect.runtime.JavaMirrors$class.runtimeMirror(JavaMirrors.scala:46)
    at scala.reflect.runtime.JavaUniverse.runtimeMirror(JavaUniverse.scala:16)
    at scala.reflect.runtime.JavaMirrors$JavaMirror.mirrorDefining(JavaMirrors.scala:566)
    at scala.reflect.runtime.SymbolLoaders$PackageScope$$anonfun$lookupEntry$1.apply(SymbolLoaders.scala:139)
    at scala.reflect.runtime.SymbolLoaders$PackageScope$$anonfun$lookupEntry$1.apply(SymbolLoaders.scala:126)
    at scala.reflect.runtime.Gil$class.gilSynchronized(Gil.scala:19)
    at scala.reflect.runtime.JavaUniverse.gilSynchronized(JavaUniverse.scala:16)
    at scala.reflect.runtime.SymbolLoaders$PackageScope.syncLockSynchronized(SymbolLoaders.scala:124)
    at scala.reflect.runtime.SymbolLoaders$PackageScope.lookupEntry(SymbolLoaders.scala:126)
    at scala.reflect.internal.tpe.FindMembers$FindMemberBase.walkBaseClasses(FindMembers.scala:88)
    at scala.reflect.internal.tpe.FindMembers$FindMemberBase.searchConcreteThenDeferred(FindMembers.scala:56)
    at scala.reflect.internal.tpe.FindMembers$FindMemberBase.apply(FindMembers.scala:48)
    at scala.reflect.internal.Types$Type.scala$reflect$internal$Types$Type$$findMemberInternal$1(Types.scala:1014)
    at scala.reflect.internal.Types$Type.findMember(Types.scala:1016)
    at scala.reflect.internal.Types$Type.memberBasedOnName(Types.scala:631)
    at scala.reflect.internal.Types$Type.member(Types.scala:600)
    at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:48)
    at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:66)
    at scala.reflect.internal.Mirrors$RootsBase.getClassByName(Mirrors.scala:102)
    at scala.reflect.internal.Mirrors$RootsBase.getRequiredClass(Mirrors.scala:105)
    at scala.reflect.internal.Definitions$DefinitionsClass.ObjectClass$lzycompute(Definitions.scala:257)
    at scala.reflect.internal.Definitions$DefinitionsClass.ObjectClass(Definitions.scala:257)
    at scala.reflect.internal.Definitions$DefinitionsClass.init(Definitions.scala:1394)
    at scala.reflect.runtime.JavaUniverse.init(JavaUniverse.scala:139)
    at scala.reflect.runtime.JavaUniverse.<init>(JavaUniverse.scala:78)
    at scala.reflect.runtime.package$.universe$lzycompute(package.scala:17)
    at scala.reflect.runtime.package$.universe(package.scala:17)
    at SparkSQL$.main(SparkSQL.scala:24)
    at SparkSQL.main(SparkSQL.scala)
17/06/20 06:41:36 INFO SparkContext: Invoking stop() from shutdown hook
17/06/20 06:41:36 INFO SparkUI: Stopped Spark web UI at http://192.168.1.8:4040
17/06/20 06:41:36 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
17/06/20 06:41:36 INFO MemoryStore: MemoryStore cleared
17/06/20 06:41:36 INFO BlockManager: BlockManager stopped
17/06/20 06:41:36 INFO BlockManagerMaster: BlockManagerMaster stopped
17/06/20 06:41:36 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
17/06/20 06:41:36 INFO SparkContext: Successfully stopped SparkContext
17/06/20 06:41:36 INFO ShutdownHookManager: Shutdown hook called
17/06/20 06:41:36 INFO ShutdownHookManager: Deleting directory C:\Users\Sreeharsha\AppData\Local\Temp\spark-ecadf545-1cae-45b2-b01f-f9a8dc5e4bd1

0 Answers:

No answers yet.