Spark 2 fails to load data when instantiating HiveSessionState

Date: 2017-12-26 14:21:52

Tags: hadoop apache-spark-2.0

I am getting the following error while reading data with Spark 2 in cluster mode: "java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveSessionState':". After googling, I am completely stuck on this issue. Please help.

The code I am running:

val spark = SparkSession.builder.getOrCreate()

val lines: Dataset[String] = spark.read.textFile("/data/sample/abc.csv")

The exception is thrown on the line above.
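Note that the root cause at the bottom of the trace is a ClassNotFoundException for com.pepperdata.supervisor.agent.resource.LocalFileSystemWrapper: Hadoop resolves the local filesystem to a Pepperdata wrapper class (typically wired in via the fs.file.impl property in the cluster's Hadoop config) that is not on the application's classpath. A hypothetical helper for locating where that wrapper is configured, to be run against the cluster's Hadoop client configuration directory:

```shell
# Hypothetical diagnostic helper: report which Hadoop config file(s) in the
# given directory reference the Pepperdata wrapper class that the JVM
# cannot load. Pass the config directory (usually $HADOOP_CONF_DIR) as $1.
find_pepperdata_config() {
  grep -l "com.pepperdata" "$1"/*.xml
}
```

On the cluster this would be invoked as `find_pepperdata_config "$HADOOP_CONF_DIR"`; the file it prints (usually core-site.xml) is where the fs.file.impl override lives.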

Full exception stack trace:

ERROR yarn.ApplicationMaster: User class threw exception: java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveSessionState':
java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveSessionState':
    at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:981)
    at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:110)
    at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109)
    at org.apache.spark.sql.DataFrameReader.<init>(DataFrameReader.scala:549)
    at org.apache.spark.sql.SparkSession.read(SparkSession.scala:605)
    at com.abcd.Learning$.main(Learning.scala:26)
    at com.abcd.Learning.main(Learning.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:646)
Caused by: java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:978)
    ... 11 more
Caused by: java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveExternalCatalog':
    at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:169)
    at org.apache.spark.sql.internal.SharedState.<init>(SharedState.scala:86)
    at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
    at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.sql.SparkSession.sharedState$lzycompute(SparkSession.scala:101)
    at org.apache.spark.sql.SparkSession.sharedState(SparkSession.scala:100)
    at org.apache.spark.sql.internal.SessionState.<init>(SessionState.scala:157)
    at org.apache.spark.sql.hive.HiveSessionState.<init>(HiveSessionState.scala:32)
    ... 16 more
Caused by: java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:166)
    ... 24 more
Caused by: java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
    at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:353)
    at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:257)
    at org.apache.spark.sql.hive.HiveExternalCatalog.<init>(HiveExternalCatalog.scala:66)
    ... 29 more
Caused by: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class com.pepperdata.supervisor.agent.resource.LocalFileSystemWrapper not found
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:548)
    at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:188)
    ... 37 more
Caused by: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class com.pepperdata.supervisor.agent.resource.LocalFileSystemWrapper not found
    at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2199)
    at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2685)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2705)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:97)
    at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2748)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2730)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:385)
    at org.apache.hadoop.fs.FileSystem.getLocal(FileSystem.java:356)
    at org.apache.hadoop.hive.ql.session.SessionState.createPath(SessionState.java:666)
    at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:593)
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:526)
    ... 38 more
Caused by: java.lang.ClassNotFoundException: Class com.pepperdata.supervisor.agent.resource.LocalFileSystemWrapper not found
    at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2105)
    at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2197)
    ... 48 more 

1 answer:

Answer 0 (score: 0)

The solution given here is similar to what worked for me.

I did the following:

  • Zipped the Spark jars directory located at /usr/local/Cellar/apache-spark/2.1.0/libexec/jars and named the archive spark-jars.zip.
  • Copied spark-jars.zip to HDFS: $ hdfs dfs -copyFromLocal /usr/local/Cellar/apache-spark/2.1.0/libexec/spark-jars.zip hdfs:/user/<username>/
  • Passed the spark-jars.zip location in the configuration when submitting the Spark job: $ HADOOP_CONF_DIR=/Users/<username>/hadoop_conf spark-submit --conf spark.yarn.archive=hdfs:/user/<username>/spark-jars.zip --conf spark.dynamicAllocation.enabled=true --conf spark.shuffle.service.enabled=true --class "com.<whatever>.<package>" --master yarn --deploy-mode cluster --queue online1 --driver-memory 3G --executor-memory 3G ./build/libs/<main class>.jar
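The steps above can be sketched as one script. This is a deployment sketch, not a drop-in fix: the local Spark path, HDFS user directory, queue name, main class, and jar path are all from my setup (the placeholders in angle brackets must be filled in for yours), and it requires an HDFS client and a YARN cluster to run.

```shell
# 1. Zip the Spark jars directory (path from my Homebrew install of Spark 2.1.0).
cd /usr/local/Cellar/apache-spark/2.1.0/libexec
zip -r spark-jars.zip jars

# 2. Copy the archive to HDFS so YARN containers can localize it.
hdfs dfs -copyFromLocal spark-jars.zip hdfs:/user/<username>/

# 3. Submit the job, pointing spark.yarn.archive at the uploaded archive.
HADOOP_CONF_DIR=/Users/<username>/hadoop_conf spark-submit \
  --conf spark.yarn.archive=hdfs:/user/<username>/spark-jars.zip \
  --conf spark.dynamicAllocation.enabled=true \
  --conf spark.shuffle.service.enabled=true \
  --class "com.<whatever>.<package>" \
  --master yarn --deploy-mode cluster --queue online1 \
  --driver-memory 3G --executor-memory 3G \
  ./build/libs/<main class>.jar
```

Setting spark.yarn.archive makes YARN distribute a known-good set of Spark jars to the containers instead of whatever the cluster's default classpath provides.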