Zeppelin: Error while instantiating 'org.apache.spark.sql.hive.HiveSessionStateBuilder'

Date: 2018-04-27 12:43:26

Tags: scala apache-spark hive apache-zeppelin

I am getting the following error when I run the code in Zeppelin, while the same code runs perfectly fine in the Spark shell. I have Zeppelin 0.7.3, Spark 2.2, and Hive 1.2.1.

val hierbtmupDF = spark.read
  .format("jdbc")
  .option("url", "jdbc:teradata://teradata-dns-sysa.xx.xxxx.com")
  .option("driver", "com.teradata.jdbc.TeraDriver")
  .option("dbtable", "xxxx.xxxx")
  .option("user", "xxxxx")
  .option("password", "xxxxxxxx")
  .load()

java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveSessionStateBuilder':
  at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$instantiateSessionState(SparkSession.scala:1074)
  at org.apache.spark.sql.SparkSession$$anonfun$sessionState$2.apply(SparkSession.scala:141)
  at org.apache.spark.sql.SparkSession$$anonfun$sessionState$2.apply(SparkSession.scala:140)
  at scala.Option.getOrElse(Option.scala:121)
  at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:140)
  at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:137)
  at org.apache.spark.sql.DataFrameReader.<init>(DataFrameReader.scala:689)
  at org.apache.spark.sql.SparkSession.read(SparkSession.scala:649)
  ... 54 elided
Caused by: org.apache.spark.sql.AnalysisException: java.lang.NoSuchMethodError: com.facebook.fb303.FacebookService$Client.sendBaseOneway(Ljava/lang/String;Lorg/apache/thrift/TBase;)V;
  at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:106)
  at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:193)
  at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:105)
  at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:93)
  at org.apache.spark.sql.hive.HiveSessionStateBuilder.externalCatalog(HiveSessionStateBuilder.scala:39)
  at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog$lzycompute(HiveSessionStateBuilder.scala:54)
  at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog(HiveSessionStateBuilder.scala:52)
  at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog(HiveSessionStateBuilder.scala:35)
  at org.apache.spark.sql.internal.BaseSessionStateBuilder.build(BaseSessionStateBuilder.scala:289)
  at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$instantiateSessionState(SparkSession.scala:1071)
  ... 61 more
Caused by: java.lang.NoSuchMethodError: com.facebook.fb303.FacebookService$Client.sendBaseOneway(Ljava/lang/String;Lorg/apache/thrift/TBase;)V
  at com.facebook.fb303.FacebookService$Client.send_shutdown(FacebookService.java:436)
  at com.facebook.fb303.FacebookService$Client.shutdown(FacebookService.java:430)
  at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.close(HiveMetaStoreClient.java:492)
  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  at java.lang.reflect.Method.invoke(Method.java:497)
  at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:156)
  at com.sun.proxy.$Proxy23.close(Unknown Source)
  at org.apache.hadoop.hive.ql.metadata.Hive.close(Hive.java:291)
  at org.apache.hadoop.hive.ql.metadata.Hive.access$000(Hive.java:137)
  at org.apache.hadoop.hive.ql.metadata.Hive$1.remove(Hive.java:157)
  at org.apache.hadoop.hive.ql.metadata.Hive.closeCurrent(Hive.java:261)
  at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:231)
  at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:208)
  at org.apache.hadoop.hive.ql.session.SessionState.setAuthorizerV2Config(SessionState.java:765)
  at org.apache.hadoop.hive.ql.session.SessionState.setupAuth(SessionState.java:736)
  at org.apache.hadoop.hive.ql.session.SessionState.getAuthenticator(SessionState.java:1391)
  at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:211)
  at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
  at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
  at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
  at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
  at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:268)
  at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:362)
  at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:266)
  at org.apache.spark.sql.hive.HiveExternalCatalog.client$lzycompute(HiveExternalCatalog.scala:66)
  at org.apache.spark.sql.hive.HiveExternalCatalog.client(HiveExternalCatalog.scala:65)
  at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply$mcZ$sp(HiveExternalCatalog.scala:194)
  at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:194)
  at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:194)
  at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:97)
  ... 70 more
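
The root cause in the trace is a NoSuchMethodError on com.facebook.fb303.FacebookService$Client, which typically means the Zeppelin interpreter classpath resolves a different libfb303/libthrift jar than the Spark shell does. A minimal diagnostic sketch, assuming nothing beyond the class name shown in the trace, can be pasted into both a Zeppelin Spark paragraph and spark-shell to compare which jar actually supplies the class:

  // Hedged diagnostic: report the jar that provides the fb303 client class named in the error.
  val fb303Class = Class.forName("com.facebook.fb303.FacebookService$Client")
  println(Option(fb303Class.getProtectionDomain.getCodeSource)
    .map(_.getLocation)
    .getOrElse("loaded from bootstrap classpath"))

If the two environments print different jar locations, the mismatched jar is the likely source of the conflict.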

Please tell me what is missing here. SPARK_HOME in zeppelin-env.sh points to Spark2.
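As a quick sanity check, a short Scala snippet run in a Zeppelin paragraph can confirm which Spark build and SPARK_HOME the interpreter actually picked up; the expectation that it reports 2.2.x is an assumption based on the setup described above:

  // Hedged check: verify the Spark version and SPARK_HOME seen by the Zeppelin interpreter.
  println(spark.version)  // expected to print a 2.2.x version if SPARK_HOME points at Spark 2
  println(sys.env.getOrElse("SPARK_HOME", "SPARK_HOME not visible in interpreter environment"))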

0 Answers:

No answers