Spark HiveContext hql problem when running a Spark job on YARN

Date: 2015-02-04 16:23:17

Tags: java hive apache-spark yarn

We have a Spark application that we run with spark-submit on YARN. From Java we call:

sparkHiveContext.hql("show databases")
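
For reference, the context is created in the usual Spark 1.x way. The block below is only a minimal sketch of that setup, not our actual code; the class name and application name are made up:

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.hive.api.java.JavaHiveContext;

public class ShowDatabases {
    public static void main(String[] args) {
        // Master and deploy mode come from the spark-submit command, not from the code.
        SparkConf conf = new SparkConf().setAppName("hive-context-example");
        JavaSparkContext sc = new JavaSparkContext(conf);

        // The HiveContext connects to the Hive metastore; this is where the
        // DataNucleus classes seen in the stack trace below get involved.
        JavaHiveContext sparkHiveContext = new JavaHiveContext(sc);
        sparkHiveContext.hql("show databases").collect();
    }
}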

This call fails with the following exception:

ClassLoaderResolver for class "" gave error on creation : {1} org.datanucleus.exceptions.NucleusUserException: ClassLoaderResolver for class "" gave error on creation : {1}
at org.datanucleus.NucleusContext.getClassLoaderResolver(NucleusContext.java:1087)
at org.datanucleus.PersistenceConfiguration.validatePropertyValue(PersistenceConfiguration.java:797)
at org.datanucleus.PersistenceConfiguration.setProperty(PersistenceConfiguration.java:714)
at org.datanucleus.PersistenceConfiguration.setPersistenceProperties(PersistenceConfiguration.java:693)
at org.datanucleus.NucleusContext.<init>(NucleusContext.java:273)
at org.datanucleus.NucleusContext.<init>(NucleusContext.java:247)
at org.datanucleus.NucleusContext.<init>(NucleusContext.java:225)

Further down the stack trace:

Caused by: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1412)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:62)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:72)
at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2453)
at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2465)
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:340)
... 27 more
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:408)
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1410)
... 32 more
Caused by: javax.jdo.JDOFatalInternalException: Unexpected exception caught.

However, the same query works when run from the spark-sql console. What is going wrong here?

1 Answer:

Answer 0 (score: 2):

The reason is that when you package your Spark application as a fat/uber jar, the DataNucleus libraries stop working: each DataNucleus dependency jar contains a pom.xml at the same path, so when the jars are merged into a single jar the copies override one another. You need to add the DataNucleus jars to the classpath separately instead of shading them into your application jar, for example as sketched below.
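
One common way to do that with spark-submit on YARN is to pass the DataNucleus jars that ship with the Spark distribution via --jars, and the Hive configuration via --files, rather than bundling them into the assembly. The jar versions, paths, class name, and application jar below are placeholders for your own environment, not values taken from the question:

spark-submit \
  --class com.example.ShowDatabases \
  --master yarn-cluster \
  --jars $SPARK_HOME/lib/datanucleus-api-jdo-3.2.6.jar,$SPARK_HOME/lib/datanucleus-core-3.2.10.jar,$SPARK_HOME/lib/datanucleus-rdbms-3.2.9.jar \
  --files /etc/hive/conf/hive-site.xml \
  my-spark-app-assembly.jar

On the build side, the matching step is to exclude the org.datanucleus artifacts from your shade/assembly configuration so they are never merged into the uber jar in the first place.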