I just upgraded Spark from 2.1.1 to 2.3.0, and when running a Spark query in YARN client mode I get the following error. Can someone please help?

The error:
Failed during initialize_connection: java.lang.RuntimeException: java.lang.RuntimeException:
java.lang.ClassNotFoundException:
Class org.apache.hadoop.yarn.client.RequestHedgingRMFailoverProxyProvider not found
20/05/07 19:52:41 ERROR sparklyr:
Backend (56815) failed calling getOrCreate on 14:
java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ClassNotFoundException:
Class org.apache.hadoop.yarn.client.RequestHedgingRMFailoverProxyProvider not found
Answer 0 (score: 0)

Add the hadoop-yarn-client jar to your classpath, using the same version as your other Hadoop jars in pom.xml:
<!-- https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-yarn-client -->
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-yarn-client</artifactId>
    <version>${hadoop.version}</version>
</dependency>
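The `${hadoop.version}` property is assumed to be defined in your pom's `<properties>` section so that all Hadoop artifacts resolve to the same version; a minimal sketch (the version number here is an example, use your cluster's Hadoop version):

```xml
<properties>
    <!-- Example only: set this to the Hadoop version your cluster runs -->
    <hadoop.version>2.7.3</hadoop.version>
</properties>
```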
Alternatively, download the hadoop-yarn-client jar matching your Hadoop version and place it in Spark's jars folder (lib in older Spark releases).
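As a minimal sketch of that manual step, assuming `HADOOP_HOME` and `SPARK_HOME` point at your installations (both paths are assumptions, adjust to your layout):

```shell
# Copy the YARN client jar that ships with your Hadoop distribution
# into Spark's jars directory so it is on the classpath at startup.
# $HADOOP_HOME and $SPARK_HOME are assumed environment variables.
cp "$HADOOP_HOME"/share/hadoop/yarn/hadoop-yarn-client-*.jar "$SPARK_HOME"/jars/
```

Restart your Spark session (or `spark_connect()` from sparklyr) after copying so the new jar is picked up.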