I am running my Spark job with the following spark-submit script:
export SPARK_HOME=/local/apps/analytics/spark-2.4.1-bin-hadoop2.7
$SPARK_HOME/bin/spark-submit \
--master yarn \
--deploy-mode client \
When I run it in cluster mode instead, i.e. with
--deploy-mode cluster \
I get the following error:
INFO yarn.Client:
client token: N/A
diagnostics: User class threw exception: org.apache.hadoop.security.AccessControlException: Permission denied: user=analytics, access=WRITE, inode="/tmp/hadoop-egsadmin/nm-local-dir/usercache"
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:350)
How can I fix this? What is going wrong?