java.lang.IllegalAccessError when reading AWS S3 configuration from Java

Posted: 2018-04-19 05:36:06

Tags: java hadoop amazon-s3

I ran into the following error when trying to access the configuration from Java:

Exception in thread "main" java.lang.IllegalAccessError: tried to access method org.apache.hadoop.metrics2.lib.MutableCounterLong.<init>(Lorg/apache/hadoop/metrics2/MetricsInfo;J)V from class org.apache.hadoop.fs.s3a.S3AInstrumentation
    at org.apache.hadoop.fs.s3a.S3AInstrumentation.streamCounter(S3AInstrumentation.java:164)
    at org.apache.hadoop.fs.s3a.S3AInstrumentation.streamCounter(S3AInstrumentation.java:186)
    at org.apache.hadoop.fs.s3a.S3AInstrumentation.<init>(S3AInstrumentation.java:113)
    at org.apache.hadoop.fs.s3a.S3AFileSystem.initialize(S3AFileSystem.java:199)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2596)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:91)
    at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2630)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2612)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:370)
    at MyProgram.GetHiveTableData(MyProgram.java:710)
    at MyProgram$1.run(MyProgram.java:674)
    at MyProgram$1.run(MyProgram.java:670)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
    at MyProgram.GetHiveTableDetails(MyProgram.java:670)
    at MyProgram.main(MyProgram.java:398)

The offending line of code is:

FileSystem hdfs = FileSystem.get(new URI(uriStr), configuration);

where uriStr = s3a://sBucketName

The S3A configuration is set as follows:
fs.default.name=fs.defaultFS
fs.defaultFS=s3a://bucketName
sPath: XXXXXX
fs.s3a.impl=org.apache.hadoop.fs.s3a.S3AFileSystem
fs.s3a.access.key=XXXXXX
fs.s3a.secret.key=XXXXXXX
fs.s3a.endpoint=XXXXXXX
hadoop.rpc.protection=privacy
dfs.data.transfer.protection=privacy
hadoop.security.authentication=Kerberos
dfs.namenode.kerberos.principal=hdfs/XXXX@XXXX.XXX.XXXXXX.XXX
yarn.resourcemanager.principal=yarn/XXXX@XXXX.XXX.XXXXXX.XXX
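For reference, the same settings can also be applied programmatically on the `Configuration` object before calling `FileSystem.get`. This is only a sketch reconstructed from the key/value list above; the XXXXXX placeholders and `sBucketName` are stand-ins from the question, not real values, and it requires `hadoop-common` plus a matching `hadoop-aws`/`aws-java-sdk` pair on the classpath:

```java
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;

public class S3AConfigSketch {
    public static void main(String[] args) throws Exception {
        String sBucketName = "bucketName"; // placeholder bucket name

        Configuration configuration = new Configuration();
        configuration.set("fs.defaultFS", "s3a://" + sBucketName);
        configuration.set("fs.s3a.impl", "org.apache.hadoop.fs.s3a.S3AFileSystem");
        configuration.set("fs.s3a.access.key", "XXXXXX");  // placeholder
        configuration.set("fs.s3a.secret.key", "XXXXXXX"); // placeholder
        configuration.set("fs.s3a.endpoint", "XXXXXXX");   // placeholder

        // Same call as in the question; S3AInstrumentation is initialized
        // here, which is where the IllegalAccessError surfaces when the
        // hadoop-aws and hadoop-common jar versions do not match.
        FileSystem hdfs = FileSystem.get(new URI("s3a://" + sBucketName), configuration);
        System.out.println(hdfs.getUri());
    }
}
```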

Am I missing something in the configuration settings? Please advise.

1 Answer:

Answer 0 (score: 0)

This problem can occur when the aws-sdk and hadoop versions are incompatible. You may find more help in Spark job reading from S3 on Spark cluster gives IllegalAccessError: tried to access method MutableCounterLong and java.lang.NoClassDefFoundError: org/apache/hadoop/fs/StorageStatistics.

The problem was resolved when I rolled the hadoop-aws version back from 2.8.0 to 2.7.3:

spark-submit --master local \
--packages org.apache.hadoop:hadoop-aws:2.7.3,\
com.amazonaws:aws-java-sdk-pom:1.10.6,\
org.apache.hadoop:hadoop-common:2.7.3 \
test_s3.py

Based on the discussion here https://stackoverflow.com/a/52828978/8025086, it seems that aws-java-sdk 1.7.4 is the appropriate version to use. I tested this simple example with pyspark and it also works. I am not a Java expert, so perhaps someone can provide a better explanation.

# this one also works, notice that the version of aws-java-sdk is different
spark-submit --master local \
--packages org.apache.hadoop:hadoop-aws:2.7.3,\
com.amazonaws:aws-java-sdk:1.7.4,\
org.apache.hadoop:hadoop-common:2.7.3 \
test_s3.py
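As a quick sanity check (a diagnostic sketch, not part of the original answer), you can print which Hadoop version actually ends up on the driver classpath at runtime, using the standard `org.apache.hadoop.util.VersionInfo` utility. It requires `hadoop-common` on the classpath:

```java
import org.apache.hadoop.util.VersionInfo;

public class HadoopVersionCheck {
    public static void main(String[] args) {
        // Confirms the Hadoop version loaded at runtime matches the
        // hadoop-aws version pinned with --packages (2.7.3 above).
        System.out.println("Hadoop version: " + VersionInfo.getVersion());

        // The jar location can reveal which copy of hadoop-common was
        // actually picked up when several are present on the classpath.
        System.out.println("Loaded from: " + VersionInfo.class
                .getProtectionDomain().getCodeSource().getLocation());
    }
}
```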