Presto Hive connector: S3 Access Denied

Date: 2019-05-09 23:53:42

Tags: hive, presto

I have some Hive external tables on my S3 buckets. To access these external tables from Presto, I created one Hive catalog per S3 bucket under ${PRESTO_HOME}/etc/catalog, because the buckets use different aws_access_key and aws_secret_key values.

For S3 bucket A:

hive_A.properties

connector.name=hive-hadoop2
hive.s3.use-instance-credentials=false
hive.metastore.uri=thrift://namenode:9083
hive.s3.aws-access-key=xxxxxxxxx1
hive.s3.aws-secret-key=xxxxxxxxx1
hive.non-managed-table-writes-enabled=true
hive.allow-drop-table=true

For S3 bucket B:

hive_B.properties

connector.name=hive-hadoop2
hive.s3.use-instance-credentials=false
hive.metastore.uri=thrift://namenode:9083
hive.s3.aws-access-key=yyyyyyyyy2
hive.s3.aws-secret-key=yyyyyyyyy2
hive.non-managed-table-writes-enabled=true
hive.allow-drop-table=true

After starting Presto, I sent a query through the hive_B catalog to an external table whose data lives in S3 bucket A. As expected, Presto gave me an Access Denied error like this:

com.facebook.presto.spi.PrestoException: Access Denied (Service: Amazon S3; Status Code: 403; Error Code: AccessDenied; Request ID: FB04E851EDD5E8E3; S3 Extended Request ID: HbIz8TKv7rZ11fVOA+9Hw/ikuRY6635I/fZ1tieCiYs9EWa56kLUZhOGacpvFYfZzWzIG09g2XQ=)
    at com.facebook.presto.hive.BackgroundHiveSplitLoader$HiveSplitLoaderTask.process(BackgroundHiveSplitLoader.java:194)
    at com.facebook.presto.hive.util.ResumableTasks.safeProcessTask(ResumableTasks.java:47)
    at com.facebook.presto.hive.util.ResumableTasks.access$000(ResumableTasks.java:20)
    at com.facebook.presto.hive.util.ResumableTasks$1.run(ResumableTasks.java:35)
    at io.airlift.concurrent.BoundedExecutor.drainQueue(BoundedExecutor.java:78)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
Caused by: com.amazonaws.services.s3.model.AmazonS3Exception: Access Denied (Service: Amazon S3; Status Code: 403; Error Code: AccessDenied; Request ID: FB04E851EDD5E8E3; S3 Extended Request ID: HbIz8TKv7rZ11fVOA+9Hw/ikuRY6635I/fZ1tieCiYs9EWa56kLUZhOGacpvFYfZzWzIG09g2XQ=)
    at com.amazonaws.http.AmazonHttpClient$RequestExecutor.handleErrorResponse(AmazonHttpClient.java:1695)
    at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeOneRequest(AmazonHttpClient.java:1350)
    at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeHelper(AmazonHttpClient.java:1101)
    at com.amazonaws.http.AmazonHttpClient$RequestExecutor.doExecute(AmazonHttpClient.java:758)
    at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeWithTimer(AmazonHttpClient.java:732)
    at com.amazonaws.http.AmazonHttpClient$RequestExecutor.execute(AmazonHttpClient.java:714)
    at com.amazonaws.http.AmazonHttpClient$RequestExecutor.access$500(AmazonHttpClient.java:674)
    at com.amazonaws.http.AmazonHttpClient$RequestExecutionBuilderImpl.execute(AmazonHttpClient.java:656)
    at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:520)
    at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:4443)
    at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:4390)
    at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:4384)
    at com.amazonaws.services.s3.AmazonS3Client.listObjects(AmazonS3Client.java:844)
    at com.facebook.presto.hive.s3.PrestoS3FileSystem.listPrefix(PrestoS3FileSystem.java:490)
    at com.facebook.presto.hive.s3.PrestoS3FileSystem.access$000(PrestoS3FileSystem.java:146)
    at com.facebook.presto.hive.s3.PrestoS3FileSystem$1.<init>(PrestoS3FileSystem.java:278)
    at com.facebook.presto.hive.s3.PrestoS3FileSystem.listLocatedStatus(PrestoS3FileSystem.java:276)
    at org.apache.hadoop.fs.FilterFileSystem.listLocatedStatus(FilterFileSystem.java:263)
    at com.facebook.presto.hive.HadoopDirectoryLister.list(HadoopDirectoryLister.java:30)
    at com.facebook.presto.hive.util.HiveFileIterator$FileStatusIterator.<init>(HiveFileIterator.java:130)
    at com.facebook.presto.hive.util.HiveFileIterator$FileStatusIterator.<init>(HiveFileIterator.java:118)
    at com.facebook.presto.hive.util.HiveFileIterator.getLocatedFileStatusRemoteIterator(HiveFileIterator.java:107)
    at com.facebook.presto.hive.util.HiveFileIterator.computeNext(HiveFileIterator.java:100)
    at com.facebook.presto.hive.util.HiveFileIterator.computeNext(HiveFileIterator.java:37)
    at com.google.common.collect.AbstractIterator.tryToComputeNext(AbstractIterator.java:141)
    at com.google.common.collect.AbstractIterator.hasNext(AbstractIterator.java:136)
    at java.util.Spliterators$IteratorSpliterator.tryAdvance(Spliterators.java:1811)
    at java.util.stream.StreamSpliterators$WrappingSpliterator.lambda$initPartialTraversalState$0(StreamSpliterators.java:294)
    at java.util.stream.StreamSpliterators$AbstractWrappingSpliterator.fillBuffer(StreamSpliterators.java:206)
    at java.util.stream.StreamSpliterators$AbstractWrappingSpliterator.doAdvance(StreamSpliterators.java:161)
    at java.util.stream.StreamSpliterators$WrappingSpliterator.tryAdvance(StreamSpliterators.java:300)
    at java.util.Spliterators$1Adapter.hasNext(Spliterators.java:681)
    at com.facebook.presto.hive.BackgroundHiveSplitLoader.loadSplits(BackgroundHiveSplitLoader.java:261)
    at com.facebook.presto.hive.BackgroundHiveSplitLoader.access$300(BackgroundHiveSplitLoader.java:93)
    at com.facebook.presto.hive.BackgroundHiveSplitLoader$HiveSplitLoaderTask.process(BackgroundHiveSplitLoader.java:187)
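
For reference, the query I sent through the wrong catalog was roughly like the sketch below; the schema and table names are placeholders, not my real ones, and the catalog name comes from hive_B.properties:

SELECT *
FROM hive_B.default.my_external_table_on_bucket_a
LIMIT 10;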

So I tried the query again through the correct hive_A catalog, but Presto kept giving me the same Access Denied error. However, if I restart Presto and send a query through the correct hive_A catalog first, before sending the wrong hive_B one, the results come back with no problem.
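
The query through the correct catalog is essentially the same statement addressed to hive_A instead (again with placeholder schema and table names):

SELECT *
FROM hive_A.default.my_external_table_on_bucket_a
LIMIT 10;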

Does anyone know how to recover from the S3 Access Denied error caused by querying through the wrong Hive catalog, without restarting Presto and having to run a query through the correct Hive catalog before the wrong one?

I have already tried adding some properties to the Hive catalog files, such as:
hive.metastore-cache-ttl=10s
hive.metastore-refresh-interval=10s

But it did not help.

0 Answers:

No answers yet.