I have a bucket on AWS S3 that enforces KMS encryption on all objects. I am running Presto on emr-5.2.1.
I have external tables on S3 (with no data). When I run
INSERT INTO hive.s3.new_table
SELECT * FROM src_table
I get an AccessDenied error. I have tested several different options and reached out to support, but with no luck. If I remove the policy from the bucket, Presto works fine, but the files it creates on S3 are not encrypted.
Presto has no problem reading encrypted external S3 tables or creating them locally on HDFS. I cannot allow unencrypted data.
Example policy:
{
    "Version": "2012-10-17",
    "Id": "PutObjPolicy",
    "Statement": [
        {
            "Sid": "DenyUnEncryptedObjectUploads",
            "Effect": "Deny",
            "Principal": "*",
            "Action": "s3:PutObject",
            "Resource": "arn:aws:s3:::YourBucket/*",
            "Condition": {
                "StringNotEquals": {
                    "s3:x-amz-server-side-encryption": "aws:kms"
                }
            }
        }
    ]
}
http://docs.aws.amazon.com/AmazonS3/latest/dev/UsingKMSEncryption.html
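For what it's worth, this is roughly what a compliant PUT looks like with the AWS SDK for Java (the same SDK the stack trace below goes through); this is only a sketch, assuming SDK 1.11.x, and the bucket, key, file path and key id are placeholders. Without the SSE-KMS parameters the request carries no x-amz-server-side-encryption header, which is exactly what the Deny statement rejects:

import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.model.PutObjectRequest;
import com.amazonaws.services.s3.model.SSEAwsKeyManagementParams;

import java.io.File;

public class SseKmsPutExample {
    public static void main(String[] args) {
        AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();

        // Attaching SSEAwsKeyManagementParams makes the SDK send the
        // x-amz-server-side-encryption: aws:kms header (plus the KMS key id),
        // so the bucket policy above accepts the upload. Leaving it off is
        // what produces the 403 AccessDenied seen in the stack trace.
        PutObjectRequest request = new PutObjectRequest(
                "YourBucket", "tmp/example.orc", new File("/tmp/example.orc"))
                .withSSEAwsKeyManagementParams(
                        new SSEAwsKeyManagementParams("long_key_id_here"));

        s3.putObject(request);
    }
}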
Presto config /etc/presto/conf/catalog/hive.properties
hive.s3.ssl.enabled=true
hive.s3.use-instance-credentials=true
hive.s3.sse.enabled=true
hive.s3.kms-key-id=long_key_id_here
...
Error:
com.facebook.presto.spi.PrestoException: Error committing write to Hive
at com.facebook.presto.hive.HiveRecordWriter.commit(HiveRecordWriter.java:132)
at com.facebook.presto.hive.HiveWriter.commit(HiveWriter.java:49)
at com.facebook.presto.hive.HivePageSink.doFinish(HivePageSink.java:152)
at com.facebook.presto.hive.authentication.NoHdfsAuthentication.doAs(NoHdfsAuthentication.java:23)
at com.facebook.presto.hive.HdfsEnvironment.doAs(HdfsEnvironment.java:76)
at com.facebook.presto.hive.HivePageSink.finish(HivePageSink.java:144)
at com.facebook.presto.spi.classloader.ClassLoaderSafeConnectorPageSink.finish(ClassLoaderSafeConnectorPageSink.java:49)
at com.facebook.presto.operator.TableWriterOperator.finish(TableWriterOperator.java:156)
at com.facebook.presto.operator.Driver.processInternal(Driver.java:394)
at com.facebook.presto.operator.Driver.processFor(Driver.java:301)
at com.facebook.presto.execution.SqlTaskExecution$DriverSplitRunner.processFor(SqlTaskExecution.java:622)
at com.facebook.presto.execution.TaskExecutor$PrioritizedSplitRunner.process(TaskExecutor.java:534)
at com.facebook.presto.execution.TaskExecutor$Runner.run(TaskExecutor.java:670)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.IOException: com.amazonaws.services.s3.model.AmazonS3Exception: Access Denied (Service: Amazon S3; Status Code: 403; Error Code: AccessDenied; Request ID: xxxxxx), S3 Extended Request ID: xxxxxxxxxxxxxx+xxx=
at com.facebook.presto.hive.PrestoS3FileSystem$PrestoS3OutputStream.uploadObject(PrestoS3FileSystem.java:1003)
at com.facebook.presto.hive.PrestoS3FileSystem$PrestoS3OutputStream.close(PrestoS3FileSystem.java:967)
at org.apache.hadoop.fs.FSDataOutputStream$PositionCache.close(FSDataOutputStream.java:74)
at org.apache.hadoop.fs.FSDataOutputStream.close(FSDataOutputStream.java:108)
at org.apache.hadoop.hive.ql.io.orc.WriterImpl.close(WriterImpl.java:2429)
at org.apache.hadoop.hive.ql.io.orc.OrcOutputFormat$OrcRecordWriter.close(OrcOutputFormat.java:106)
at com.facebook.presto.hive.HiveRecordWriter.commit(HiveRecordWriter.java:129)
... 15 more
Caused by: com.amazonaws.services.s3.model.AmazonS3Exception: Access Denied (Service: Amazon S3; Status Code: 403; Error Code: AccessDenied; Request ID: xxxxxxx)
at com.amazonaws.http.AmazonHttpClient.handleErrorResponse(AmazonHttpClient.java:1387)
at com.amazonaws.http.AmazonHttpClient.executeOneRequest(AmazonHttpClient.java:940)
at com.amazonaws.http.AmazonHttpClient.executeHelper(AmazonHttpClient.java:715)
at com.amazonaws.http.AmazonHttpClient.doExecute(AmazonHttpClient.java:466)
at com.amazonaws.http.AmazonHttpClient.executeWithTimer(AmazonHttpClient.java:427)
at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:376)
at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:4039)
at com.amazonaws.services.s3.AmazonS3Client.putObject(AmazonS3Client.java:1583)
at com.amazonaws.services.s3.AmazonS3EncryptionClient.access$101(AmazonS3EncryptionClient.java:80)
at com.amazonaws.services.s3.AmazonS3EncryptionClient$S3DirectImpl.putObject(AmazonS3EncryptionClient.java:603)
at com.amazonaws.services.s3.internal.crypto.S3CryptoModuleBase.putObjectUsingMetadata(S3CryptoModuleBase.java:175)
at com.amazonaws.services.s3.internal.crypto.S3CryptoModuleBase.putObjectSecurely(S3CryptoModuleBase.java:161)
at com.amazonaws.services.s3.internal.crypto.CryptoModuleDispatcher.putObjectSecurely(CryptoModuleDispatcher.java:108)
at com.amazonaws.services.s3.AmazonS3EncryptionClient.putObject(AmazonS3EncryptionClient.java:483)
at com.amazonaws.services.s3.transfer.internal.UploadCallable.uploadInOneChunk(UploadCallable.java:131)
at com.amazonaws.services.s3.transfer.internal.UploadCallable.call(UploadCallable.java:123)
at com.amazonaws.services.s3.transfer.internal.UploadMonitor.call(UploadMonitor.java:139)
at com.amazonaws.services.s3.transfer.internal.UploadMonitor.call(UploadMonitor.java:47)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
... 3 more
Am I missing something in the configuration, or does Presto simply not use KMS when inserting into the table?
According to Amazon: "All GET and PUT requests for an object protected by AWS KMS will fail if they are not made via SSL or by using SigV4."
Answer 0 (score 0):
Presto now supports SSE-KMS via the hive.s3.sse.kms-key-id Hive connector configuration property.
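On a Presto release that has this property, the relevant hive.properties entries would look roughly like the sketch below (hive.s3.sse.type=KMS selects KMS-managed server-side encryption; the key id is a placeholder, so verify the exact property names against the Hive connector docs for your release):

hive.s3.sse.enabled=true
hive.s3.sse.type=KMS
hive.s3.sse.kms-key-id=long_key_id_here

Note that hive.s3.kms-key-id, which is what the question's config sets, configures S3 client-side encryption with KMS rather than SSE-KMS; that would be consistent with the AmazonS3EncryptionClient frames in the stack trace and with the upload carrying no x-amz-server-side-encryption header for the bucket policy to accept.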