Local apache-spark connecting to AWS S3 with v4 authentication

Time: 2017-11-02 12:00:43

Tags: python-3.x apache-spark amazon-s3

My task is to process satellite imagery with Spark. The images are stored on S3 in eu-central-1 (central Europe). To access these files I need an S3 client compatible with the v4 authentication API.

My pyspark code runs against US buckets without any problem, but it fails on EU buckets because they use v4 authentication. I followed this tutorial and applied this troubleshooting guide, but I still get this error:

$ pyspark --properties-file s3.properties
...
>>> image_rdd = sc.binaryFiles('s3a://sentinel-s2-l1c/tiles/31/U/FT/2017/10/15/0/preview.jp2')
...
py4j.protocol.Py4JJavaError: An error occurred while calling o19.binaryFiles.
: com.amazonaws.services.s3.model.AmazonS3Exception: Status Code: 400, AWS Service: Amazon S3, AWS Request ID: BD9957E5F3960247, AWS Error Code: null, AWS Error Message: Bad Request, S3 Extended Request ID: oCgfA+foevj6CEFWO0F22H+AVbqr4F0hr7c4M7OlILxOSb0ZZ25FqHhZnzgyLxRPMuiyeOdjnSM=
at com.amazonaws.http.AmazonHttpClient.handleErrorResponse(AmazonHttpClient.java:798)
at com.amazonaws.http.AmazonHttpClient.executeHelper(AmazonHttpClient.java:421)
at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:232)
at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:3528)
at com.amazonaws.services.s3.AmazonS3Client.headBucket(AmazonS3Client.java:1031)
at com.amazonaws.services.s3.AmazonS3Client.doesBucketExist(AmazonS3Client.java:994)

My s3.properties file:

spark.hadoop.fs.s3a.impl        org.apache.hadoop.fs.s3a.S3AFileSystem
spark.driver.extraClassPath     hadoop-aws-2.7.3.jar:aws-java-sdk-1.7.4.jar
spark.hadoop.fs.s3a.endpoint    s3.eu-central-1.amazonaws.com
spark.hadoop.fs.s3a.access.key  [access_key]
spark.hadoop.fs.s3a.secret.key  [secret_key]
spark.hadoop.fs.s3a.impl.disable.cache true
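
For comparison, the same s3a settings can also be supplied programmatically on a SparkConf instead of via a properties file. The sketch below is untested and assumes hadoop-aws-2.7.3 with aws-java-sdk-1.7.4 on the classpath; with that SDK generation, V4 signing against non-default regions is commonly enabled through the JVM system property com.amazonaws.services.s3.enableV4=true on both the driver and the executors.

# Sketch: the s3.properties settings expressed as a SparkConf, plus the
# enableV4 system property (an assumption for this SDK generation, not
# something confirmed by the post above).
from pyspark import SparkConf
from pyspark.sql import SparkSession

conf = (
    SparkConf()
    .set("spark.hadoop.fs.s3a.impl", "org.apache.hadoop.fs.s3a.S3AFileSystem")
    # Frankfurt buckets only accept V4-signed requests, so point s3a at the
    # regional endpoint rather than the global one.
    .set("spark.hadoop.fs.s3a.endpoint", "s3.eu-central-1.amazonaws.com")
    .set("spark.hadoop.fs.s3a.access.key", "[access_key]")
    .set("spark.hadoop.fs.s3a.secret.key", "[secret_key]")
    .set("spark.hadoop.fs.s3a.impl.disable.cache", "true")
    # With aws-java-sdk 1.7.4 the V4 signer typically has to be switched on
    # via a JVM system property on the driver and on every executor.
    .set("spark.driver.extraJavaOptions",
         "-Dcom.amazonaws.services.s3.enableV4=true")
    .set("spark.executor.extraJavaOptions",
         "-Dcom.amazonaws.services.s3.enableV4=true")
)

spark = SparkSession.builder.config(conf=conf).getOrCreate()
sc = spark.sparkContext
image_rdd = sc.binaryFiles(
    "s3a://sentinel-s2-l1c/tiles/31/U/FT/2017/10/15/0/preview.jp2")

One caveat: spark.driver.extraJavaOptions set from inside an already-running application has no effect on the driver JVM in client mode, so the driver-side flag generally belongs in spark-defaults.conf, in a --properties-file such as the one above, or on the spark-submit command line.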

My setup:

  • macOS
  • local apache-spark with Scala version 2.11.8, Java HotSpot(TM) 64-Bit Server VM, 1.8.0_144
  • Python 3.6.3

Question: What is the correct way to configure pyspark to connect to S3 with v4 authentication? Where should I put the configuration parameters?

0 Answers:

No answers yet.