import org.apache.spark.sql.SparkSession

val spark = SparkSession
  .builder()
  .appName("try1")
  .master("local")
  .getOrCreate()

import spark.implicits._  // required for the $"colName" column syntax

val df = spark.read
  .json("s3n://BUCKET-NAME/FOLDER/FILE.json")
df.select($"uid").show(5)
I have set AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY as environment variables. When I try to read from S3, I get the following error:
Exception in thread "main" org.apache.hadoop.fs.s3.S3Exception: org.jets3t.service.S3ServiceException: S3 HEAD request failed for '/FOLDER%2FFILE.json' - ResponseCode=400, ResponseMessage=Bad Request
I suspect the error is caused by some internal function converting "/" to "%2F", since the error message shows '/FOLDER%2FFILE.json' rather than '/FOLDER/FILE.json'.
Answer 0 (score: 1)
Your Spark (JVM) application cannot read the environment variables unless you tell it about them, so here is a quick fix:
// Hand the credentials to the Hadoop s3n connector explicitly.
spark.sparkContext
  .hadoopConfiguration.set("fs.s3n.awsAccessKeyId", awsAccessKeyId)
spark.sparkContext
  .hadoopConfiguration.set("fs.s3n.awsSecretAccessKey", awsSecretAccessKey)
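The awsAccessKeyId and awsSecretAccessKey values referenced above can be read from the environment variables you already exported. A minimal sketch, assuming the variable names from the question (sys.env is Scala's read-only view of the process environment):

// Pull the keys from the environment on the driver, then pass them to the
// fs.s3n.* properties shown above. Variable names are those from the question.
val awsAccessKeyId = sys.env("AWS_ACCESS_KEY_ID")
val awsSecretAccessKey = sys.env("AWS_SECRET_ACCESS_KEY")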
You also need to set the S3 endpoint:
spark.sparkContext
  .hadoopConfiguration.set("fs.s3a.endpoint", "<<ENDPOINT>>")  // note: fs.s3a.* properties apply to the s3a:// scheme
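Since fs.s3a.endpoint configures the s3a connector, the path scheme should match it. A minimal end-to-end sketch, assuming the hadoop-aws (s3a) connector is on the classpath and reusing the bucket and endpoint placeholders from the question:

// Sketch: configure the s3a connector, then read via the matching s3a:// scheme.
// fs.s3a.access.key / fs.s3a.secret.key are the s3a equivalents of the
// fs.s3n.* properties above; <<ENDPOINT>> is the same placeholder as before.
val conf = spark.sparkContext.hadoopConfiguration
conf.set("fs.s3a.access.key", sys.env("AWS_ACCESS_KEY_ID"))
conf.set("fs.s3a.secret.key", sys.env("AWS_SECRET_ACCESS_KEY"))
conf.set("fs.s3a.endpoint", "<<ENDPOINT>>")

spark.read
  .json("s3a://BUCKET-NAME/FOLDER/FILE.json")
  .select($"uid")
  .show(5)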
To learn more about what an AWS S3 endpoint is, see the following documentation: