Where does SparkSession get AWS credentials from? SparkSession or HadoopConfiguration?

Asked: 2018-09-27 01:13:27

Tags: java apache-spark intellij-idea amazon-s3 apache-spark-sql

I am trying to read data from S3 into a Spark Dataset in Java code running in IntelliJ. I have added the AWS keys to the SparkSession configuration, as shown in the code below, but I still get the following error.

I don't see any equivalent of HadoopConfiguration that can be set on SparkSession in the Java API [https://spark.apache.org/docs/latest/api/java/index.html]. Please correct me if I'm wrong about this.

Caused by: com.amazonaws.AmazonClientException: No AWS Credentials provided by BasicAWSCredentialsProvider EnvironmentVariableCredentialsProvider InstanceProfileCredentialsProvider : com.amazonaws.SdkClientException: Unable to load credentials from service endpoint

Here is the code:

SparkSession spark = SparkSession
        .builder()
        .master("local")
        .config("spark.hadoop.fs.s3a.impl", "org.apache.hadoop.fs.s3a.S3AFileSystem")
        .config("spark.hadoop.mapreduce.fileoutputcommitter.algorithm.version", "2")
        .config("spark.hadoop.fs.s3a.awsAccessKeyId", AWS_KEY)
        .config("spark.hadoop.fs.s3a.awsSecretAccessKey", AWS_SECRET_KEY)
        .getOrCreate();

JavaSparkContext sc = new JavaSparkContext(spark.sparkContext());
//System.out.println(System.class.path);

Dataset<Row> dF = spark.read().load("s3a://bucket/abc.parquet");
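For reference, my understanding (which may be wrong) is that the Hadoop configuration can also be reached directly through the SparkContext, and that the property names the S3A documentation lists are `fs.s3a.access.key` / `fs.s3a.secret.key` rather than the `awsAccessKeyId` variants above. A sketch of that approach, using the same `AWS_KEY` / `AWS_SECRET_KEY` placeholders; I have not confirmed this fixes the error:

```java
import org.apache.hadoop.conf.Configuration;

// Sketch: set the S3A credentials on the Hadoop configuration directly,
// via the SparkContext that backs the SparkSession. The property names
// fs.s3a.access.key / fs.s3a.secret.key are the ones documented for the
// S3A connector; AWS_KEY and AWS_SECRET_KEY are placeholders from above.
Configuration hadoopConf = spark.sparkContext().hadoopConfiguration();
hadoopConf.set("fs.s3a.access.key", AWS_KEY);
hadoopConf.set("fs.s3a.secret.key", AWS_SECRET_KEY);
```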

Here is the pom.xml with all the Spark and related dependencies added. I'm not sure what else needs to be added now.

<dependencies>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.11</artifactId>
            <version>2.3.2</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.11</artifactId>
            <version>2.3.2</version>
        </dependency>
        <dependency>
            <groupId>com.amazonaws</groupId>
            <artifactId>aws-java-sdk</artifactId>
            <version>1.11.417</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-aws</artifactId>
            <version>3.1.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-hdfs</artifactId>
            <version>3.1.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-common</artifactId>
            <version>3.1.1</version>
        </dependency>
    </dependencies>
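One thing I suspect (but have not verified) is a version mismatch: the prebuilt Spark 2.3.2 artifacts are compiled against Hadoop 2.7.x, while the pom pulls in `hadoop-aws` / `hadoop-hdfs` / `hadoop-common` 3.1.1, and `hadoop-aws` is generally expected to match the Hadoop version on the classpath. A sketch of the aligned dependency, assuming Hadoop 2.7.3 (the version the Spark 2.3.2 binary distribution ships with):

```xml
<!-- Sketch, not verified: align hadoop-aws with the Hadoop 2.7.x
     that Spark 2.3.2 is built against, instead of 3.1.1. -->
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-aws</artifactId>
    <version>2.7.3</version>
</dependency>
```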

0 Answers:

No answers yet.