AmazonS3ClientBuilder in Qubole Hadoop

Date: 2018-07-12 21:30:18

Tags: java hadoop mapreduce qubole

Consider the following:

import java.io.IOException;

import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

import com.amazonaws.auth.AWSCredentials;
import com.amazonaws.auth.AWSStaticCredentialsProvider;
import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.regions.Regions;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;

public class myMapper extends Mapper<Object, Text, Text, Text> {
    static {
        try {
            AWSCredentials credentials = new BasicAWSCredentials(
                "my_access_key",
                "my_secret_key"
            );

            AmazonS3 s3client = AmazonS3ClientBuilder
                    .standard()
                    .withCredentials(new AWSStaticCredentialsProvider(credentials))
                    .withRegion(Regions.US_EAST_1)
                    .build();
        } catch(Exception e) {
            System.out.println("catch" + e);
        }
    }

    public void map(Object key, Text value, Context context
    ) throws IOException, InterruptedException { 

    }
}

For some reason, this statement

        AmazonS3 s3client = AmazonS3ClientBuilder
                .standard()
                .withCredentials(new AWSStaticCredentialsProvider(credentials))
                .withRegion(Regions.US_EAST_1)
                .build();

is causing Hadoop to fail and print:

Container id: container_id
Exit code: 1
Stack trace: ExitCodeException exitCode=1:
at org.apache.hadoop.util.Shell.runCommand(Shell.java:554)
at org.apache.hadoop.util.Shell.run(Shell.java:469)
at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:741)
at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:211)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:310)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:82)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
Container exited with a non-zero exit code 1
Failing this attempt. Failing the application.

The code above runs fine locally. I have no idea why it fails on the cluster, but what I'm trying to accomplish is to fetch a static map from S3 each time a mapper is initialized.
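One plausible failure mode (an assumption on my part, since YARN reports nothing beyond exit code 1): if the AWS SDK jar is not on the task's runtime classpath, the first reference to an SDK class inside the static block fails with a `NoClassDefFoundError`. That is an `Error`, not an `Exception`, so it slips past the `catch (Exception e)` handler, aborts class initialization, and kills the JVM. The mechanism can be reproduced without Hadoop or the AWS SDK:

```java
// Sketch: an Error thrown inside a static initializer is NOT caught by
// `catch (Exception e)`, so the class fails to load and the error
// propagates to whoever triggered class loading.
class LoadsLikeTheMapper {
    static {
        try {
            // Stand-in for what `new BasicAWSCredentials(...)` does when
            // the AWS SDK jar is missing at runtime (hypothetical class name).
            throw new NoClassDefFoundError("com/amazonaws/auth/BasicAWSCredentials");
        } catch (Exception e) {
            // Never reached: NoClassDefFoundError is an Error, not an Exception.
            System.out.println("catch" + e);
        }
    }
}

public class StaticInitDemo {
    /** Triggers class loading and reports which error escaped, if any. */
    static String tryLoad() {
        try {
            new LoadsLikeTheMapper();
            return "loaded";
        } catch (Throwable t) {
            return t.getClass().getSimpleName();
        }
    }

    public static void main(String[] args) {
        System.out.println(tryLoad());
    }
}
```

In a YARN container there is no outer handler at all, so the task JVM simply exits non-zero and the NodeManager reports only the generic "exit code 1" seen above; the real error, if this guess is right, should be in the container's stderr log.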

Is there a way to download a static map from S3 in each mapper when running Hadoop on Qubole?
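For per-mapper initialization, a common pattern (a sketch, not Qubole-specific) is to build the S3 client and load the map in the Mapper's `setup()` method rather than a static initializer: `setup()` runs once per task before any `map()` call, and an exception thrown there is logged as a task failure instead of silently aborting class loading. The lifecycle can be mimicked without Hadoop or the AWS SDK; `fetchFromS3()` below is a hypothetical stand-in for the real download:

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of the per-mapper lifecycle: populate the lookup map once in
// setup(), then read it from map(). In a real job this class would extend
// Mapper<Object, Text, Text, Text>, setup() would take a Context, and
// fetchFromS3() would use AmazonS3ClientBuilder (assumed, not shown).
public class SetupPatternDemo {
    private Map<String, String> lookup;   // populated once per mapper instance

    void setup() {
        lookup = fetchFromS3();
    }

    // Hypothetical stand-in for downloading and parsing the S3 object.
    private Map<String, String> fetchFromS3() {
        Map<String, String> m = new HashMap<>();
        m.put("key1", "value1");
        return m;
    }

    String map(String key) {
        return lookup.getOrDefault(key, "miss");
    }

    public static void main(String[] args) {
        SetupPatternDemo mapper = new SetupPatternDemo();
        mapper.setup();                    // the framework calls this once per task
        System.out.println(mapper.map("key1"));
    }
}
```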

0 Answers:

There are no answers yet.