How do I upload a file to AWS S3 using the Camel aws-s3 producer?

Asked: 2015-09-01 15:24:33

Tags: java amazon-s3 apache-camel

I'm trying to upload a jpg file to an AWS S3 bucket with Camel's aws-s3 producer. Is this possible with this approach, and if so, how? Right now I only get an IOException and can't figure out what the next step should be. I know I could implement the upload with the TransferManager from the aws-sdk, but for now I'm only interested in Camel's aws-s3 endpoint.

Here is my route, using Camel 2.15.3:

import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.component.aws.s3.S3Constants;

// inside a RouteBuilder subclass:
@Override
public void configure() {
    from("file://src/data?fileName=file.jpg&noop=true&delay=15m")
        .setHeader(S3Constants.KEY, constant("CamelFile"))
        .to("aws-s3://<bucket-name>?region=eu-west-1&accessKey=<key>&secretKey=RAW(<secret>)");
}

And the exception I get when running the route:

com.amazonaws.AmazonClientException: Unable to create HTTP entity: Stream Closed
at com.amazonaws.http.HttpRequestFactory.newBufferedHttpEntity(HttpRequestFactory.java:244)
at com.amazonaws.http.HttpRequestFactory.createHttpRequest(HttpRequestFactory.java:122)
at com.amazonaws.http.AmazonHttpClient.executeHelper(AmazonHttpClient.java:415)
at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:273)
at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:3660)
at com.amazonaws.services.s3.AmazonS3Client.putObject(AmazonS3Client.java:1432)
at org.apache.camel.component.aws.s3.S3Producer.processSingleOp(S3Producer.java:209)
at org.apache.camel.component.aws.s3.S3Producer.process(S3Producer.java:71)
at org.apache.camel.util.AsyncProcessorConverterHelper$ProcessorToAsyncProcessorBridge.process(AsyncProcessorConverterHelper.java:61)
at org.apache.camel.processor.SendProcessor.process(SendProcessor.java:129)
at org.apache.camel.management.InstrumentationProcessor.process(InstrumentationProcessor.java:77)
at org.apache.camel.processor.RedeliveryErrorHandler.process(RedeliveryErrorHandler.java:448)
at org.apache.camel.processor.CamelInternalProcessor.process(CamelInternalProcessor.java:190)
at org.apache.camel.processor.Pipeline.process(Pipeline.java:118)
at org.apache.camel.processor.Pipeline.process(Pipeline.java:80)
at org.apache.camel.processor.CamelInternalProcessor.process(CamelInternalProcessor.java:190)
at org.apache.camel.component.file.GenericFileConsumer.processExchange(GenericFileConsumer.java:439)
at org.apache.camel.component.file.GenericFileConsumer.processBatch(GenericFileConsumer.java:211)
at org.apache.camel.component.file.GenericFileConsumer.poll(GenericFileConsumer.java:175)
at org.apache.camel.impl.ScheduledPollConsumer.doRun(ScheduledPollConsumer.java:174)
at org.apache.camel.impl.ScheduledPollConsumer.run(ScheduledPollConsumer.java:101)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.IOException: Stream Closed
at java.io.FileInputStream.readBytes(Native Method)
at java.io.FileInputStream.read(FileInputStream.java:246)
at com.amazonaws.services.s3.internal.RepeatableInputStream.read(RepeatableInputStream.java:167)
at com.amazonaws.internal.SdkFilterInputStream.read(SdkFilterInputStream.java:73)
at com.amazonaws.services.s3.internal.MD5DigestCalculatingInputStream.read(MD5DigestCalculatingInputStream.java:88)
at com.amazonaws.internal.SdkFilterInputStream.read(SdkFilterInputStream.java:73)
at com.amazonaws.event.ProgressInputStream.read(ProgressInputStream.java:151)
at java.io.FilterInputStream.read(FilterInputStream.java:107)
at org.apache.http.util.EntityUtils.toByteArray(EntityUtils.java:136)
at org.apache.http.entity.BufferedHttpEntity.<init>(BufferedHttpEntity.java:63)
at com.amazonaws.http.HttpRequestFactory.newBufferedHttpEntity(HttpRequestFactory.java:242)
... 27 more

1 Answer:

Answer 0 (score: 4):

I did some digging and found one solution. Judging by the stack trace, the AWS client tries to re-read the file stream while buffering the HTTP entity, by which point the stream has already been closed. The route works if you convert the file contents to a byte array before passing them to the aws-s3 endpoint, like this:

from("file://src/data?fileName=file.jpg&noop=true&delay=15m")
    .convertBodyTo(byte[].class)
    .setHeader(S3Constants.CONTENT_LENGTH, simple("${in.header.CamelFileLength}"))
    .setHeader(S3Constants.KEY,simple("${in.header.CamelFileNameOnly}"))
    .to("aws-s3://{{awsS3BucketName}}"
                    + "?deleteAfterWrite=false&region=eu-west-1"
                    + "&accessKey={{awsAccessKey}}"
                    + "&secretKey=RAW({{awsAccessKeySecret}})")
    .log("done.");
}

The S3Constants.CONTENT_LENGTH header must also be set to the file length in bytes, which is what the ${in.header.CamelFileLength} expression above does (the file consumer populates that header automatically).
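Note that the {{awsS3BucketName}}-style tokens are Camel property placeholders rather than literal values; they only resolve if a Properties component is registered on the CamelContext. A minimal sketch of that setup, assuming the values live in a classpath file named aws.properties (a hypothetical name, not from the original post):

import org.apache.camel.CamelContext;
import org.apache.camel.component.properties.PropertiesComponent;
import org.apache.camel.impl.DefaultCamelContext;

public class PropertiesSetup {
    public static void main(String[] args) throws Exception {
        CamelContext context = new DefaultCamelContext();

        // Register a Properties component so {{awsS3BucketName}} etc. resolve
        PropertiesComponent pc = new PropertiesComponent();
        pc.setLocation("classpath:aws.properties"); // hypothetical file name
        context.addComponent("properties", pc);

        // routes would be added here, then context.start()
    }
}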

The solution above reads the whole file into memory, so it isn't ideal for every situation. However, it's also the simplest way I know of to use the aws-s3 producer endpoint. I'm still happy to hear about other (and better) solutions.
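For larger files where buffering the whole payload in memory is a problem, the TransferManager approach mentioned in the question can be wired into a Camel processor instead. A minimal sketch, not from the original post; the class name and the <bucket-name>/<key>/<secret> placeholders follow the question's convention:

import java.io.File;

import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.services.s3.transfer.TransferManager;
import com.amazonaws.services.s3.transfer.Upload;

import org.apache.camel.Exchange;
import org.apache.camel.Processor;
import org.apache.camel.builder.RouteBuilder;

public class S3TransferManagerRoute extends RouteBuilder {

    // Placeholder credentials, same <key>/<secret> convention as the question
    private final TransferManager transferManager =
            new TransferManager(new BasicAWSCredentials("<key>", "<secret>"));

    @Override
    public void configure() {
        from("file://src/data?fileName=file.jpg&noop=true&delay=15m")
            .process(new Processor() {
                @Override
                public void process(Exchange exchange) throws Exception {
                    // Camel's type converter turns the GenericFile body into
                    // a java.io.File, so TransferManager streams the upload
                    // from disk instead of buffering it in memory
                    File file = exchange.getIn().getBody(File.class);
                    Upload upload = transferManager.upload(
                            "<bucket-name>", file.getName(), file);
                    upload.waitForCompletion(); // blocks until the upload finishes
                }
            });
    }
}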