I am trying to upload MP3 files using the Blobstore. The problem is that when I upload a file larger than about 2-3 MB, an IOException is thrown while closing the Closeable. It doesn't actually break any functionality, but can anyone help me figure out what causes this exception?
I've attached the exception below.
com.google.appengine.tools.development.DevAppEngineWebAppContext.handle(DevAppEngineWebAppContext.java:98)
at org.mortbay.jetty.handler.HandlerWrapper.handle(HandlerWrapper.java:152)
at com.google.appengine.tools.development.JettyContainerService$ApiProxyHandler.handle(JettyContainerService.java:491)
at org.mortbay.jetty.handler.HandlerWrapper.handle(HandlerWrapper.java:152)
at org.mortbay.jetty.Server.handle(Server.java:326)
at org.mortbay.jetty.HttpConnection.handleRequest(HttpConnection.java:542)
at org.mortbay.jetty.HttpConnection$RequestHandler.headerComplete(HttpConnection.java:923)
at org.mortbay.jetty.HttpParser.parseNext(HttpParser.java:547)
at org.mortbay.jetty.HttpParser.parseAvailable(HttpParser.java:212)
at org.mortbay.jetty.HttpConnection.handle(HttpConnection.java:404)
at org.mortbay.io.nio.SelectChannelEndPoint.run(SelectChannelEndPoint.java:409)
at org.mortbay.thread.QueuedThreadPool$PoolThread.run(QueuedThreadPool.java:582)
Caused by: java.io.IOException: An existing connection was forcibly closed by the remote host
at sun.nio.ch.SocketDispatcher.write0(Native Method)
at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:51)
at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
at sun.nio.ch.IOUtil.write(IOUtil.java:51)
at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:487)
at org.mortbay.io.nio.ChannelEndPoint.flush(ChannelEndPoint.java:169)
at org.mortbay.io.nio.SelectChannelEndPoint.flush(SelectChannelEndPoint.java:221)
at org.mortbay.jetty.HttpGenerator.flush(HttpGenerator.java:721)
... 31 more
Answer 0 (score: 0)
I got it working. I'm adding my code below.
public void getUploaded(@PathVariable String key, HttpServletRequest req,
        HttpServletResponse res, HttpSession session) throws IOException {
    try {
        // Look up the blob and its metadata (including its size) from the key.
        BlobKey blobKey = new BlobKey(key);
        BlobInfo blobInfo = new BlobInfoFactory().loadBlobInfo(blobKey);

        // Read the whole blob into memory, then copy it to the response.
        OutputStream output = res.getOutputStream();
        InputStream is = new ByteArrayInputStream(readData(blobKey, blobInfo.getSize()));
        byte[] buffer = new byte[1024];
        int length;
        while ((length = is.read(buffer)) != -1) {
            output.write(buffer, 0, length);
        }
        is.close();
        output.flush();
        output.close();
    } catch (Exception e) {
        e.printStackTrace();
    }
}
public byte[] readData(BlobKey blobKey, long blobSize) {
    BlobstoreService blobStoreService = BlobstoreServiceFactory.getBlobstoreService();
    byte[] allTheBytes = new byte[0];
    long amountLeftToRead = blobSize;
    long startIndex = 0;
    while (amountLeftToRead > 0) {
        // fetchData() can only return a limited number of bytes per call
        // (MAX_BLOB_FETCH_SIZE), so read the blob in chunks and concatenate them.
        long amountToReadNow = Math.min(
                BlobstoreService.MAX_BLOB_FETCH_SIZE - 1, amountLeftToRead);
        byte[] chunkOfBytes = blobStoreService.fetchData(blobKey,
                startIndex, startIndex + amountToReadNow - 1);
        allTheBytes = ArrayUtils.addAll(allTheBytes, chunkOfBytes);
        amountLeftToRead -= amountToReadNow;
        startIndex += amountToReadNow;
    }
    return allTheBytes;
}
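Since this approach buffers the entire MP3 in memory before writing it out, a possible variation (just a sketch under the same Blobstore APIs, not part of the original answer) is to write each fetched chunk straight to the response's OutputStream. The method name and the way it would be wired into a servlet or controller are assumptions for illustration.

// Sketch only: stream the blob to the response in MAX_BLOB_FETCH_SIZE-sized
// chunks instead of building one big byte[] first. The method name and its
// servlet/controller wiring are assumptions, not part of the answer above.
public void streamBlob(BlobKey blobKey, HttpServletResponse res) throws IOException {
    BlobstoreService blobStoreService = BlobstoreServiceFactory.getBlobstoreService();
    BlobInfo blobInfo = new BlobInfoFactory().loadBlobInfo(blobKey);
    OutputStream output = res.getOutputStream();

    long blobSize = blobInfo.getSize();
    long startIndex = 0;
    while (startIndex < blobSize) {
        // fetchData() takes inclusive start/end indexes and caps each call
        // at MAX_BLOB_FETCH_SIZE bytes.
        long endIndex = Math.min(startIndex + BlobstoreService.MAX_BLOB_FETCH_SIZE - 1,
                blobSize - 1);
        byte[] chunk = blobStoreService.fetchData(blobKey, startIndex, endIndex);
        output.write(chunk);
        startIndex = endIndex + 1;
    }
    output.flush();
}

Writing chunk by chunk keeps the memory footprint bounded by the chunk size rather than the file size, which matters for multi-megabyte uploads like the ones described in the question.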