I'm having a problem uploading a zip file to Azure for a WebJob.
When the zip file is smaller than 10 MB everything works, but when I try to upload a larger file I get this exception:
The stream does not support concurrent IO read or write operations.
Stack trace:
- at System.Net.ConnectStream.InternalWrite(Boolean async, Byte[] buffer, Int32 offset, Int32 size, AsyncCallback callback, Object state)
- at System.Net.ConnectStream.Write(Byte[] buffer, Int32 offset, Int32 size)
- at System.Net.WebClient.UploadBitsState.WriteBytes()
- at System.Net.WebClient.UploadBits(WebRequest request, Stream readStream, Byte[] buffer, Int32 chunkSize, Byte[] header, Byte[] footer, CompletionDelegate uploadCompletionDelegate, CompletionDelegate downloadCompletionDelegate, AsyncOperation asyncOp)
- at System.Net.WebClient.UploadFile(Uri address, String method, String fileName)
Here is my code:
var client = new MyWebClient
{
    Credentials = new NetworkCredential(webSite.UserName, webSite.Password),
};
client.Headers.Add(HttpRequestHeader.ContentType, "application/zip");
client.Headers.Add("Content-Disposition", $"attachment; filename={appFile}");
var response = client.UploadFile(uploadUri, "PUT", filePath);
where MyWebClient is a subclass of WebClient that I wrote because I needed to set a timeout:
private class MyWebClient : WebClient
{
    protected override WebRequest GetWebRequest(Uri uri)
    {
        WebRequest w = base.GetWebRequest(uri);
        w.Timeout = 20 * 60 * 1000;
        return w;
    }
}
Any idea where this size limit comes from? I'm sure it's client-side, and it doesn't depend on the .NET Framework version, because I tested several of them.
EDIT: The solution was to force AllowWriteStreamBuffering to false by casting the WebRequest to HttpWebRequest:
protected override WebRequest GetWebRequest(Uri uri)
{
    WebRequest w = base.GetWebRequest(uri);
    w.Timeout = 20 * 60 * 1000;
    var httpRequest = w as HttpWebRequest;
    if (httpRequest != null)
    {
        httpRequest.AllowWriteStreamBuffering = false;
    }
    return w;
}
Answer 0 (score: 1)
This behavior is documented in this blog entry, http://vikeed.blogspot.co.uk/2011/03/uploading-large-files-using-http-put-in.html, which suggests adding the following:
HttpWebRequest request = new ...
request.AllowWriteStreamBuffering = false;
A similar question was posted here, C# HttpClient PUT, which reports that using AllowAutoRedirect = false leads to this error.
You could try adjusting your settings as suggested in these two similar posts and see whether that fixes your error.
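For reference, a minimal sketch of the streaming PUT the blog entry describes, built directly on HttpWebRequest rather than WebClient. The uploadUri, filePath, and credential parameters are assumptions mirroring the question's code; with AllowWriteStreamBuffering disabled, ContentLength has to be set explicitly before writing the body:

```csharp
using System;
using System.IO;
using System.Net;

class UploadSketch
{
    static void Upload(Uri uploadUri, string filePath, string user, string password)
    {
        var request = (HttpWebRequest)WebRequest.Create(uploadUri);
        request.Method = "PUT";
        request.ContentType = "application/zip";
        request.Credentials = new NetworkCredential(user, password);
        request.Timeout = 20 * 60 * 1000;

        // Disable buffering so the body is streamed to the server instead of
        // being held in memory first; the length must then be declared up front.
        request.AllowWriteStreamBuffering = false;
        request.ContentLength = new FileInfo(filePath).Length;

        using (var fileStream = File.OpenRead(filePath))
        using (var requestStream = request.GetRequestStream())
        {
            // Copy the file to the request stream in chunks.
            fileStream.CopyTo(requestStream);
        }

        using (var response = (HttpWebResponse)request.GetResponse())
        {
            Console.WriteLine(response.StatusCode);
        }
    }
}
```

This avoids the WebClient.UploadFile code path from the stack trace entirely, which may be simpler than subclassing WebClient just to flip the buffering flag.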