I am handling file uploads with an HTTP handler (.ashx). When a file is selected, `uploadFile()` is called to save it to a temporary location. The file is then read, parsed, and stored in a (C#) DataTable, which is used to populate a jQuery DataTable.
Storing, opening, and reading the file with a StreamReader works fine. However, when the file is large it takes a very long time.
From some searching, the problem seems to be the parsing of each line that gets stored in the table. In response to another user with a similar problem, it was suggested to read each line and store it in a DB (he needed it in a DB). But I cannot adapt that suggestion to my needs, which are to store the rows in a DataTable and use that to populate the table.
Is there any way to speed this up? I have to display the parsed content in a jQuery DataTable.
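For context, `JsonConvert.SerializeObject` on a DataTable produces an array of row objects, and that is roughly how the client consumes it. A minimal sketch (the table id `#logTable` and the sample column names are hypothetical):

```javascript
// JsonConvert.SerializeObject(DataTable) yields an array of row objects.
// Sample payload below is hypothetical, matching the column names used server-side.
const json = '[{"ReqTimestamp":"2021-01-01 00:00:00","ReqDataLength":"12"}]';
const rows = JSON.parse(json);

// Derive the DataTables `columns` option from the first row's keys
const columns = Object.keys(rows[0]).map(function (k) {
    return { data: k, title: k };
});

// Browser-only; shown for context:
// $('#logTable').DataTable({ data: rows, columns: columns });
```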
This is what I currently have (`FilesUpload` is the ID of the file input control, wired to an `onChange` event).
The JS function:
var logFile = [];

function onChange(oFile) {
    // Collect metadata for each selected file, then upload once
    Array.prototype.forEach.call(oFile.files, function (file) {
        logFile.push({
            "id": 0,
            "fn": file.name,
            "fl": file.size,
            "fp": '',
            "ct": file.type
        });
    });
    uploadFile();
}
function uploadFile() {
    ...
    var uploadingfiles = $("#FilesUpload").get(0);
    var uploadedfiles = uploadingfiles.files;
    var formdata = new FormData();
    for (var i = 0; i < uploadedfiles.length; i++) {
        formdata.append("file", uploadedfiles[i]);
    }
    formdata.append("ToUpload", JSON.stringify(logFile));
    formdata.append("UploadFolder", currUploadFolder);
    $.ajax({
        url: '<%= ResolveUrl("../UploadHandler.ashx") %>',
        type: 'post',
        data: formdata,
        contentType: false,
        cache: false,
        dataType: 'json',
        processData: false
    }).done(function (result) {
        // With dataType 'json', jQuery has already parsed the response
        ...
        $("#lblSelectedFile").html(result.FileName);
    }).fail(function (jqXHR, textStatus, errorThrown) {
        ...
    });
}
The HTTP handler (UploadHandler.ashx):
public void ProcessRequest(HttpContext context)
{
    var MessageData = new object();
    if (context.Request.Files.Count > 0)
    {
        // Do something, save file in folder
        ....
        // Process the file
        string JSONresult = ProcessLogFile_BufferedStream(sUploadFolder, fn);
    }
    else
    {
    }
    ....
}
I think this is the part that takes too long, or the DataTable grows too large:
private string ProcessLogFile_BufferedStream(string Folder, string FileName)
{
    string JSONresult = string.Empty;
    if (string.IsNullOrEmpty(Folder) || string.IsNullOrEmpty(FileName))
        return JSONresult;

    DataTable dtIDCLog = new DataTable();
    dtIDCLog.Columns.Add("ReqTimestamp");
    dtIDCLog.Columns.Add("ReqDataLength");
    ...
    dtIDCLog.Columns.Add("RespTimestamp");
    dtIDCLog.AcceptChanges();

    // RegexOptions.Compiled can help when the pattern runs on every line
    Regex reAsciiPatern = new Regex(@"[^\u0000-\u007F]+", RegexOptions.Compiled);
    Regex ConParts = new Regex(@"^(.*?)\|(.*?)\|(.*?)\|(.*?)\|(.*?)\|(.*?)$", RegexOptions.Compiled);
    string sLine;
    string sTimestamp;
    int iLineNo = 0;
    using (FileStream fs = File.Open(Path.Combine(Folder, FileName), FileMode.Open, FileAccess.Read, FileShare.Read))
    using (BufferedStream bs = new BufferedStream(fs))
    using (StreamReader sr = new StreamReader(bs))
    {
        while ((sLine = sr.ReadLine()) != null)
        {
            if (!string.IsNullOrEmpty(sLine))
            {
                sLine = reAsciiPatern.Replace(sLine, ""); // remove non-ASCII chars
                DataRow drNew = dtIDCLog.NewRow();
                Match match = ConParts.Match(sLine);
                if (match.Success)
                {
                    // Request portion
                    sTimestamp = match.Groups[2].Value;
                    drNew["ReqTimestamp"] = sTimestamp;
                    string[] aReq = match.Groups[3].Value.Split(':'); // split once, not twice
                    string sReqLen = aReq[0];
                    string sReq = aReq[1].Replace("{", "").Replace("}", "");
                    drNew["ReqDataLength"] = sReqLen;
                    ...
                    ...
                    // Response portion
                    sTimestamp = match.Groups[4].Value;
                    drNew["RespTimestamp"] = sTimestamp;
                    ...
                    ...
                    dtIDCLog.Rows.Add(drNew);
                    iLineNo++;
                }
                else
                {
                    // Error: flag the row and stop; caller clears the table and shows an error
                    drNew["Success"] = false;
                    dtIDCLog.Rows.Add(drNew);
                    break;
                }
            }
        }
    }
    JSONresult = JsonConvert.SerializeObject(dtIDCLog);
    return JSONresult;
}
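To make the expected line shape concrete, here is the same per-line parse sketched in plain JavaScript. The sample line and the `length:{payload}` shape of field 3 are assumptions inferred from the C# code above, not the real log format:

```javascript
// Six lazy, pipe-delimited groups, mirroring the C# ConParts regex
const conParts = /^(.*?)\|(.*?)\|(.*?)\|(.*?)\|(.*?)\|(.*?)$/;

function parseLine(line) {
    const clean = line.replace(/[^\u0000-\u007F]+/g, ''); // strip non-ASCII, as in C#
    const m = conParts.exec(clean);
    if (m === null) return null; // mirrors the "Success = false" branch
    const reqParts = m[3].split(':'); // assumed "length:{payload}" shape
    return {
        ReqTimestamp: m[2],
        ReqDataLength: reqParts[0],
        ReqData: (reqParts[1] || '').replace(/[{}]/g, ''),
        RespTimestamp: m[4]
    };
}

// Hypothetical sample line with exactly six fields
const row = parseLine('a|2021-01-01 00:00:00|12:{payload}|2021-01-01 00:00:01|x|y');
```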