My log files look like this...
2009-12-18T08:25:22.983Z 1 174 dns:0-apr-credit-cards-uk.pedez.co.uk P http://0-apr-credit-cards-uk.pedez.co.uk/ text/dns #170 20091218082522021+89 sha1:AIDBQOKOYI7OPLVSWEBTIAFVV7SRMLMF - -
2009-12-18T08:25:22.984Z 1 5 dns:0-60racing.co.uk P http://0-60racing.co.uk/ text/dns #116 20091218082522037+52 sha1:WMII7OOKYQ42G6XPITMHJSMLQFLGCGMG - -
2009-12-18T08:25:23.066Z 1 79 dns:0-addiction.metapress.com.wam.leeds.ac.uk P http://0-addiction.metapress.com.wam.leeds.ac.uk/ text/dns #042 20091218082522076+20 sha1:NSUQN6TBIECAP5VG6TZJ5AVY34ANIC7R - -
...plus millions of other records
and I need to convert them into a CSV file...
"2009-12-18T08:25:22.983Z","1","174","dns:0-apr-credit-cards-uk.pedez.co.uk","P","http://0-apr-credit-cards-uk.pedez.co.uk/","text/dns","#170","20091218082522021+89","sha1:AIDBQOKOYI7OPLVSWEBTIAFVV7SRMLMF","-","-"
"2009-12-18T08:25:22.984Z","1","5","dns:0-60racing.co.uk","P","http://0-60racing.co.uk/","text/dns","#116","20091218082522037+52","sha1:WMII7OOKYQ42G6XPITMHJSMLQFLGCGMG","-","-"
"2009-12-18T08:25:23.066Z","1","79","dns:0-addiction.metapress.com.wam.leeds.ac.uk","P","http://0-addiction.metapress.com.wam.leeds.ac.uk/","text/dns","#042","20091218082522076+20","sha1:NSUQN6TBIECAP5VG6TZJ5AVY34ANIC7R","-","-"
The field delimiter can be a single space or a run of several spaces, and the records mix fixed-width and variable-width fields. That tends to confuse most of the CSV parsers I have found.
Ultimately I want to bcp these files into SQL Server, but bcp only lets you specify a single character as the field terminator (i.e. ' '), which breaks on the fixed-width fields.
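For context, here is a minimal sketch of the eventual bcp load, assuming the log has already been flattened to comma-delimited text as below (the Crawl.dbo.CrawlLog table and MYSERVER server are made-up placeholders, not part of my actual setup):

# hypothetical target table and server; -c = character mode, -t"," = comma field terminator, -T = trusted connection
bcp Crawl.dbo.CrawlLog in .\crawl.csv -c -t"," -S MYSERVER -T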
So far I am using PowerShell:
gc -ReadCount 10 -TotalCount 200 .\crawl_sample.log | foreach { ([regex]'([\S]*)\s+').matches($_) } | foreach {$_.Groups[1].Value}
which returns a stream of fields:
2009-12-18T08:25:22.983Z
1
174
dns:0-apr-credit-cards-uk.pedez.co.uk
P
http://0-apr-credit-cards-uk.pedez.co.uk/
text/dns
#170
20091218082522021+89
sha1:AIDBQOKOYI7OPLVSWEBTIAFVV7SRMLMF
-
-
2009-12-18T08:25:22.984Z
1
5
dns:0-60racing.co.uk
P
http://0-60racing.co.uk/
text/dns
#116
20091218082522037+52
sha1:WMII7OOKYQ42G6XPITMHJSMLQFLGCGMG
-
But how do I convert that output back into CSV format?
Answer 0 (score: 2)
Answering my own question again...
measure-command {
    # match runs of one or more spaces
    $q = [regex]" +"
    # read the whole file, re-join the lines with newlines, then collapse the space runs into commas
    $q.Replace( ([string]::join([environment]::newline, (Get-Content -ReadCount 1 .\crawl_sample2.log))), "," ) > crawl_sample2.csv
}
And it's fast!
Observations:
- Using \s+ as the regex delimiter tripped over the newlines once the lines were joined, so " +" is used instead (see the small demo below)
- Get-Content -ReadCount 1 streams single-line arrays into the regex
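A quick illustration of that first point, using a made-up two-line sample rather than the real log:

# \s+ also eats the line break between records; " +" leaves it alone
$joined = "field1 field2" + [environment]::newline + "field3 field4"
[regex]::Replace($joined, '\s+', ',')   # -> field1,field2,field3,field4  (record boundary lost)
[regex]::Replace($joined, ' +', ',')    # -> field1,field2 <newline> field3,field4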
Update
This script works, but it uses a huge amount of RAM when processing large files. So how can I do the same thing without using 8 GB of RAM and hitting swap?
I think this is caused by the join buffering all of the data again... any ideas?
Update 2
OK - got a better solution...
# stream the file in 100-line batches, unroll each batch into single lines,
# and collapse runs of spaces into commas on the way out
Get-Content -ReadCount 100 -TotalCount 100000 .\crawl.log |
    ForEach-Object { $_ } |
    ForEach-Object { $_ -replace " +", "," } > .\crawl.csv
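Note that this produces unquoted comma-separated values rather than the quoted form shown in the target sample above. If the quotes matter, a variation on the same streaming pattern might work (a sketch only; crawl_quoted.csv is a made-up output name):

# split each line on runs of spaces, then wrap every field in double quotes
Get-Content -ReadCount 100 .\crawl.log |
    ForEach-Object { $_ } |
    ForEach-Object { '"' + (($_ -split ' +') -join '","') + '"' } > .\crawl_quoted.csv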
A very handy PowerShell guide - Powershell regular expressions