I want to load a CSV file that looks like this:
Acct. No.,1-15 Days,16-30 Days,31-60 Days,61-90 Days,91-120 Days,Beyond 120 Days
2314134101,898.89,8372.16,5584.23,7744.41,9846.54,2896.25
2414134128,5457.61,7488.26,9594.02,6234.78,273.7,2356.13
2513918869,2059.59,7578.59,9395.51,7159.15,5827.48,3041.62
1687950783,4846.85,8364.22,9892.55,7213.45,8815.33,7603.4
2764856043,5250.11,9946.49,8042.03,6058.64,9194.78,8296.2
2865446086,596.22,7670.04,8564.08,3263.85,9662.46,7027.22
,4725.99,1336.24,9356.03,1572.81,4942.11,6088.94
,8248.47,956.81,8713.06,2589.14,5316.68,1543.67
,538.22,1473.91,3292.09,6843.89,2687.07,9808.05
,9885.85,2730.72,6876,8024.47,1196.87,1655.29
But as you can see, some of the fields are incomplete. I was hoping MySQL would simply skip the rows that are missing the first column. When I run the command:
LOAD DATA LOCAL INFILE 'test-long.csv' REPLACE INTO TABLE accounts
FIELDS TERMINATED BY ',' LINES TERMINATED BY '\r\n'
IGNORE 1 LINES
(cf_535, cf_580, cf_568, cf_569, cf_571, cf_572);
MySQL's output is:
Query OK, 41898 rows affected, 20948 warnings (0.78 sec)
Records: 20949 Deleted: 20949 Skipped: 0 Warnings: 20948
The file has only 20,949 rows, yet MySQL reports 41,898 rows affected. Why is that? Also, nothing in the table has actually changed, and I can't see what the warnings that were generated are. I want to use LOAD DATA INFILE because it takes Python about half a second to update each row, which for a file of over 20,000 records works out to roughly 2.77 hours.
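As an aside on the invisible warnings: MySQL keeps the warnings from the last statement in the session, so a standard way to inspect them (run in the same connection, immediately after the LOAD DATA) is:

```sql
SHOW WARNINGS LIMIT 10;
```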
UPDATE: I modified the code to set autocommit to "False" and added a db.commit() statement:
# Tell MySQLdb to turn off auto-commit
db.autocommit(False)
# Set count to 1 to skip the CSV header row
count = 1
while count < len(contents):
    if contents[count][0] != '':
        # Parameterized query: let the driver quote and escape the values
        cursor.execute("""
            UPDATE accounts
            SET cf_580 = %s, cf_568 = %s, cf_569 = %s, cf_571 = %s, cf_572 = %s
            WHERE cf_535 = %s""",
            (contents[count][1], contents[count][2], contents[count][3],
             contents[count][4], contents[count][5], contents[count][0]))
    count += 1
try:
    db.commit()
except MySQLdb.Error:
    db.rollback()
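If the per-row execute() calls are still too slow, most MySQL drivers accept a whole batch via cursor.executemany(). A minimal sketch of the parameter preparation, assuming contents holds the parsed CSV (header included, account number in column 0; the sample rows below are hypothetical stand-ins for the real file):

```python
# Hypothetical sample of the parsed CSV, header row first.
contents = [
    ["Acct. No.", "1-15 Days", "16-30 Days", "31-60 Days",
     "61-90 Days", "91-120 Days", "Beyond 120 Days"],
    ["2314134101", "898.89", "8372.16", "5584.23", "7744.41", "9846.54", "2896.25"],
    ["", "4725.99", "1336.24", "9356.03", "1572.81", "4942.11", "6088.94"],
]

# Build (cf_580..cf_572, cf_535) tuples, skipping the header and any
# row with a blank account number.
params = [
    (row[1], row[2], row[3], row[4], row[5], row[0])
    for row in contents[1:]
    if row[0] != ""
]

# With a live connection, one call replaces the whole loop:
# cursor.executemany("""UPDATE accounts
#     SET cf_580 = %s, cf_568 = %s, cf_569 = %s, cf_571 = %s, cf_572 = %s
#     WHERE cf_535 = %s""", params)
print(len(params))  # → 1: only complete rows survive
```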
Answer 0 (score: 2)
There are basically three problems here. Taking them in reverse order:
If your Python program can't execute fast enough even as a single transaction, you should at least have it edit/clean the data file before the import. If Acct. No. is the primary key, as seems reasonable, inserting the rows with a blank account number will either make the whole import fail or, if auto-numbering is enabled, import bogus data.
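Pre-cleaning the file is a few lines of Python. A sketch using the standard csv module, shown here on an in-memory sample; with the real files you would swap the StringIO objects for open("test-long.csv") and open of a cleaned output file (file names from the question):

```python
import csv
import io

# Hypothetical in-memory stand-in for test-long.csv.
raw = (
    "Acct. No.,1-15 Days,16-30 Days,31-60 Days,61-90 Days,91-120 Days,Beyond 120 Days\n"
    "2314134101,898.89,8372.16,5584.23,7744.41,9846.54,2896.25\n"
    ",4725.99,1336.24,9356.03,1572.81,4942.11,6088.94\n"
)

src, dst = io.StringIO(raw), io.StringIO()
reader, writer = csv.reader(src), csv.writer(dst)

writer.writerow(next(reader))        # keep the header row
for row in reader:
    if row and row[0].strip():       # drop rows missing the account number
        writer.writerow(row)

clean = dst.getvalue()
print(len(clean.splitlines()))  # → 2: header plus the one complete row
```

A side benefit: csv.writer emits "\r\n" line endings by default, matching the LINES TERMINATED BY '\r\n' clause in the LOAD DATA statement.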
Answer 1 (score: 0)
If you use the REPLACE keyword in LOAD DATA, the number after "Deleted:" tells you how many rows were actually replaced. Each replaced row counts twice toward "rows affected": once for the delete and once for the insert, which is why 20,949 records show up as 41,898 affected rows.
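The numbers from the question check out under that rule:

```python
# REPLACE counts each replaced row twice: one delete plus one insert.
records = 20949   # "Records:" from the LOAD DATA output
deleted = 20949   # "Deleted:" — every record replaced an existing row
rows_affected = records + deleted
print(rows_affected)  # → 41898, matching "41898 rows affected"
```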