Some records are not updated by a bulk update in PHP and MySQL

Time: 2015-09-05 18:23:31

Tags: php mysql

I have a PHP script that performs a bulk update. My database contains 90,585 rows, but only 85,282 of them get updated, and I don't know why. Here is my script:

//limit the select to 100
//Bulk update is very slow; if you set the limit too high the server crashes
$limit = 100;
//$messages = array();

$updated_posts = 0;

//maybe there are better solutions for this
for ($i=0;$i<=$totalMsg;$i+=$limit)
{

    $get_posts = mysqli_query($conn,
    "SELECT id_msg, body
    FROM " . $to_prefix ."messages
    WHERE id_msg != 0
    LIMIT " . $i ."," . $limit);

    //The post Array
    $messages = array();
    while($row = mysqli_fetch_assoc($get_posts))
    {
        $messages[$row['id_msg']] = array(
            'body' => fixBBCTags($row['body']),
        );
    }

    //update data!, good luck!!
    bulkUpdate($conn,$to_prefix."messages","id_msg","body",$messages);

    $updated_posts += mysqli_affected_rows($conn);

}// for loop

Here is the bulkUpdate() function:

function bulkUpdate($conn, $table, $id_column, $update_column, array &$idstovals)
{
    global $error;

    //prepare the bulk update SQL
    $sql = "update " . $table . " set " . $update_column . " = CASE " . $id_column;

    foreach($idstovals as $id => $val)
    {
        $sql .= " WHEN '" . $id . "' THEN '" . mysqli_real_escape_string($conn, $val[$update_column]) . "' \n";
    }

    $sql .= " END WHERE " . $id_column . " in (" . implode(',', array_keys($idstovals)) . ")";

    //reset the array
    //$idstovals = array();

    //try to run the bulk update
    $update = mysqli_query($conn, $sql);

    if(mysqli_error($conn))
        $error = mysqli_error($conn);
}
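
To make the mechanism concrete, here is a sketch of a call with two hypothetical messages, together with (as a comment) the rough shape of the statement the function builds from them:

// hypothetical input: two message ids mapped to their fixed bodies
$idstovals = array(
    15 => array('body' => 'first fixed body'),
    27 => array('body' => 'second fixed body'),
);
bulkUpdate($conn, $to_prefix . "messages", "id_msg", "body", $idstovals);

// the generated statement looks roughly like:
// update <prefix>messages set body = CASE id_msg
//  WHEN '15' THEN 'first fixed body'
//  WHEN '27' THEN 'second fixed body'
//  END WHERE id_msg in (15,27)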

All rows need to be updated. Is there a better solution?

Regards.

3 answers:

Answer 0 (score: 1)

When you write the same data to a row again, MySQL does not rewrite that record, so it is not counted among the affected rows:

MariaDB [test]> create table r (id integer, PRIMARY KEY (id) );
Query OK, 0 rows affected (0.15 sec)

MariaDB [test]> insert into r (id)  VALUES (1),(2),(3);
Query OK, 3 rows affected (0.00 sec)
Records: 3  Duplicates: 0  Warnings: 0

MariaDB [test]> update r set id=4 where id=1;
Query OK, 1 row affected (0.01 sec)
Rows matched: 1  Changed: 1  Warnings: 0

MariaDB [test]> update r set id=4 where id=4;
Query OK, 0 rows affected (0.00 sec)
Rows matched: 1  Changed: 0  Warnings: 0
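
In other words, mysqli_affected_rows() only counts rows whose value actually changed, so $updated_posts comes up short whenever fixBBCTags() returns the body unchanged; most likely the missing 5,303 rows were matched but left as they were. A minimal sketch of two ways to count matched rows instead, reusing the question's $conn and $sql (the connection credentials below are placeholders):

// Option 1: parse mysqli_info(), which reports
// "Rows matched: X  Changed: Y  Warnings: Z" after an UPDATE
mysqli_query($conn, $sql);
if (preg_match('/Rows matched: (\d+)/', mysqli_info($conn), $m)) {
    $matched_rows = (int) $m[1]; // rows hit by the WHERE clause, changed or not
}

// Option 2: open the connection with MYSQLI_CLIENT_FOUND_ROWS so that
// mysqli_affected_rows() reports matched rows instead of changed rows
$conn = mysqli_init();
mysqli_real_connect($conn, 'localhost', 'user', 'password', 'database',
                    3306, null, MYSQLI_CLIENT_FOUND_ROWS);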

Answer 1 (score: 1)

Here is my second attempt:

UPDATE updatetable u 
LEFT JOIN reftable r ON r.id = u.id
SET u.updatecol = r.refcol
WHERE r.id_msg !=0 ;

Or you build an array in PHP with the id (PRIMARY KEY) and the new values, and generate a bulk INSERT with ON DUPLICATE KEY UPDATE from it, like this:

INSERT INTO mytable (id,col) VALUES
(1,'new val'),
(2,'new val'),
(3,'new val'),
(66,'new val'),
...
(80000,'new val for id 80000')
ON DUPLICATE KEY UPDATE col= VALUES(col);
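
Using the question's own variables, that statement can be generated from the $messages array filled in the SELECT loop. A sketch of the idea, untested, assuming $conn, $to_prefix and the id_msg/body columns from the question:

$values = array();
foreach ($messages as $id_msg => $data) {
    // $messages already holds the fixBBCTags()-processed body keyed by id_msg
    $values[] = "(" . (int) $id_msg . ",'"
              . mysqli_real_escape_string($conn, $data['body']) . "')";
}

$sql = "INSERT INTO " . $to_prefix . "messages (id_msg, body) VALUES "
     . implode(',', $values)
     . " ON DUPLICATE KEY UPDATE body = VALUES(body)";

mysqli_query($conn, $sql);

Note that for INSERT ... ON DUPLICATE KEY UPDATE, mysqli_affected_rows() returns 1 for an inserted row, 2 for an updated row and 0 for an unchanged row, so the counting caveat from answer 0 still applies.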

Answer 2 (score: 0)

Sorry I'm late; here is my answer. This is a sample and I have not tested it. It is important that the column id_msg is a PRIMARY KEY or has a UNIQUE index.

//send the bulk INSERT in batches of 100 rows
//a single huge statement is slow and can crash the server
$limit = 100;
$messages = array();

$updated_posts = 0;

$get_posts = mysqli_query($conn,
"SELECT id_msg, body FROM " . $to_prefix . "messages WHERE id_msg != 0");

while($row = mysqli_fetch_assoc($get_posts))
{
    $messages[] = "('" . $row['id_msg'] . "','"
                . mysqli_real_escape_string($conn, fixBBCTags($row['body'])) . "')";
}

$sql_pre  = "INSERT INTO " . $to_prefix . "messages (id_msg,body) VALUES ";
$sql_post = " ON DUPLICATE KEY UPDATE body = VALUES(body)";

$sql = "";
$sep = "";
$cnt = 0;

foreach($messages as $msg)
{
    $sql .= $sep . $msg;
    $sep = ",";

    //flush the current batch every $limit rows
    if (++$cnt % $limit == 0) {
        mysqli_query($conn, $sql_pre . $sql . $sql_post);
        $sql = "";
        $sep = "";
    }
}

// Write the rest, if any
if ($sql != "") {
    mysqli_query($conn, $sql_pre . $sql . $sql_post);
}