Let's say I have a table named `groups` with these rows:
+----+---------+------+-------+
| id | user_id | name | color |
+----+---------+------+-------+
|  1 |       1 | foo  | green |
|  2 |       1 | bar  | red   |
|  3 |       1 | baz  |       |
|  4 |       2 | baz  | grey  |
|  5 |       3 | foo  | blue  |
|  6 |       3 | baz  |       |
|  7 |       4 | baz  | brown |
|  8 |       4 | foo  |       |
|  9 |       4 | qux  | black |
+----+---------+------+-------+
I want to read a CSV file and convert it into an array of data like this:
[
    [
        'user_id' => 2,
        'name' => 'foo'
    ],
    [
        'user_id' => 2,
        'name' => 'bar'
    ],
    [
        'user_id' => 2,
        'name' => 'baz'
    ],
    [
        'user_id' => 2,
        'name' => 'qux'
    ],
    [
        'user_id' => 2,
        'name' => 'tux'
    ],
]
and then insert only the new data into the database, skipping rows that already exist there (in this example, the group baz for user 2).
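The CSV-to-array step can be sketched as follows. This is a minimal example, not the actual decoder used in the question; the file path and the column order (`user_id` first, then `name`) are assumptions:

```php
<?php
// Minimal sketch: read a CSV file (assumed columns: user_id,name)
// into the array shape shown above.
function csvToRows(string $path): array
{
    $rows = [];
    $handle = fopen($path, 'r');
    while (($line = fgetcsv($handle)) !== false) {
        $rows[] = [
            'user_id' => (int) $line[0],
            'name'    => $line[1],
        ];
    }
    fclose($handle);
    return $rows;
}
```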
Eloquent has some helpful methods such as firstOrCreate() or findOrNew() that seem to be what I need, but these methods only work on a single record, so if I used them I would have to run them once per line of the file:
while ($line = $this->decoder->decode($this->file)) {
    $row = Group::firstOrCreate($line);
}
Is there a better solution that runs fewer queries?
Answer 0 (score: 2)
Use INSERT ... ON DUPLICATE KEY UPDATE:
INSERT INTO table (user_id,name) VALUES(1, "test") ON DUPLICATE KEY UPDATE name=VALUES(name)
For Laravel, check this question.
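A closely related variant is a single multi-row INSERT IGNORE, which also skips duplicates in one query (it requires a unique index on (user_id, name), which is an assumption here). The helper below is hypothetical; it only builds the SQL string and bindings, which you could then run with Laravel's DB::insert():

```php
<?php
// Hypothetical helper: build one multi-row INSERT IGNORE statement with
// positional placeholders, so duplicate rows (per the unique index) are
// silently skipped instead of causing errors.
function buildInsertIgnore(string $table, array $rows): array
{
    $columns = array_keys($rows[0]);
    // One "(?,?)" group per row.
    $group = '(' . implode(',', array_fill(0, count($columns), '?')) . ')';
    $sql = sprintf(
        'INSERT IGNORE INTO %s (%s) VALUES %s',
        $table,
        implode(',', $columns),
        implode(',', array_fill(0, count($rows), $group))
    );
    // Flatten the row values in column order to match the placeholders.
    $bindings = [];
    foreach ($rows as $row) {
        foreach ($columns as $col) {
            $bindings[] = $row[$col];
        }
    }
    return [$sql, $bindings];
}

[$sql, $bindings] = buildInsertIgnore('groups', [
    ['user_id' => 2, 'name' => 'foo'],
    ['user_id' => 2, 'name' => 'baz'],
]);
// $sql is: INSERT IGNORE INTO groups (user_id,name) VALUES (?,?),(?,?)
// In Laravel you could then call: DB::insert($sql, $bindings);
```

Note that INSERT IGNORE suppresses all ignorable errors, not just duplicate-key ones, whereas ON DUPLICATE KEY UPDATE is narrower in what it tolerates.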
Answer 1 (score: 1)
You can load the file directly into the table and ignore duplicate entries. All you need is a unique index on the column(s) that should have no duplicates. Then you do:
LOAD DATA INFILE 'file_name.csv'
IGNORE
INTO TABLE tbl_name
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
(user_id, @var)
SET name = do_some_voodoo(@var);
That's it. You can read more about this command here.
Test
/*sample data*/
shell> cat test.csv
Name|Value
Anna|hello
Ben |Test
East|whatever
Anna|This line should be ignored
Bob |
/*creating destination table in database*/
mysql> create table phpload(a varchar(50) primary key, b varchar(50));
Query OK, 0 rows affected (0.03 sec)
/*Test script*/
<?php
// Create connection
$con=mysqli_connect("localhost","root","r00t","playground");
// Check connection
if (mysqli_connect_errno()) {
echo "Failed to connect to MySQL: " . mysqli_connect_error();
}
$sql = "LOAD DATA LOCAL INFILE '/home/developer/test.csv' IGNORE INTO TABLE phpload FIELDS TERMINATED BY '|' (a, @b) SET b = UPPER(@b);";
$result = mysqli_query($con, $sql) or die(mysqli_error($con));
mysqli_close($con);
?>
/*executing*/
shell> php test.php
mysql> select * from phpload;
+------+----------+
| a | b |
+------+----------+
| Name | VALUE |
| Anna | HELLO |
| Ben | TEST |
| East | WHATEVER |
| Bob | |
+------+----------+
5 rows in set (0.00 sec)
Works just fine.