I want to echo the records from a large dataset of 300,000 rows.
For example, return the first 5,000 records, unset($data), and iterate like that until the end of the records in the MySQL table.
Something like this:
1)
for ($i = 0; $i < 5; $i++) {
    $data = openssl_random_pseudo_bytes(1000000);
    echo "peak_memory_usage = " . memory_get_peak_usage(true) . "\n";
    doSomething($data);
    //unset($data);
}
echo "for loop completed, memory now at " . memory_get_usage(true) . "\n";

function doSomething($data) {
    echo "size:" . strlen($data) . "\n";
}
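For what it's worth, the difference the commented-out unset($data) makes should show up directly in the peak numbers: without it, the new chunk is allocated while the old one is still assigned to $data, so the peak sits near two chunks; freeing first keeps it near one. A tiny variant to verify this, using the same functions as above:

for ($i = 0; $i < 5; $i++) {
    $data = openssl_random_pseudo_bytes(1000000);
    doSomething($data);
    unset($data); // release the ~1 MB chunk before the next allocation
    echo "peak: " . memory_get_peak_usage(true) . "\n";
}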
Or something along those lines?
2)
$nRows = $pdo->query('select count(*) from employees')->fetchColumn();
$users = new ArrayIterator(range(1, $nRows)); // $nRows is 3000000 test records
foreach(new LimitIterator($users, 0, 50000) as $u) {
echo $u, "\n";
}
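If the aim of approach 2 is to page through the actual rows rather than just row numbers, a LIMIT-based loop that frees each batch before fetching the next may be closer to the goal. A minimal sketch, assuming the $pdo connection from above and the user_details table/columns from option 3; the 5,000 batch size is illustrative:

$batchSize = 5000;
$offset = 0;

do {
    // $offset and $batchSize are plain integers, so sprintf('%d') keeps the
    // interpolated LIMIT clause safe without relying on bound placeholders.
    $sql = sprintf('SELECT user_id, username, first_name FROM user_details LIMIT %d, %d',
                   $offset, $batchSize);
    $rows = $pdo->query($sql)->fetchAll(PDO::FETCH_ASSOC);
    $fetched = count($rows);

    foreach ($rows as $row) {
        echo $row['username'] . " -- ID :" . $row['user_id'] . "\n";
    }

    unset($rows);            // free the current batch before loading the next
    $offset += $batchSize;
} while ($fetched === $batchSize); // a short batch means we reached the end

Each pass holds at most one batch in memory, which is the unset($data) pattern described at the top.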
OR
3) @Sameer, would you like to add your suggestion to the query below? I may have done something wrong when adding usleep (a flaw in my code), because adding usleep causes a timeout problem.
$data = $DB->query("SELECT * FROM user_details")->fetchAll();
foreach ($data as $row) {
echo $row['username']." -- ID :" .$row['user_id']. " -- FirstName :" .$row['first_name']. "<br />\n";
}
The third option 3) runs fine for 50,000 records and the load on RAM is not much, but how much load does it put on the CPU? Is there a way to optimize this to reduce the CPU load? Suppose 30 people run the same query; it would max out the CPU. If I add usleep(10), it echoes the records but throws a timeout error at the end.
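One way to attack both the CPU pressure and the timeout seen with usleep() is to stream the result instead of using fetchAll(): an unbuffered query hands rows over one at a time, and set_time_limit() removes the max_execution_time error. A sketch under the assumption that $DB is a PDO connection to MySQL; the 5,000-row flush interval is arbitrary:

set_time_limit(0); // avoid the max_execution_time error that usleep() was triggering

// Unbuffered mode: MySQL streams rows on demand instead of PHP copying
// the whole result set into memory the way fetchAll() does.
$DB->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false);

$stmt = $DB->query("SELECT * FROM user_details");
$i = 0;
while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
    echo $row['username'] . " -- ID :" . $row['user_id'] . " -- FirstName :" . $row['first_name'] . "<br />\n";
    if (++$i % 5000 === 0) {
        flush();      // push the output buffer to the client in chunks
        usleep(1000); // brief pause so concurrent users do not peg the CPU
    }
}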
Any suggestions would be highly appreciated.
Thank you very much for reading my post.
I stumbled upon an amazing solution for data loading by Dm4Web, but it needs an HTML table added and the results appended to it.
<!DOCTYPE html>
<html>
<head>
<title>SQL Batch List AJAX and jQuery</title>
<script src="http://ajax.googleapis.com/ajax/libs/jquery/1.11.0/jquery.min.js"></script>
</head>
<body>
<div id="mainform">
<h2>Fetch Records 5000 at a time</h2>
<div id="listData">
<div>
<input id="load" name="load" type="button" value ="Load Data">
<input id="cancel" name="cancel" type="button" value ="Cancel">
</div>
</div>
</div>
</body>
<script>
// counter that allows you to get a new set of rows
var step = 0;
// set variable if you want to restrict the number of rows will be loaded
var maxStep = 0;
// how many rows should be returned
var count = 5000;
// if the cancel button is pressed
var cancel = false;
$(function() {
$('#load').click(function(){
getData();
})
$('#cancel').click(function(){
cancel = true;
})
});
function getData()
{
step++;
//If cancel variable is set to true stop new calls
if(cancel == true) return;
// checks if the variable is set and limits how many rows to be fetched
if(maxStep > 0 && step >= maxStep) return;
$.post('ajax.php'
,{
'step':step,
'count':count
}
,function(data, textStatus, jqXHR){
if(textStatus == "success")
alert("Data: " + data);
/* to append the rows to the page instead of alert(), something like:
data.forEach(function(row) {
    $('#listData').append(row.username + " -- ID :" + row.user_id + " -- FirstName :" + row.first_name + "<br />\n");
}); */
if(textStatus == "error")
alert("Error: " + jqXHR.status + ": " + jqXHR.statusText);
// when it finishes processing the data, call back function
getData();
}
,'json'
)
}
</script>
</html>
==== ajax.php =====
$step = 0;
if(isset($_POST['step'])) $step = (int)$_POST['step'];
$count = 0;
if(isset($_POST['count'])) $count = (int)$_POST['count'];
if($step>0 and $count>0)
{
$offset = ($step-1) * $count;
$limit = $offset.','.$count;
// --------------
// your code here
// --------------
$data = $DB->query("SELECT * FROM user_details LIMIT " . $limit)->fetchAll();
$arr_result = array();
foreach ($data as $row) {
$arr_result[] = $row;
}
$arr_result_enc = json_encode($arr_result);
echo $arr_result_enc;
// echo rows
//echo json_encode($rows);
}
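As a side note on ajax.php, concatenating $limit into the query string is easy to get wrong (the stray dot inside the original string literal is one symptom). Since $step and $count are already cast to int, the endpoint can be condensed; a sketch assuming $DB is a PDO connection:

$step  = isset($_POST['step'])  ? (int)$_POST['step']  : 0;
$count = isset($_POST['count']) ? (int)$_POST['count'] : 0;

if ($step > 0 && $count > 0) {
    $offset = ($step - 1) * $count;
    // both values are sanitized integers, so interpolating them is safe here
    $data = $DB->query("SELECT * FROM user_details LIMIT $offset, $count")
               ->fetchAll(PDO::FETCH_ASSOC);

    header('Content-Type: application/json'); // the jQuery caller expects 'json'
    echo json_encode($data);
}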
Method 4)
$query = "SELECT COUNT(*) as num FROM employees";
//$select_run = mysqli_query($conn, $select);
$result = mysqli_query($conn, $query) or die(mysqli_error($conn));
$row = mysqli_fetch_array($result);
$itemcount = $row['num']; // Roughly 300,000 total items
$batches = ceil($itemcount / 2500); // Number of while-loop calls - around 120.
for ($i = 0; $i < $batches; $i++) {
    $offset = $i * 2500; // MySQL LIMIT offset number
    $query = "SELECT first_name,last_name FROM employees LIMIT $offset, 2500";
$result = mysqli_query($conn,$query) or die(mysqli_error($conn));
while ($row = mysqli_fetch_array($result)) {
echo $row['first_name'];
}
echo "<BR>";
echo "Run Number: ".$i."<br />";
echo "<BR>";
}
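If the batching in method 4 is only there to keep memory down, mysqli can also stream the entire table in one query: MYSQLI_USE_RESULT fetches rows from the server as you read them instead of buffering all 300,000 in PHP. A sketch reusing the $conn connection, with the same columns as method 4:

// MYSQLI_USE_RESULT streams rows from the server on demand rather than
// buffering the full result set in PHP (the default MYSQLI_STORE_RESULT).
$result = mysqli_query($conn, "SELECT first_name, last_name FROM employees", MYSQLI_USE_RESULT)
          or die(mysqli_error($conn));

while ($row = mysqli_fetch_assoc($result)) {
    echo $row['first_name'] . " " . $row['last_name'] . "<br />\n";
}

mysqli_free_result($result); // must be freed before running another query on $conn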
Answer 0 (score: 1)
$data is already being overwritten on each iteration, so it is not the problem here.
A heavy loop puts constant strain on the server, which drives up the load. You can add a sleep of a few microseconds to free up server resources and give the server some breathing time, which will lower the load. Use usleep and tune the number of microseconds.
for ($i=0; $i < 5; $i++) {
usleep(100);
$data = openssl_random_pseudo_bytes(1000000);
}
Answer 1 (score: 0)
An idea for solving your third attempt: