Writing large volumes of data to files while avoiding memory exhaustion and response timeouts!
There are many ways to process data; this post just records the solutions I used when I ran into memory exhaustion. If you have better suggestions, feel free to leave a comment!
Method 1: no memory exhaustion
a. Use a WHERE condition on the id column
// Export the table page by page using an id range, so only one page is in memory at a time.
// $tableName comes from the surrounding code (the log table being exported).
$beginIndex = 0;
$pageLimit  = 50000;
$endIndex   = $beginIndex + $pageLimit;

// select() returns an empty array once an id window contains no rows, which ends the loop.
// This assumes ids are roughly contiguous; a gap wider than one page would stop the export early.
while ($pageData = \DB::connection('hub_log')->select("SELECT * FROM `{$tableName}` WHERE id > {$beginIndex} AND id <= {$endIndex}")) {
    $arr = '';
    foreach ($pageData as $row) {
        $arr .= $row->Content . "\n";
    }
    // Append this page to the output file, then move the id window forward.
    file_put_contents(storage_path('mcd_rawdata/' . $tableName . '.txt'), $arr, FILE_APPEND);
    $beginIndex += $pageLimit;
    $endIndex   += $pageLimit;
}
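One caveat with the loop above: it stops as soon as an id window comes back empty, so a gap in ids wider than one page would end the export early. Below is a minimal sketch of a gap-tolerant variant, under the same assumptions as the original (the hub_log connection, a $tableName from the surrounding code, and the same output path); it bounds the loop by the table's maximum id instead of by the first empty page:

// Hypothetical variant: loop up to MAX(id) so empty id windows don't end the export early.
$maxRow     = \DB::connection('hub_log')->select("SELECT MAX(id) AS max_id FROM `{$tableName}`");
$maxId      = $maxRow ? (int) $maxRow[0]->max_id : 0;
$beginIndex = 0;
$pageLimit  = 50000;

while ($beginIndex < $maxId) {
    $endIndex = $beginIndex + $pageLimit;
    $pageData = \DB::connection('hub_log')->select(
        "SELECT * FROM `{$tableName}` WHERE id > {$beginIndex} AND id <= {$endIndex}"
    );

    $arr = '';
    foreach ($pageData as $row) {
        $arr .= $row->Content . "\n";
    }
    // An empty window just advances the cursor; only non-empty pages are appended.
    if ($arr !== '') {
        file_put_contents(storage_path('mcd_rawdata/' . $tableName . '.txt'), $arr, FILE_APPEND);
    }
    $beginIndex = $endIndex;
}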
b. Use chunk() from the Laravel framework
// Stream the table in chunks of 10,000 rows; chunk() keeps only one chunk in memory at a time.
$time        = date('Ymd');
$public_path = $time . 'test.txt';

DB::table('20171220')->orderBy('id')->chunk(10000, function ($data) use ($public_path) {
    $arr = '';
    foreach ($data->toArray() as $key => $value) {
        $arr .= $value->Content . "\n";
    }
    // Append the current chunk to the output file and free the buffers before the next chunk.
    file_put_contents(public_path($public_path), $arr, FILE_APPEND);
    unset($data);
    unset($arr);
});
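A related option, not used in the original post: if your Laravel version provides Query\Builder::chunkById(), it pages on the id column itself rather than by offset, which keeps pages stable even while rows are being inserted. A minimal sketch against the same 20171220 table and Content column:

$public_path = date('Ymd') . 'test.txt';

// chunkById() keys each page on the id column; the id must be selected for it to work.
DB::table('20171220')->select('id', 'Content')->chunkById(10000, function ($rows) use ($public_path) {
    $arr = '';
    foreach ($rows as $row) {
        $arr .= $row->Content . "\n";
    }
    file_put_contents(public_path($public_path), $arr, FILE_APPEND);
});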
Method 2: causes memory exhaustion
public function getData($page = 1)
{
    set_time_limit(0);
    ini_set('memory_limit', -1);
    $time        = date('Ymd', time());
    $public_path = $time . '.txt';
    DB::connection()->disableQueryLog();

    // Note: without an orderBy('id'), "id > offset" plus limit() is not a reliable page window.
    $data = DB::table('20171220')->select('Content')->where('id', '>', ($page - 1) * 50000)->limit(50000)->get();

    if (count($data)) {
        $arr = '';
        foreach ($data as $key => $value) {
            $arr .= json_encode(json_decode($value->Content, true)) . "\n";
        }
        if (!file_exists(public_path($time))) {
            mkdir(public_path($time), 0777);
        }
        file_put_contents(public_path($public_path), $arr . "\n", FILE_APPEND);
        unset($data);
        unset($arr);
        // Each recursive call keeps its caller's stack frame alive, so memory keeps growing page after page.
        $this->getData(++$page);
    } else {
        dd('ok');
    }
}
This work is licensed under the CC License; reposts must credit the author and link back to this article.
Method 2 recurses without ever releasing the corresponding large variables along the way; it would be strange if it didn't overflow.
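To illustrate that point, here is a rough sketch of method 2's paging driven by a plain loop instead of recursion, so no call frames accumulate. It reuses the same table, column, and output path from the post; the added orderBy('id') is my own assumption to make each limit() window deterministic, and the paging still assumes roughly contiguous ids, as the original does:

public function getData()
{
    set_time_limit(0);
    $time        = date('Ymd');
    $public_path = $time . '.txt';
    DB::connection()->disableQueryLog();

    $page = 1;
    do {
        // Same id-offset paging as method 2, but iterated instead of recursed.
        $data = DB::table('20171220')->select('Content')
            ->where('id', '>', ($page - 1) * 50000)
            ->orderBy('id')
            ->limit(50000)
            ->get();

        $arr = '';
        foreach ($data as $value) {
            $arr .= json_encode(json_decode($value->Content, true)) . "\n";
        }
        if ($arr !== '') {
            file_put_contents(public_path($public_path), $arr, FILE_APPEND);
        }

        unset($arr);
        $page++;
    } while (count($data) > 0);
}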
Have a look at Query\Builder::chunk().