Is it possible to save all of a website's content with curl for offline use?
Basically grab all the HTML, CSS, JS, images, audio, Flash, etc...
<?php
$curl = curl_init();
curl_setopt_array($curl, array(
    CURLOPT_RETURNTRANSFER => 1,
    CURLOPT_USERAGENT => '',
    CURLOPT_URL => 'http://edition.cnn.com/'
));
$resp = curl_exec($curl);
echo $resp;
curl_close($curl);
?>
I get the site's content back, but how do I save it as separate files?
Thanks!
Answer (score: 1)
<?php
include_once('simplehtmldom/simple_html_dom.php');
//set up curl
$ch = curl_init('http://example.com');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$curl_html = curl_exec($ch); //use curl to get data from example.com
//use simplehtmldom to parse the site into a dom-like object
$html = str_get_html($curl_html);
echo $html; //the dom object's __toString() prints the parsed HTML
?>
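To address the original question of saving the content as separate files, the fetched HTML can be written out with file_put_contents() and the asset URLs collected from the parsed DOM and downloaded one by one. Below is a minimal sketch building on the answer above; the offline/ directory name and the relative-URL handling are assumptions, and anything referenced from within CSS or loaded by JavaScript will be missed.
<?php
// Minimal sketch, not a full mirroring tool: assumes an offline/ output
// directory and simple absolute or relative asset URLs.
include_once('simplehtmldom/simple_html_dom.php');

$base = 'http://example.com';
$dir  = 'offline';                       // assumed output directory
if (!is_dir($dir)) {
    mkdir($dir, 0755, true);
}

//fetch the page with curl, as in the answer above
$ch = curl_init($base);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$page = curl_exec($ch);
curl_close($ch);

//save the raw HTML itself as a file
file_put_contents($dir . '/index.html', $page);

//parse the page and collect the URLs of images, stylesheets and scripts
$dom  = str_get_html($page);
$urls = array();
foreach ($dom->find('img[src]') as $img)             { $urls[] = $img->src; }
foreach ($dom->find('link[rel=stylesheet]') as $css) { $urls[] = $css->href; }
foreach ($dom->find('script[src]') as $js)           { $urls[] = $js->src; }

//download each asset and store it under its basename
foreach (array_unique($urls) as $url) {
    if (strpos($url, 'http') !== 0) {
        //naive handling of relative URLs: prefix them with the base URL
        $url = rtrim($base, '/') . '/' . ltrim($url, '/');
    }
    $data = @file_get_contents($url);
    if ($data !== false) {
        $name = basename(parse_url($url, PHP_URL_PATH));
        if ($name !== '') {
            file_put_contents($dir . '/' . $name, $data);
        }
    }
}
?>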