I have a PHP script that uses cURL to retrieve the content of web pages, in which I want to check for the presence of some text.
Right now it looks like this:
for ($i = 0; $i < $num_target; $i++) {
    $ch = curl_init();
    $timeout = 10;
    curl_setopt($ch, CURLOPT_URL, $target[$i]);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_FORBID_REUSE, true);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, $timeout);
    $content = curl_exec($ch); // the page body, not the URL
    curl_close($ch);
    // $text must be a valid regex with delimiters, e.g. '/some text/'
    if (preg_match($text, $content, $match)) {
        $matches[$i] = $match[0];
        echo "text " . $text . " found in URL: " . $target[$i] . ": " . $match[0];
    } else {
        $matches[$i] = null;
        echo "text " . $text . " not found in URL: " . $target[$i] . ": no match";
    }
}
I was wondering whether there is a special cURL setup that would make this faster (I looked through the PHP manual and chose the options that seemed best to me, but I may have missed some that could improve the script's speed and performance).
I was also wondering whether CGI, Perl, or Python (or another solution) could be faster than PHP.
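For reference, here is one direction I have been considering: since the pages are independent, they could be fetched in parallel with PHP's `curl_multi_*` functions instead of one at a time, which should matter far more than any single-handle option. A minimal sketch, assuming the URLs come in as a plain array (the `fetch_all` helper name is just for illustration):

```php
<?php
// Sketch: fetch several URLs concurrently with curl_multi.
// Returns an array of page bodies keyed like the input array.
function fetch_all(array $urls, int $timeout = 10): array
{
    $mh = curl_multi_init();
    $handles = [];
    foreach ($urls as $i => $url) {
        $ch = curl_init();
        curl_setopt($ch, CURLOPT_URL, $url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, $timeout);
        curl_setopt($ch, CURLOPT_TIMEOUT, $timeout);
        curl_multi_add_handle($mh, $ch);
        $handles[$i] = $ch;
    }

    // Drive all transfers until every handle has finished.
    do {
        $status = curl_multi_exec($mh, $running);
        if ($running) {
            curl_multi_select($mh); // wait for activity instead of busy-looping
        }
    } while ($running && $status === CURLM_OK);

    $pages = [];
    foreach ($handles as $i => $ch) {
        $pages[$i] = curl_multi_getcontent($ch);
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);
    return $pages;
}
```

The idea would then be to call `fetch_all($target)` once and run the `preg_match` check over the returned bodies in a plain loop, so the network time is paid once for the whole batch rather than per URL.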
Thank you in advance for any help / advice / suggestion.