I wrote a PHP script that runs through a text file (actually a 'list' file from IMDb) and stores its contents in my local MySQL database:
public static function updateMovies( $list ) {
    $offset = 15; // movies.list starts with movie names at line 16
    $handle = fopen($list, "r") or die("Couldn't get handle");
    if ($handle) {
        while (!feof($handle)) {
            $buffer = fgets($handle);
            if ($buffer === false) {
                break; // fgets returns false on the final feof() pass
            }
            if ($offset != 0) {
                $offset--;
            } else if ($buffer[0] != '"') {
                $title = trim(substr($buffer, 0, strpos($buffer, '(')));
                $year  = intval(trim(substr($buffer, strpos($buffer, '(') + 1, 4)));
                Movie::create($title, $year);
            }
        }
        fclose($handle);
    }
}
Since those list files are up to 200 MB, this takes a lot of time. By default, PHP's MAX_EXECUTION_TIME is set to 30 seconds.
I set this value to 300 just to see whether it works. For example, my 'movies.list' file is around 80 MB, and running this script for 300 seconds created around 25,000 rows in my database. That is nowhere near enough: it had not even reached the movies starting with 'B'.
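(For reference, this is the kind of change I mean; whether it goes through php.ini or, as sketched here, a call at the top of the script should not matter, and 300 is just my test value:)

// raise the per-request limit for this script only; some hosts disallow this
ini_set('max_execution_time', '300');
// or equivalently:
set_time_limit(300);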
I know I can set MAX_EXECUTION_TIME to 0 (unlimited), but in the future I don't want this database on my localhost. I want it on my web server, and as far as I know my web server host's MAX_EXECUTION_TIME is set to 90.
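One idea I have been toying with (not sure whether it is the right approach) is to make the import resumable: save the byte offset after each run and continue from there on the next run, so every invocation stays within the time limit. A rough sketch, reusing Movie::create from above; the $offsetFile parameter and the 20000-lines-per-run value are just placeholders I made up:

public static function updateMoviesChunk( $list, $offsetFile, $maxLines = 20000 ) {
    // resume from the byte position saved by the previous run (0 on the first run)
    $start = file_exists($offsetFile) ? (int) file_get_contents($offsetFile) : 0;

    $handle = fopen($list, "r") or die("Couldn't get handle");
    fseek($handle, $start);

    $lines = 0;
    while (!feof($handle) && $lines < $maxLines) {
        $buffer = fgets($handle);
        if ($buffer === false) {
            break;
        }
        $lines++;
        if ($start === 0 && $lines <= 15) {
            continue; // skip the 15 header lines, but only on the very first run
        }
        if ($buffer[0] != '"' && strpos($buffer, '(') !== false) {
            $title = trim(substr($buffer, 0, strpos($buffer, '(')));
            $year  = intval(trim(substr($buffer, strpos($buffer, '(') + 1, 4)));
            Movie::create($title, $year);
        }
    }

    // remember where this run stopped so the next run can pick up there
    file_put_contents($offsetFile, ftell($handle));
    fclose($handle);
}

The script would then be called repeatedly (for example via cron or just refreshing the page) until ftell() reaches the end of the file. But that feels like a workaround rather than a solution.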
Any ideas how you would handle this?