I'm having a really hard time trying to import a large CSV file into MySQL on localhost.
The CSV is about 55 MB and has about 750,000 rows.
I've rewritten the script so that it parses the CSV and dumps the rows one by one.
Here's the code:
$row = 1;
if (($handle = fopen("postal_codes.csv", "r")) !== FALSE)
{
    while (($data = fgetcsv($handle, 1000, ",")) !== FALSE)
    {
        $num = count($data);
        $row++;
        for ($c = 0; $c < $num; $c++)
        {
            // Each CSV field is itself pipe-delimited
            $arr = explode('|', $data[$c]);
            $postcode  = mysql_real_escape_string($arr[1]);
            $city_name = mysql_real_escape_string($arr[2]);
            $city_slug = mysql_real_escape_string(toAscii($city_name));
            $prov_name = mysql_real_escape_string($arr[3]);
            $prov_slug = mysql_real_escape_string(toAscii($prov_name));
            $prov_abbr = mysql_real_escape_string($arr[4]);
            $lat       = mysql_real_escape_string($arr[6]);
            $lng       = mysql_real_escape_string($arr[7]);
            // One INSERT per row parsed
            mysql_query("insert into cities (`postcode`, `city_name`, `city_slug`, `prov_name`, `prov_slug`, `prov_abbr`, `lat`, `lng`)
                values ('$postcode', '$city_name', '$city_slug', '$prov_name', '$prov_slug', '$prov_abbr', '$lat', '$lng')") or die(mysql_error());
        }
    }
    fclose($handle);
}
The problem is that it's taking forever to execute. Any suggested solutions would be appreciated.
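My suspicion is that firing one INSERT per row is the bottleneck, since each of the ~750,000 queries pays its own round-trip and commit cost. From what I've read, batching many rows into a single multi-row INSERT is supposed to help a lot. Here is an untested sketch of what I mean, using the same table, columns, and mysql_* setup as above (the $batchSize value of 500 is just a guess to tune):

$batchSize = 500;   // assumption: rows per INSERT, tune to taste
$values = array();

if (($handle = fopen("postal_codes.csv", "r")) !== FALSE)
{
    while (($data = fgetcsv($handle, 1000, ",")) !== FALSE)
    {
        foreach ($data as $field)
        {
            // Same pipe-delimited parsing and escaping as before
            $arr = explode('|', $field);
            $postcode  = mysql_real_escape_string($arr[1]);
            $city_name = mysql_real_escape_string($arr[2]);
            $city_slug = mysql_real_escape_string(toAscii($city_name));
            $prov_name = mysql_real_escape_string($arr[3]);
            $prov_slug = mysql_real_escape_string(toAscii($prov_name));
            $prov_abbr = mysql_real_escape_string($arr[4]);
            $lat       = mysql_real_escape_string($arr[6]);
            $lng       = mysql_real_escape_string($arr[7]);

            // Accumulate a VALUES tuple instead of running a query immediately
            $values[] = "('$postcode', '$city_name', '$city_slug', '$prov_name', '$prov_slug', '$prov_abbr', '$lat', '$lng')";

            // Flush one multi-row INSERT per batch
            if (count($values) >= $batchSize)
            {
                mysql_query("insert into cities (`postcode`, `city_name`, `city_slug`, `prov_name`, `prov_slug`, `prov_abbr`, `lat`, `lng`) values " . implode(',', $values)) or die(mysql_error());
                $values = array();
            }
        }
    }
    // Flush whatever is left over after the loop
    if (count($values) > 0)
    {
        mysql_query("insert into cities (`postcode`, `city_name`, `city_slug`, `prov_name`, `prov_slug`, `prov_abbr`, `lat`, `lng`) values " . implode(',', $values)) or die(mysql_error());
    }
    fclose($handle);
}

I've also seen MySQL's LOAD DATA INFILE recommended for imports like this, since it skips the PHP loop entirely, but I haven't tried it yet.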