I'll share a solution you can use if you have big CSVs and you don't want to use much of your machine's RAM (imagine each CSV is 1 GB, for example).
<?php
function joinFiles(array $files, $result) {
    // The `array` type hint already rejects non-arrays, so no manual check is needed.
    $wH = fopen($result, "w+");
    if ($wH === false) {
        throw new Exception("Unable to open `$result` for writing");
    }
    foreach ($files as $file) {
        $fh = fopen($file, "r");
        if ($fh === false) {
            throw new Exception("Unable to open `$file` for reading");
        }
        // Copy one line at a time so memory usage stays constant,
        // no matter how large the source file is.
        while (!feof($fh)) {
            $line = fgets($fh);
            if ($line !== false) {
                fwrite($wH, $line);
            }
        }
        fclose($fh);
        unset($fh);
        fwrite($wH, "\n"); // usually the last line doesn't end with a newline
    }
    fclose($wH);
    unset($wH);
}
Usage:
<?php
joinFiles(array('join1.csv', 'join2.csv'), 'join3.csv');
Fun fact:
I just used this to concatenate two CSV files of ~500,000 lines each. It took around 5 seconds and used 512 KB of memory.
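If you want to check those numbers on your own machine, here is a minimal sketch using PHP's built-in microtime() and memory_get_peak_usage(); the file names are just placeholders:
<?php
// Hypothetical benchmark; replace the file names with your own CSVs.
$start = microtime(true);
joinFiles(array('join1.csv', 'join2.csv'), 'join3.csv');
printf(
    "Took %.2f seconds, peak memory %d KB\n",
    microtime(true) - $start,
    memory_get_peak_usage(true) / 1024
);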
Logic:
Open each file, read one line, and write it to the output file. Yes, writing line by line may be slower than writing a whole buffer, but it lets you process heavy files while being gentle on the machine's memory. At any point you are safe, because the script only reads one line at a time and then writes it.
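As a side note, if you don't need per-line control, the same streaming idea can be expressed with PHP's built-in stream_copy_to_stream(), which copies in internal chunks rather than lines. This is a sketch of a variant, not the function above:
<?php
// Alternative sketch: chunked copy with stream_copy_to_stream().
// Also memory-friendly, since the whole file is never loaded at once.
function joinFilesStream(array $files, $result) {
    $wH = fopen($result, "w");
    foreach ($files as $file) {
        $fh = fopen($file, "r");
        stream_copy_to_stream($fh, $wH); // chunked copy, constant memory
        fclose($fh);
        fwrite($wH, "\n"); // keep the newline between files, as above
    }
    fclose($wH);
}
Either way, the key design choice is the same: stream the data through a small buffer instead of reading whole files into memory.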
Enjoy!