How would you efficiently extract data from a CSV file which is several gigabytes in size?
Correct:
--> fopen() and fgetcsv()
--> fopen(), fgets() and explode()
Either of these solutions works well. For a very large file, loading the whole thing into memory with file_get_contents() or something similar would make PHP run out of memory (or at least use up a lot of it!), so it's better to use a file-pointer-based approach and read the file one line at a time, as in the sketch below.
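As a minimal sketch of the fgetcsv() approach (the file name and the commented-out process() call are placeholders, not part of the question):

<?php
// Open the file with a file pointer rather than reading it all at once.
$handle = fopen('large-data.csv', 'r');
if ($handle === false) {
    die('Unable to open file');
}

// fgetcsv() reads and parses one line per call, so only the current row
// is held in memory regardless of how big the file is.
while (($row = fgetcsv($handle)) !== false) {
    // $row is an array of the fields on the current line.
    // process($row);
}

fclose($handle);

The fgets()/explode() variant works the same way, except you read the raw line with fgets() and split it on the delimiter yourself; fgetcsv() is usually preferable because it also handles quoted fields containing commas.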