I’m processing large files with PHP and have some concerns about the script’s memory usage.
For example, if I trim() the contents of a 300 MB file, does PHP know to stream it internally, or does it actually load the whole thing into memory to process it?
Is any optimization achieved by doing:
$data = file_get_contents(<300 MB file>);
$trimmed = trim($data);
Versus:
$trimmed = trim(file_get_contents(<300 MB file>));
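For what it’s worth, here is a rough sketch of how I’ve been trying to measure the difference empirically (the 'big.bin' path is just a placeholder for my actual file):

```php
<?php
// Placeholder path — substitute the real 300 MB file.
$data = file_get_contents('big.bin');
echo memory_get_peak_usage(true), "\n"; // peak after loading the file

$trimmed = trim($data);
echo memory_get_peak_usage(true), "\n"; // if trim() copies, peak roughly doubles
```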
I have similar concerns about mdecrypt_generic() and other functions that could potentially process large amounts of data but don’t seem to have an explicit way to stream.
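The only workaround I can think of is reading the file in chunks myself rather than passing one giant string around — something like this sketch (again, 'big.bin' is a placeholder, and whether the per-chunk processing is valid depends on the function):

```php
<?php
// Chunked alternative to file_get_contents(): read 8 KB at a time
// so peak memory stays near the chunk size instead of the file size.
$handle = fopen('big.bin', 'rb');
while (!feof($handle)) {
    $chunk = fread($handle, 8192);
    // process $chunk here — only safe for operations that can
    // work incrementally on partial data
}
fclose($handle);
```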
Please enlighten me, and thanks in advance.