@NogDog Sep 17, 2008 — [QUOTE]What's a good way to reduce parse time? I have A LOT of PHP echos, includes and requires[/QUOTE] Use fewer echoes, includes and requires?
Seriously, there's no way for us to magically know how to improve invisible code. The most common places I see where overall speed can be improved are the database design, database table indexing, and the SQL used to access it; but this may have nothing to do with your particular problem.
[QUOTE]Check example 2 and the comments[/QUOTE] It is important to note that this benchmarking technique [B]does not[/B] measure [I]parse time[/I]. It measures [I]execution time[/I], as indicated by the posted link (http://php.net/microtime). In other words, it measures everything that happens between the end of parsing (including "compiling") and just before the end of execution.
So, the above technique is relevant if you're trying to get some measure of your script's [I]complexity[/I] or [I]execution time[/I] with respect to a particular set of data. For example, if your script receives [I]N[/I] objects from some form and you believe it runs in [I]N^2[/I] time, being able to ignore the [I]parse time[/I] and measure only [I]execution time[/I] will allow you to verify your analysis to some extent.
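As a minimal sketch of that technique (the nested loop is just a hypothetical O(N^2) workload, not anything from the thread):

```php
<?php
// Sketch of the microtime() benchmarking technique: it measures the
// execution time of the code between the two calls. Parsing is already
// finished before the first statement runs, so parse time is excluded.
$start = microtime(true);

// Hypothetical O(N^2) workload being measured
$n = 500;
$sum = 0;
for ($i = 0; $i < $n; $i++) {
    for ($j = 0; $j < $n; $j++) {
        $sum += $i * $j;
    }
}

$elapsed = microtime(true) - $start;
printf("Executed in %.4f seconds\n", $elapsed);
```

Doubling $n should roughly quadruple $elapsed, which is one way to confirm a quadratic-time suspicion.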
However, it's also important to mention that the total request time (from the start of parsing to the end of execution) of any single request is [I]mostly[/I] irrelevant. More important is how many times the script can be executed per second. For this, it's best to run an "external" benchmark [I]locally[/I]: run a URL-loading benchmark on the same server that the script is served from.
If your script receives fairly uniform query data (or none), you can get a reasonable benchmark with the [I]ab[/I] (Apache Benchmark) command. Take a look: http://www.manpagez.com/man/8/ab/.
If you want a benchmarking tool that tests multiple URLs (multiple query strings, for example), I think you'll either have to do some Googling or write one yourself (actually, it wouldn't be too hard to do).
[QUOTE]What's a good way to reduce parse time? I have A LOT of PHP echos, includes and requires[/QUOTE] Well, parse time is mostly just a function of how long the file is ...
In regards to [I]execution time[/I], in addition to what NogDog said, use require_[I]once[/I]() and include_[I]once[/I]() when possible. And if you can "functionalize" your includes, you'll be able to include each individual file a single time. And [I]in general[/I], simple code is faster code, though this isn't [I]always[/I] true.
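Here's a small runnable illustration of why include_[I]once[/I]() helps with "functionalized" includes (the helper file name and greet() function are hypothetical, created just for the demo):

```php
<?php
// Sketch of "functionalizing" an include: the included file defines a
// function, and include_once guarantees the file is read and parsed
// only once, even if several code paths try to include it.
$helper = sys_get_temp_dir() . '/helpers_demo.php';
file_put_contents($helper, '<?php function greet($name) { return "Hello, $name"; }');

include_once $helper;  // loads and parses the file
include_once $helper;  // no-op: already included, so no fatal redeclaration error

echo greet('world'), "\n";  // prints "Hello, world"
unlink($helper);
```

With plain include, the second call would re-read the file and die on a duplicate function definition; include_once skips it entirely.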
@tfk11 Sep 18, 2008 — [QUOTE]What's a good way to reduce parse time? I have A LOT of PHP echos, includes and requires[/QUOTE]
Avoid unnecessary string concatenation with your echos. eg [code=php] echo 'text', $more_text;   // multiple arguments: each is output directly
// rather than
echo 'text' . $more_text;  // concatenation builds a combined string first [/code]
Don't bother going back and changing all your echos as it's unlikely the difference will be noticed.
As for the includes / requires... try to combine as many of these files as possible. The parse time of each file is probably quite small compared to the time needed to read each file from disk.
Using absolute file paths rather than relative paths also has some performance benefits.
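A quick sketch of the absolute-path point: building the path with a known base directory lets PHP resolve the file directly instead of searching each entry of include_path, as a bare relative name can. (The config_demo.php file and its contents are hypothetical, created only for this demo.)

```php
<?php
// Create a throwaway include file in a writable directory.
$dir = sys_get_temp_dir();
file_put_contents($dir . '/config_demo.php', '<?php return ["debug" => false];');

// A bare relative name like 'config_demo.php' would make PHP consult
// include_path; the absolute path below is resolved in one step.
$config = include $dir . '/config_demo.php';
var_dump($config['debug']);  // bool(false)

unlink($dir . '/config_demo.php');
```

In real code, __DIR__ is a common base for building such absolute paths portably.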