NOTE: I know I can call View::make('hello')->render() outside the loop and run it only once. This is a toy example, but when I use libraries like https://github.com/sleeping-owl/admin, a lot of views are sometimes rendered, so it would be interesting to know of any alternative approach.
How to improve performance when calling same view multiple times
Imagine a code like:
Route::get('/testperformance', function () {
    $outputString = "";
    for ($i = 1; $i <= 10000; $i++) {
        $outputString .= View::make('hello')->render();
    }
    echo $outputString;
});
You can see that performance degrades linearly as you increase the loop size (for example, using 20000 iterations takes twice as long to load http://myapp.app/testperformance).
The question is: is there any way to improve performance in this case?
I think the problem is the time spent on disk I/O reading the view file. Given that it is always the same file, couldn't it be cached in some way?
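One way to avoid repeated disk reads, sketched in plain PHP (the helper name and static-cache approach are my own illustration, not Laravel's API), is to memoize the file contents the first time they are read:

```php
<?php
// Hypothetical helper: memoizes file contents in a static array so the
// file is only read from disk once per request, no matter how many
// times it is requested afterwards.
function cachedFileContents(string $path): string
{
    static $cache = [];
    if (!array_key_exists($path, $cache)) {
        $cache[$path] = file_get_contents($path);
    }
    return $cache[$path];
}

// Usage: the second call hits the in-memory cache, not the disk.
$tmp = tempnam(sys_get_temp_dir(), 'view');
file_put_contents($tmp, 'hello');
$a = cachedFileContents($tmp);
$b = cachedFileContents($tmp);
unlink($tmp);
```

Note that Laravel already compiles Blade templates to plain PHP and caches the compiled file, but the compiled file is still included on every render; this sketch only illustrates the general memoization idea.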
I haven't tested it personally, but most posted results show a for loop taking about twice as long to execute as a foreach.
for is quicker than foreach, but we are talking about ten-thousandths of a second of difference over the whole 10,000 iterations, not per iteration.
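To put rough numbers on this, here is a minimal micro-benchmark sketch (plain PHP, variable names are mine); the absolute timings vary by PHP version and machine, the point is only that the gap between the two loop styles is tiny compared to thousands of view renders:

```php
<?php
// Micro-benchmark sketch: time a for loop vs a foreach loop doing the
// same work over the same array.
$items = range(1, 10000);

$start = microtime(true);
$sumFor = 0;
$count = count($items);
for ($i = 0; $i < $count; $i++) {
    $sumFor += $items[$i];
}
$forTime = microtime(true) - $start;

$start = microtime(true);
$sumForeach = 0;
foreach ($items as $item) {
    $sumForeach += $item;
}
$foreachTime = microtime(true) - $start;

// Both loops compute the same result; only the timings differ.
printf("for: %.6fs, foreach: %.6fs\n", $forTime, $foreachTime);
```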
In this case it's fetching view files an enormous number of times. Most apps load ~1-5 views per page; here it seems to be looping once per row, turning that into thousands.
@acacha I'd expect it to be linear.
Forgetting Laravel for a moment: if you did this, I'd expect the 10,000 calls alone to take about 10 seconds.
$start = microtime(true);
for ($i = 1; $i <= 10000; $i++) {
    file_get_contents('your-file');
}
echo microtime(true) - $start;
(View::make is actually calling file_get_contents behind the scenes.)
So the real issue is why the same views are being rendered over and over. That seems to be a bug in Sleeping Owl to me.
Whilst caching will help, that's papering over the actual issue. The application load should be in milliseconds; if a page takes more than one second to load, it's a sure sign something is wrong.
As you say, if you move the view load outside the loop, that 10 seconds for 10,000 reads turns into 0.001 seconds for 1.
$start = microtime(true);
$file = file_get_contents('your-file');
for ($i = 1; $i <= 10000; $i++) {
    echo $file;
}
echo microtime(true) - $start;
