I switched from the SimpleAuth package to the Warden authentication package (http://dre1080.github.com/warden) because I needed the extra functionality for my new project. I am seeing long execution times when fetching users with Warden's user ORM model to feed the table. I am selecting around 3000 records, and execution time is very long, around 30 seconds (see the image below). This is the code I am using:
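(The original snippet is not included in this excerpt; as a purely hypothetical sketch, fetching all users through an ORM user model in FuelPHP would look roughly like this — the class name is an assumption, not taken from Warden's actual code.)

```php
<?php
// Hypothetical sketch only, NOT the original code from the post:
// fetching every user through an ORM user model (FuelPHP 1.x style).
// The class name \Warden\Model_User is assumed for illustration.
$users = \Warden\Model_User::find('all');
```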
Before, when I used SimpleAuth and my own user ORM model, selecting an even larger number of user records (ca. 7k) was a breeze.
If I get the users with DB::query() or the Query Builder instead of the ORM seen above, the execution time drops to 1s, which is acceptable. So far I suspect Warden's user ORM model, but I can't figure out what would cause such a delay... If I execute the ORM's query directly on the database it's fast, so there's no problem on the DB side. Here are the contents of Warden's user model: https://github.com/dre1080/warden/blob/master/classes/warden/model/user.php
Does anyone know why the ORM causes such a long execution time here?
You should not use the ORM for batch operations; it's simply not suited to them. The main reason for this is that the ORM caches all retrieved objects, so even if the query time is acceptable, the memory usage will not be when loading large amounts of data.
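If you do stay on the ORM for a large read, one mitigation is skipping the object cache for that query. A minimal sketch, assuming FuelPHP 1.x's ORM (the from_cache() method may not exist in older versions, and the model class name is an assumption):

```php
<?php
// Sketch: fetch through the ORM but bypass the per-object cache,
// so the 3000 model instances can be garbage-collected after use.
// Assumes FuelPHP 1.x Orm\Query::from_cache(); class name assumed.
$users = \Warden\Model_User::query()
    ->from_cache(false)
    ->get();
```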
I can't comment on the database model for Warden, as I don't use it myself. But given the size of the query in your screenshot, I assume it's a massive join between several tables, which means the ORM has to hydrate the results, creating Model instances for all the models involved in the query. On my dev system, instantiating an object takes about 20ms with PHP 5.3.x and about 12ms with PHP 5.4.x.
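To get a feel for how per-row instantiation adds up, here is a rough plain-PHP micro-benchmark (no framework involved). Note that real ORM hydration does far more work per row — property mapping, observers, relation setup — so this is only a lower bound:

```php
<?php
// Rough micro-benchmark: cost of instantiating one object per row.
// Plain PHP, no framework; treat the result as a lower bound, since
// ORM hydration does much more per object than this bare constructor.
class Row
{
    public $data;

    public function __construct(array $data)
    {
        $this->data = $data;
    }
}

$n = 3000; // roughly the size of the result set in question
$start = microtime(true);
$objects = array();
for ($i = 0; $i < $n; $i++) {
    $objects[] = new Row(array('id' => $i, 'name' => 'user' . $i));
}
$elapsed = microtime(true) - $start;
printf("%d objects in %.4f s\n", $n, $elapsed);
```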
Apart from this I wonder if it's useful to send 3000+ rows to a view...
OK, I understand that using the ORM for listing a table of records is not an ideal solution, but there are still some questions about what's happening here. You mentioned massive joins, but if you take a close look at the executed user SQL, there are no joins whatsoever. And the query only takes 0.914ms. There's got to be something else going on here.
>>Apart from this I wonder if it's useful to send 3000+ rows to a view...
Well, why not? I am using jQuery DataTables here instead of the standard pagination I used before. It actually performs really well, if I exclude the issue above. Anyway, I will use the Query Builder for getting the records and passing them to the view, but I would still like to understand what's happening inside the ORM that takes so much time to execute the entire thing.
I am also open to any performance suggestions here, but you need to know that I am using jQuery DataTables, as this is a requirement from my customer. I know I can use an Ajax source with jQuery DataTables, but I need to test how that comes together.
As said, I don't use it, so I don't know exactly what Warden does. If it's a simple query on a single table, the only reason for the time difference could be object instantiation.
As a test, you can run the same query using DB::query() twice: once with as_array() to produce the result as an array, and once with as_object() to create stdClass objects. You can also use as_object('ORM-model-name') to have the DB create ORM objects.
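Concretely, the comparison could be sketched like this — assuming FuelPHP 1.x's DB::query() API; the table name and model class name are assumptions:

```php
<?php
// Sketch of the suggested comparison: same SQL, three hydration modes,
// to isolate how much of the 30s is object instantiation.
// Assumes FuelPHP 1.x; table and model names are placeholders.
$sql = 'SELECT * FROM users';

// 1. Associative arrays: no object instantiation at all.
$start = microtime(true);
$rows = DB::query($sql)->as_assoc()->execute()->as_array();
printf("arrays:     %.3f s\n", microtime(true) - $start);

// 2. stdClass objects: one lightweight object per row.
$start = microtime(true);
$rows = DB::query($sql)->as_object()->execute()->as_array();
printf("stdClass:   %.3f s\n", microtime(true) - $start);

// 3. ORM model objects: full model construction per row.
$start = microtime(true);
$rows = DB::query($sql)->as_object('Warden\\Model_User')->execute()->as_array();
printf("ORM models: %.3f s\n", microtime(true) - $start);
```

If the third run is dramatically slower than the first two, the cost is in model instantiation rather than in the query itself.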
See if you can find a significant difference between the three.