
FuelPHP Forums

A proposal for caching ORM queries
  • I wrote a method for the ORM Query class that I use to cache results from the database. I prefer this to the Cache::call method for two reasons: A) I don't have to specify a name for the cache entry, because the name is a hash of the actual query. I find this helpful since I make the same dynamic, complex queries from different places in the app without having to explicitly enter a name each time. B) I like method chaining way more than array querying.
    public function get_cache($expiration = false)
    {
        $result = null;

        //hash the current query to use as the cache identifier
        $md5_hash = md5($this->get_query());

        try
        {
            //see if a cached result exists
            $result = \Cache::get($md5_hash);
        }
        catch (\CacheNotFoundException $e)
        {
            //no cache found: execute the query and cache the result
            $result = $this->get();
            \Cache::set($md5_hash, $result, $expiration);
        }

        return $result;
    }
    

    Usage is as follows:

    $comments = Model_Comment::find()
                    ->related('user')
                    ->limit(30)
                    ->get_cache(60);
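    One thing the snippet doesn't cover is invalidation. A hypothetical companion method (not part of the proposal) could recompute the same hash and drop the entry when the underlying data changes:

    public function delete_cache()
    {
        //remove the cached result for this exact query, if any
        \Cache::delete(md5($this->get_query()));
    }

    Entries also expire on their own after the $expiration passed to get_cache(), so an explicit delete is only needed when stale data can't be tolerated.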
    
  • I tried this caching solution, but memory usage and total time increased for me!
  • @bperin
    I like the basic idea, but it might be better to make this a Query property and have get()/get_one() try the cache before running the query. That would give you a query like:
    $comments = Model_Comment::find()
                    ->related('user')
                    ->limit(30)
                    ->cache(60)
                    ->get();
    
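    A rough, hypothetical sketch of how that could be wired up (assuming a subclass or patched copy of the \Orm\Query class, so parent::get() still runs the real query; names are illustrative, not the actual ORM implementation):

    //illustrative only: a chainable cache() setter plus a cache check inside get()
    protected $_cache_expiration = null;

    public function cache($expiration = null)
    {
        $this->_cache_expiration = $expiration;
        return $this;   //keep the query chainable
    }

    public function get()
    {
        if ($this->_cache_expiration === null)
        {
            //caching not requested, run the query as usual
            return parent::get();
        }

        $key = md5($this->get_query());

        try
        {
            return \Cache::get($key);
        }
        catch (\CacheNotFoundException $e)
        {
            $result = parent::get();
            \Cache::set($key, $result, $this->_cache_expiration);
            return $result;
        }
    }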
    Also, if you want this: make the feature request on GitHub or (better) write it up and send a pull request. We don't accept feature requests or bug reports on the forums; that's what the issue tracker on GitHub is for.
    @hueman
    Both are easily explained: this needs to load the Cache library and needs filesystem access, both of which add a little memory and execution time. But the point of caching isn't to speed up single-user usage; it is to make sure that your app stays scalable and fast with many users accessing it at the same time. Databases scale well in the amount of data they can handle, but MySQL isn't very scalable when it comes to simultaneous access by many users. That is where caching comes in: filesystem access (or, even better, Memcached or Redis) is much faster and scales better under simultaneous access.
    Also, when dealing with large amounts of data, loading the result from cache like this may be faster than having hydration do its work after a database load.
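    For reference, pointing Fuel's Cache class at Memcached instead of the filesystem is just a config change; a minimal sketch of fuel/app/config/cache.php (server details are placeholders, and the exact keys should be checked against the Cache config docs for your version):

    <?php
    //fuel/app/config/cache.php
    return array(
        'driver'     => 'memcached',   //'file' is the default; 'redis' and 'apc' drivers also exist
        'expiration' => 3600,          //default lifetime in seconds when none is passed to Cache::set()

        'memcached'  => array(
            'cache_id' => 'fuel',      //prefix so several apps can share one memcached instance
            'servers'  => array(
                'default' => array('host' => '127.0.0.1', 'port' => 11211, 'weight' => 100),
            ),
        ),
    );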
  • Just a thought: MySQL can use query cache which would make ORM caching redundant. I am not sure this is the case here but it might explain longer loading times.
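    (A quick way to check whether the MySQL query cache is actually enabled, sketched with Fuel's plain DB class; the variable names are standard MySQL, but verify them against your server version:)

    //lists query_cache_type, query_cache_size, etc.; a size of 0 means the query cache is off
    $vars = \DB::query("SHOW VARIABLES LIKE 'query_cache%'", \DB::SELECT)->execute()->as_array();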
  • Jelmer Schreuder wrote on Wednesday 24th of August 2011:
    @hueman
    Both are easily explained: this needs to load the Cache library and needs filesystem access, both of which add a little memory and execution time. […]
    Your explanation sounds reasonable, but when I send many requests (e.g. 1000) to the page that uses this caching solution, memory usage and total time still increase.
    For me this solution increases memory usage and total time by at least 30%, whether the web server handles one request or many.
    What do you think?
  • Mainly, the idea is that memory use and response time may increase a bit but then stay almost constant as traffic grows, while without caching the MySQL response time might start slowing down immensely. I don't see how memory usage and total time could increase by 30%, unless perhaps you're testing on a clean installation with nothing else going on; caching is not that heavy, and neither are the files it creates, especially compared to any real work going on.
  • You can get a 30% increase if you add a single ORM call to a "hello world" controller and then add caching to it, which needs to load extra classes and needs some memory to process the result (Fuel reports peak memory used). The question is: how much of a real-world test is that? I agree with Jelmer; in a real-life application the increase would be well worth the performance gain. If not, there is no reason to start caching in the first place.
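    (The peak-memory figure comes from Fuel's built-in profiler; assuming the standard app/config/config.php layout, enabling it and doing a manual spot check look roughly like this:)

    //fuel/app/config/config.php - turn on the profiler bar (queries, timing, peak memory)
    'profiling' => true,

    //or a quick manual spot check anywhere in your code:
    \Log::info('peak memory: '.memory_get_peak_usage().' bytes');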
  • How do you go about profiling memory usage & speed?
  • I think the main idea with caching is consistent, predictable performance versus random performance.
    Using a cache may increase memory usage, but over 1000+ connections that usage and the rendering time will differ only marginally. Without a proper cache, however, your database can start lagging behind under a flood of requests, which drains resources, makes performance harder to predict, and makes the application unstable. Caching is not a "must", but the application will benefit from having it.
  • @jelmer
    I am not using this solution on a clean installation; I use it in a project with several database tables and sample data (over 200,000 records). I am really busy these days and cannot post my benchmark results right now, but I will do so tomorrow. I am really confused: your ideas about caching and about this problem seem sensible, but in practice something else is happening for me.
    @WanWizard
    I agree with you about trivial "Hello World!" tests; I run my tests on a bigger, deployable project.
    @bperin
    Both Fuel's profiler results and Apache's results are the same.
    @Dregond Rahl
    You are absolutely right. I have never used query caching or any caching method that caches database results; usually I cache rendered content (complete pages or fragments). When I use this snippet for caching in my project, I expect it to be efficient, at least as efficient as caching rendered content. I haven't tested this caching method with more than 1000 requests per second. Incidentally, one of my main questions is: for how many concurrent users are caching solutions (file-based or memory-based) actually useful? I will post my results here tomorrow and hope to figure out what the problem is.
  • Be careful that you don't fall into the trap of using ORM for all database interactions. ORM is great if you need object-based access to individual records, for example for CRUD operations. ORM is definitely a bad idea if you want to do a 'field = field + 1' update on 200,000 records... Use the right tool for the job. Once you're sure you've done that, and your application is finished, locate the top five performance black spots and get rid of them by optimizing the hell out of the code, caching, etc. Repeat this cycle until you're happy with the performance. If this is a paid project, you also have to look at the cost/benefit ratio: you can get yourself a lot of server power these days for the fee of one week of development. Don't over-optimize.
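    To make that concrete, a rough sketch of the difference (model, table and column names are made up for illustration): incrementing a counter on 200,000 rows through the ORM means hydrating and saving 200,000 objects, while the query builder does it in a single statement.

    //ORM: loads every row as an object, then saves each one - slow and memory hungry
    foreach (Model_Article::find('all') as $article)
    {
        $article->hits += 1;
        $article->save();
    }

    //Query builder: one UPDATE statement, no hydration at all
    \DB::update('articles')
        ->set(array('hits' => \DB::expr('hits + 1')))
        ->execute();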
  • Andrew Black wrote on Wednesday 7th of September 2011:
    Just a thought: MySQL can use query cache which would make ORM caching redundant. I am not sure this is the case here but it might explain longer loading times.

    The query cache helps, but this is more of a results cache: a query isn't run again until the cached results expire or are invalidated.
  • I am not sure I made myself clear... What I meant is that the MySQL query cache caches query results too. So if you have it enabled you have two competing cache mechanisms, and depending on the memory available you could get slower performance if you use both of them. Hueman's approach seems correct to me: test the cache in a real application to see if it helps.
