
FuelPHP Forums

Full Page Caching
  • I was just wondering how people around here perform full page caching especially with respect to user-specific content like a user's blog posts or comments or photo gallery.

    Is it a good approach to save the file under a name generated by a hash, e.g. an md5 of the requested URI, the current user's ID, and perhaps further request data (such as \Input::param())? On subsequent requests we would then look up the file named by the same combination of data, probably in \Request::execute() (that seems like the best place), and send the cached content instead of executing the controller's method again.
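    A minimal sketch of the key-generation idea described above, in plain PHP. The helper name `page_cache_key` is hypothetical and not part of FuelPHP; the point is only that the URI, user ID, and (sorted) request params deterministically produce the same key on subsequent requests.

    ```php
    <?php
    // Hypothetical helper: derive a cache key from the request URI,
    // the current user's ID, and the request parameters.
    function page_cache_key(string $uri, int $user_id, array $params): string
    {
        // Sort params so that ?a=1&b=2 and ?b=2&a=1 map to the same key.
        ksort($params);
        return md5($uri . '|' . $user_id . '|' . http_build_query($params));
    }

    // Same inputs always yield the same key; a different user gets a different one.
    $key = page_cache_key('/blog/posts', 42, ['page' => 2, 'sort' => 'date']);
    ```

    In FuelPHP the inputs would come from \Input::uri(), the Auth driver, and \Input::param(); the lookup-and-serve step would then check for a file under this key before dispatching the controller.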
  • Those are mutually exclusive.

    Either you cache the full page (and return it on a hit without running any code), or you have dynamic data. You can't have both.

    We use a system where widgets are generated by HMVC calls to module controllers, and each controller deals with caching internally. It knows best what is cacheable and what isn't, and how to cache it. For the application and the theming system, caching is completely transparent.

    Obviously this means that quite a bit of code still runs compared to full page caching (like WordPress's Super Cache plugin), but it allows for very granular control.
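    The per-widget approach described above might look roughly like this. This is a hedged sketch, not code from the thread: the class name, cache key, and view name are invented, while \Cache::get(), \Cache::set(), and the CacheNotFoundException miss behaviour are FuelPHP's actual caching API.

    ```php
    <?php
    // Sketch: a module widget controller that caches its own rendered output.
    // Callers invoking it via HMVC (\Request::forge()->execute()) never need
    // to know whether the result came from the cache or a fresh render.
    class Controller_Widget_Menu extends \Controller
    {
        public function action_show()
        {
            // Per-user key, since the menu can differ per user (illustrative).
            $key = 'widget.menu.user_' . (int) \Session::get('user_id', 0);

            try {
                $html = \Cache::get($key);            // hit: skip rendering
            } catch (\CacheNotFoundException $e) {
                $html = \View::forge('widget/menu')->render();
                \Cache::set($key, $html, 300);        // miss: cache for 5 minutes
            }

            return \Response::forge($html);
        }
    }
    ```

    Because each controller owns its key scheme and expiry, a highly dynamic widget can cache for seconds (or not at all) while a near-static one caches for hours, which is the granular control mentioned above.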

  • Thanks for your response, @Harro Verton.

    I've written my own widget system, which currently only handles the menu(s) and dashboard widgets. The only thing cached so far is which modules provide a menu section or a dashboard widget. I hadn't yet considered implementing caching on the server side, i.e. per controller action; I had only thought of full-page caching, or perhaps caching query results or ORM models.
    The latter isn't implemented using \Cache but internally inside \Orm\Query, and it doesn't cache between requests, only within a single request, correct?
    To start with your last remark: that is correct. And Orm caching is not designed for performance, but for consistency. It ensures that when you query the same record multiple times, you get the same object returned, so a record cannot exist in multiple states.

    In general, if you want to start optimizing, apply the 80-20 rule: which pieces of code use up the bulk of your web application's response time? Use the built-in profiler, or Cachegrind, to find out. Usually this will be your (database) I/O, so it makes sense to optimize that first, and not to bother with global caching if it only shaves off a few milliseconds.
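    The Orm consistency behaviour described above can be demonstrated in a few lines. This is an illustrative sketch (the model name Model_Post is hypothetical), showing that within one request the same primary key yields the same object instance rather than a copy:

    ```php
    <?php
    // Within a single request, FuelPHP's Orm returns the identical object
    // for repeated lookups of the same record.
    $a = Model_Post::find(1);
    $b = Model_Post::find(1);

    var_dump($a === $b);    // true: one record, one object, one state

    // This cache does not survive the request; a second HTTP request
    // queries the database again.
    ```

    So changing a property on $a is immediately visible through $b, which is exactly the "no record in multiple states" guarantee, and also why it is no substitute for cross-request caching via \Cache.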
