
Laravel Response Caching


Published Apr 10, 2020
Laravel PHP

Today I spent some time working on caching responses with Laravel in an attempt to improve response times. Below are some findings from my experimentation.

The first thing to note is that we are caching the entire response here. This is different from caching the compiled PHP version of a Blade view.

The second thing to note is that this solution is aimed at fairly trivial use cases, for instance caching pages on a blog such as this one, where everything returns basic HTML. In these cases the number of records/pages will be very small, and we will not need any elaborate cache clearing strategy beyond the built-in cache:clear command.

The goal here is to return a cached response based on the URL alone.

So, for instance, if we visit /articles/my-awesome-article and the middleware finds a hit in the cache, it returns it right away; the request never even reaches a controller or any view logic.

Sample Code

<?php

namespace App\Http\Middleware;

use Closure;
use Illuminate\Support\Str;
use Illuminate\Support\Facades\Cache;

class ViewCache
{
    public function handle($req, Closure $next)
    {
        // Only cache GET requests, and only when view caching is enabled.
        if (config('app.view_cache') && $req->isMethod('GET')) {

            // parse the path into a key/slug.
            $path = $req->getPathInfo();
            $path = str_replace('/', '-', $path);
            $path = Str::slug($path);
            $path = empty($path) ? 'index' : $path;

            // Check for a hit
            if (Cache::has($path)) {
                $res = Cache::get($path);

                // If we found a 302 return that.
                if (substr($res, 0, 4) === '302:') {
                    return redirect()->to(substr($res, 4));
                }

                // Return the raw HTML.
                return $res;
            }

            // Handle a miss.
            else {
                $res = $next($req);
                $status = $res->status();

                // For 200 we can store the results.
                if ($status === 200) {
                    Cache::put($path, $res->getContent());
                }

                // For a 302 we store the redirect target with a prefix so
                // later requests can tell redirects apart from cached HTML.
                elseif ($status === 302) {
                    Cache::put($path, '302:' . $res->getTargetUrl());
                }

                return $res;
            }
        }

        return $next($req);
    }
}
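
For the middleware above to actually run, it has to be registered. A minimal sketch, assuming a conventional app/Http/Kernel.php (the group name and file location can vary between Laravel versions):

```php
<?php

// app/Http/Kernel.php (assumed conventional location) -- fragment only.

protected $middlewareGroups = [
    'web' => [
        // ... the framework's default web middleware ...

        // Add the response cache last so it wraps the final response.
        \App\Http\Middleware\ViewCache::class,
    ],
];
```

Alternatively it could be registered as route middleware and applied only to the handful of routes worth caching.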

How Fast Is It?

After playing around with it for a while, the ultimate conclusion was "not much faster".

I tested three configurations: no cache, the file cache driver, and the Redis cache driver.

Although caching did shave about 10ms off the total response time, I was expecting a bit more of a gain here. The majority of the time is still spent on the request round trip itself. Getting response times down to < 100ms levels will likely require some form of CDN.

Additional Notes

There are a few additional things to consider with this basic approach.

Clearing The Cache

As mentioned above, there is no elaborate cache clearing strategy here. The simplest approach is to call cache:clear after a deploy to pick up any updates.
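
In practice that can just be a line in the deploy script; a sketch, assuming a standard artisan setup:

```shell
# After deploying new code, flush the application cache so the
# next request for each page re-renders and re-caches it.
php artisan cache:clear
```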

Two other things to consider:

Incrementing Views

Although we will likely use a tracker like Google Analytics, a response cache eliminates the possibility of local tracking, since cached requests never reach application code. For instance, a simple view count for popularity rankings would stop incrementing.
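
If a rough count is enough, one workaround is to bump a counter inside the middleware's cache-hit branch before returning the cached HTML. A hypothetical sketch (the views: key prefix is my own naming, and increment() behavior on a missing key varies slightly by cache driver):

```php
<?php

use Illuminate\Support\Facades\Cache;

// Inside ViewCache::handle(), in the cache-hit branch:
if (Cache::has($path)) {
    // Track a per-page hit count alongside the cached HTML.
    Cache::increment('views:' . $path);

    return Cache::get($path);
}
```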

Other Approaches

This is, of course, a very low-level and simple approach that may not yield huge gains. Some other things to potentially look into are listed below.

The spatie/laravel-responsecache package

This package seems promising and should, for the most part, yield similar results. It offers many more options and is worth looking into.
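
Installation is a single Composer command:

```shell
composer require spatie/laravel-responsecache
```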

Web Server Caching

Caching can, of course, be done at many levels, including the web server. I won't get into it here since it's out of the scope of this article, but it is definitely something to consider looking into as well.