The Setup
A PHP Backend
In order to put a Redis layer between Frontend and Backend, we need a backend first. For the sake of simplicity, I will create a small PHP API that returns a JSON object of roughly 1 MB:
index.php

```php
<?php

require __DIR__ . '/../vendor/autoload.php';

// Build a dummy payload of roughly 1 MB.
$dummy = str_repeat(
    'Lorem ipsum dolor sit amet, consectetur adipiscing elit. ',
    (int) ((1024 * 1024) / 56)
);

app()->get('/', function () use ($dummy) {
    response()->json([
        'message' => $dummy,
    ]);
});

app()->run();
```
This is just to create an example response. The snippet above uses LeafPHP; that said, in other PHP frameworks it would look quite similar. Of course, we are talking about a headless website here, so this could be anything from a JSON API, a GraphQL endpoint or Inertia.js controllers to a minimal controller action returning an HTML fragment.
Next.js Frontend
To scaffold a minimal Next.js app, we use the create-next-app CLI tool:
npx create-next-app@latest app
We can then start it with `yarn dev` and open http://localhost:3000 in the browser.
Redis Caching Layer
Depending on your operating system and preferences, Redis can be installed in different ways (DDEV, brew, etc.); we won’t cover the installation here.
But how do we add Redis support to our Next.js app?
yarn add ioredis
# npm install ioredis
Now let’s add a redis.ts so that ioredis knows how to connect to Redis.
lib/redis.ts

```ts
import Redis from 'ioredis';

// Disables TLS certificate verification for outgoing connections – only use this in local development.
process.env.NODE_TLS_REJECT_UNAUTHORIZED = "0";

export const redis = new Redis(process.env.REDIS_URL ?? 'redis://localhost:6379');
```
Make sure to adjust the host and port to your needs. You can do that in an .env file, too.
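For example, an `.env.local` entry (the variable name matches what `redis.ts` reads) could look like this:

```
REDIS_URL=redis://127.0.0.1:6379
```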
Now let’s add a custom fetch method that makes use of our freshly added Redis cache.
lib/fetchWithRedis.ts

```ts
import { redis } from './redis';

function getEtagCacheKey(url: string): string {
  return `etag:${url}`;
}

function getDataCacheKey(url: string, etag: string | null): string {
  return etag ? `data:${url}:${etag}` : `data:${url}:no-etag`;
}

export async function fetchWithRedis(url: string): Promise<unknown> {
  const etagCacheKey = getEtagCacheKey(url);
  const etag: string | null = await redis.get(etagCacheKey);

  const headers: Record<string, string> = {};
  if (etag) {
    headers['If-None-Match'] = etag;
  }

  const res: Response = await fetch(url, {
    headers,
    cache: etag ? 'no-store' : 'force-cache' /* disable Next.js cache if we have an ETag */
  });

  if (res.status === 304) {
    /* resource unchanged, use from cache */
    const cacheKey = getDataCacheKey(url, etag); /* url and last etag as cache key */
    const cached: string | null = await redis.get(cacheKey);
    if (!cached) {
      throw new Error('304 received but no cached data found');
    }
    return JSON.parse(cached);
  }

  if (!res.ok) {
    throw new Error(`Upstream error: ${res.status}`);
  }

  /* first fetch or etag changed */
  const data: unknown = await res.json();
  const newEtag: string | null = res.headers.get('etag');

  /* store etag and data to redis */
  const cacheKey = getDataCacheKey(url, newEtag);
  await redis.set(cacheKey, JSON.stringify(data));
  if (newEtag) {
    await redis.set(etagCacheKey, newEtag);
  }

  return data;
}
```
Let’s break down the main fetchWithRedis function. What happens when we request a URL?
- We look up a previously stored ETag value in Redis under the key `etag:<url>`.
- We then fire a classic `fetch(url)`. If we found a stored ETag, we also pass an `If-None-Match` header. This tells the backend server that we already know what the response for that ETag looks like, which allows the backend to do one of two things: either answer “hey, the content hasn’t changed. Use what you cached” (HTTP status 304) or “the content has changed, I have a completely fresh response for you (with a new ETag)”. We also instruct Next.js not to cache the response in its own cache.
- If the backend gave us a status 304, we read a second cache key from Redis, `data:<url>:<etag>`, which holds our previously cached response. In this case we do an early return; there’s nothing more to do.
- If we received a fresh response (status 2xx), though, we store the new ETag and save the new response under a new cache key.
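One optional refinement, not part of the snippet above: since every new ETag produces a new `data:<url>:<etag>` key, stale entries would accumulate in Redis over time. If that bothers you, you could store the values with an expiry. A minimal sketch (the helper name `cacheResponse` and the TTL are made up for illustration):

```ts
import { redis } from './redis';

/* Sketch: store a response and its ETag with a TTL so that
   outdated data:<url>:<etag> entries eventually disappear. */
const TTL_SECONDS = 60 * 60 * 24; // one day

export async function cacheResponse(url: string, etag: string | null, data: unknown): Promise<void> {
  const dataKey = etag ? `data:${url}:${etag}` : `data:${url}:no-etag`;
  await redis.set(dataKey, JSON.stringify(data), 'EX', TTL_SECONDS);
  if (etag) {
    await redis.set(`etag:${url}`, etag, 'EX', TTL_SECONDS);
  }
}
```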
That’s basically it. You might have noticed the use of ETags here. If you’re not familiar with them, the technique is called Conditional Requests. Read more about it here:
- https://developer.mozilla.org/en-US/docs/Web/HTTP/Guides/Conditional_requests
- https://www.azion.com/en/learning/performance/what-are-conditional-requests/
- https://http.dev/conditional-requests
In a nutshell: it’s what browser caches do internally all the time in order to reduce traffic, save bandwidth and protect server resources. If an ETag and/or Last-Modified date is set on a response, browsers remember these. On subsequent requests to a known URL, browsers send If-Modified-Since and If-None-Match headers with the request. If the server responds with a 304 status, the already cached version is used.
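To make this concrete, here is a minimal sketch of a conditional GET without any Redis involved, assuming a backend that already sends an ETag header (like the one we build further below at http://localhost:5055/):

```ts
/* Minimal sketch of a conditional GET; assumes the backend responds with an ETag header. */
async function conditionalGet(url: string): Promise<void> {
  // First request: the server answers with the full body and an ETag.
  const first = await fetch(url);
  const etag = first.headers.get('etag');
  console.log(first.status, etag); // e.g. 200 and the ETag value

  // Second request: we present the known ETag via If-None-Match.
  const second = await fetch(url, {
    headers: etag ? { 'If-None-Match': etag } : {},
  });
  console.log(second.status); // 304 if the content hasn't changed
}

conditionalGet('http://localhost:5055/').catch(console.error);
```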
Usage in a Next.js page
app/page.tsx

```tsx
import { fetchWithRedis } from '@/lib/fetchWithRedis';

export default async function TestPage() {
  const data = await fetchWithRedis('http://localhost:5055/');

  return (
    <main style={{ padding: 24 }}>
      <h1>Test page Conditional GET with Redis Cache</h1>
      <pre>{JSON.stringify(data, null, 2)}</pre>
    </main>
  );
}
```
Here we simply use our new fetch method to request a backend URL and output the Response in the browser. Make sure to adjust the URL to wherever your backend (or test backend) actually responds.
Now for this to work as expected, it’s essential that our Backend Response handles ETags. Let’s modify the Route:
index.php

```php
// ...

app()->get('/', function () use ($dummy) {
    $lastModified = filemtime(__FILE__); /* or any other mechanism to determine freshness */
    $etagFile = md5_file(__FILE__);      /* or any other mechanism to generate a unique hash */

    $ifModifiedSince = isset($_SERVER['HTTP_IF_MODIFIED_SINCE'])
        ? $_SERVER['HTTP_IF_MODIFIED_SINCE']
        : false;
    $ifModifiedSinceTime = $ifModifiedSince ? @strtotime($ifModifiedSince) : false;

    $etagHeader = isset($_SERVER['HTTP_IF_NONE_MATCH'])
        ? trim($_SERVER['HTTP_IF_NONE_MATCH'], '"')
        : false;

    if (
        ($ifModifiedSinceTime !== false && $ifModifiedSinceTime >= $lastModified)
        || ($etagHeader && $etagHeader === $etagFile)
    ) {
        response()
            ->status(304)
            ->sendHeaders();
        exit;
    }

    response()
        ->withHeader('ETag', $etagFile)
        ->withHeader('Last-Modified', gmdate("D, d M Y H:i:s", $lastModified) . " GMT")
        ->withHeader('Cache-Control', 'no-cache')
        ->json(['message' => $dummy]);
});
```
So this is the counterpart to what we did earlier in the fetchWithRedis.ts file.
While servers like Traefik or Nginx can add ETags out of the box – at least for static assets such as files and images – we want to control the freshness of dynamic PHP responses as well, whether it’s a JSON API Response, a GraphQL Response, HTML Fragments or something else.
Benchmarks
So how does Redis perform in comparison to Next.js’ built-in caches? Using hey, I fired 200 requests against my Next.js endpoint with a concurrency of 2.
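The invocation looks roughly like this (assuming the test page from above runs at http://localhost:3000):

```bash
hey -n 200 -c 2 http://localhost:3000/
```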
I get the following results:
- Without Next.js caching: Total: 29.7909 secs, Requests/sec: 6.7135
- With Next.js caching: Total: 13.2674 secs, Requests/sec: 15.0745
- With Redis caching: Total: 10.3989 secs, Requests/sec: 19.2328
So in comparison to Next.js’ default caching mechanism, we get quite a substantial 27% boost in performance, which is somewhat surprising, as both caches should be in-memory.
Conclusion
I see two main benefits of using Redis to cache Backend Responses:
- Improved Performance – a 27% boost in performance is quite substantial.
- Moving the Freshness Control from the Next.js Frontend to a PHP Backend gives me more control. Especially if the API is driven by a CMS and a database, it’s actually not that complicated to determine a last modification date. But more importantly, the freshness control is now where the data is administered and edited, in the backend.
That said, this is a basic test that doesn’t address other request methods such as POST (form submissions, etc.) or deal with authentication, cookies and sessions.