Understanding Rendering in the Jamstack


Rendering in the Jamstack? From client-side to server-side rendering, and even distributed persistent rendering - here’s what sets Jamstack apart.

By Brian Rinaldi

Most articles about the Jamstack say that what sets the Jamstack apart is the JAM: JavaScript, APIs, and Markup. The thing is though, almost every modern site on the web today uses all three of those things. Even if you are new to the Jamstack, I’m fairly confident that you are comfortable with JavaScript, APIs, and markup.

On the other hand, adopting Jamstack often requires a shift in mental models from a traditional server-rendered web app, where every user request is handled by the server, which either renders the page server-side or passes data to a SPA for rendering on the client. In either case, the developer does not have to consider when the page will be rendered. As I first touched on over a year ago in Thinking in Jamstack, this isn’t the case with Jamstack. Suddenly, you have to consider when a page – or even part of a page – should be rendered.

With the recent addition of server-side rendering and deferred rendering methods in many static site generators (SSGs), this has gotten even more complicated. This article aims to help you grasp the different options for rendering in the Jamstack and make the best choices throughout your application.

For the sake of illustration, all of the code samples in this post use Next.js, but the concepts are relevant to other frameworks, including Gatsby 4 and Nuxt 3.

Background

I got my start in web development way back in 1997 and, for the longest time, I worked as a web developer and never really thought about “rendering.” Back then, a page was either straight HTML with sprinklings of JavaScript to do trivial things (remember MM_swapImage()?), in which case the page wasn’t really rendered in any meaningful sense, or it was dynamic and rendered by the application server upon every request by every user. We call the latter SSR (server-side rendering) nowadays because everything old is new again.

The mental model of a traditional monolithic, server-rendered application circa 1997 was simple:

  1. A request comes in from a user to the web server.
  2. The web server routes this to the application server.
  3. The application server makes whatever database calls are needed to get data and assembles the page as HTML (CSS was technically a thing but no browser supported it fully until 2002).
  4. The rendered HTML is sent back by the web server in response to the request.

Static site generators (SSGs), which started gaining popularity with Jekyll in 2008, changed this up a bit. While the end result was static HTML in much the same way as some of the oldest web pages, SSGs added a dynamic build process that brought together a templating language like Liquid, markup languages like Markdown and other tools like CSS preprocessors to render pages prior to deploying them. With the addition of services like Netlify and GitHub Pages, this dynamic build process didn’t even happen on my local machine but on build servers, and the result was deployed automatically upon every change checked into the git repository.

The term Jamstack came about in 2015 in part because developers started doing some very interesting things with so-called “static sites.” They were leveraging JavaScript in the browser and APIs on the server to render aspects of the page on the client (aka client-side rendering), making static pages that behaved dynamically. The Jamstack existed like this happily for years.

And then…

Server-side rendering in the Jamstack?

Next.js changed the way Jamstack developers approach building applications. Gatsby had already made Jamstack developers comfortable using React, but Next.js changed how we approach rendering.

Next.js didn’t just support both SSR and SSG rendering; it also gave you the ability to make that choice for each route. For example, /about could be static but /cart could be server-rendered, even though they exist on the same site.

To define static routes, you’d use a combination of getStaticProps(), which defines the data for your page, and getStaticPaths(), which defines the routes that will be rendered in cases where you are using dynamic routes. This might look like the following:

import matter from 'gray-matter' // assuming the frontmatter parsing here comes from the gray-matter package

export async function getStaticProps() {
  // Load the Markdown file at build time and split it into frontmatter and body
  const content = await import(`../content/about.md`)
  const data = matter(content.default)
  return {
    props: {
      frontmatter: data.data,
      markdownBody: data.content
    }
  }
}

On the other hand, for an SSR route, you’d use getServerSideProps(), which might look like:

export async function getServerSideProps({ query }) {
  const tag = query.tag
  const response = await fetch(`https://dev.to/api/articles?tag=${tag}`)
  const data = await response.json()
  return {
    props: {
      posts: data
    }
  }
}

The serverless deployment feature in Next.js (starting with Next.js 8) and other tools like it mean that my getServerSideProps() is deployed as a serverless function.

For instance, if I were to deploy my Next.js site to Netlify, the contents of this function are automatically deployed as a Netlify Function – effectively an AWS Lambda. When a user requests this SSR route, the Lambda function will be called to get the data required to render the page. This opens up the ability to deploy a site with SSR routes to a traditionally static hosting platform.

So a site with SSR is Jamstack?

Yes. Well, maybe. The ability to have SSR routes in a Jamstack site deployed to a CDN has caused a lot of soul searching in a Jamstack community that was comfortable speaking about the benefits of static. Is a site where all the routes are SSR, but which is deployed to a CDN, still a Jamstack site? If not, how much SSR, if any, is acceptable for a site to be called Jamstack?

The truth is that, beyond an esoteric discussion about the principles of Jamstack, it doesn’t matter in a practical sense. Choose what is right for the needs of your application. We’ll talk about some guidelines that can inform your choices in a moment, but, before we get to that, Next.js had another innovation up its sleeve …

ISR, DPR, DSG, WTF!

Starting in version 9.4, Next.js added a new type of rendering they called Incremental Static Regeneration or ISR. An ISR page looks almost identical to an SSG page, including functions like getStaticPaths() and getStaticProps().

In the case of ISR, however, not every static path is prerendered. For example, I might have 1,000 products on my /store/[slug].js route, but I only render 100 of them at build time. The other 900 would get rendered the first time they are requested by a user and cached as static. Subsequent users will receive this static version of the page until the site gets rebuilt or until its revalidation time has passed, if I have specified one.

You can create an ISR route by defining a fallback property when returning the paths from getStaticPaths().

export async function getStaticPaths() {
  // only grab the first 5 to prerender
  const top = 5
  const response = await fetch(
    `https://dev.to/api/articles?username=remotesynth&page=1&per_page=${top}`
  )
  const data = await response.json()

  const paths = data.map((post) => {
    let username = post.organization
      ? post.organization.username
      : post.user.username
    let slug = `/dpr/${username}/${post.slug}`
    return slug
  })

  return {
    paths: paths,
    fallback: 'blocking'
  }
}

For an ISR route, fallback takes one of two values: blocking or true (setting it to false would make the route fully static).

  1. In the case of blocking, Next.js will render the page on the server when it is first requested by a user, and the response waits until that render completes. Subsequent calls will be returned from the static cache.
  2. With fallback set to true, however, Next.js will serve up a static page in a loading state first. Once the data is retrieved from the server, the page will be re-rendered. Subsequent calls will also be returned from the static cache.
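
To give a sense of what that loading state looks like, here’s a minimal sketch of a page component that checks Next.js’s router.isFallback flag; the component name and the post prop are placeholders for illustration:

import { useRouter } from 'next/router'

export default function Post({ post }) {
  const router = useRouter()

  // While Next.js generates the page in the background (fallback: true),
  // this placeholder is what the first visitor sees.
  if (router.isFallback) {
    return <div>Loading…</div>
  }

  return <article>{post.title}</article>
}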

ISR is primarily a solution for very large sites that allows them to dramatically reduce their build times by prerendering the critical pages at build time and the less critical (perhaps less trafficked) pages when they are first requested. In some cases, ISR can also be used to serve dynamic or user-generated content, effectively serving as a heavily-cached SSR route.
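
The revalidation time mentioned earlier is set by returning a revalidate value (in seconds) from getStaticProps(). Here’s a minimal sketch; the API endpoint, route parameter and 60-second interval are chosen purely for illustration:

export async function getStaticProps({ params }) {
  const response = await fetch(`https://dev.to/api/articles/${params.id}`)
  const post = await response.json()

  return {
    props: {
      post
    },
    // Re-render this page in the background at most once every 60 seconds,
    // triggered by the next request after the interval has passed.
    revalidate: 60
  }
}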

Distributed Persistent Rendering

Netlify had some philosophical disagreements over ISR, particularly the ability to set a revalidation time. Their concern was that this breaks their principle of immutable atomic deploys. In the case of ISR, the deploy could not only change, but could also be continuously changing. In their view, this could lead to hard-to-debug issues while also preventing you from confidently rolling back to a deploy because the state of that prior deploy might be unknowable.

Instead, they released a proposal for Distributed Persistent Rendering (DPR). The key difference between ISR and DPR from a practical standpoint was that any page that gets rendered after the initial build will become a permanent part of that build. The only way to re-render that page would be to trigger a new build.

While you can create a DPR page directly using Netlify’s On-demand Builders, a Next.js site with an ISR route that has no revalidation time will automatically be deployed as DPR. If your ISR route does have a revalidation time, Netlify just announced support for revalidate in the latest Next.js on Netlify plugin.
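
For reference, this is roughly what a standalone On-demand Builder looks like outside of the Next.js integration – a minimal sketch assuming the @netlify/functions package, with placeholder HTML standing in for whatever you would actually render:

const { builder } = require('@netlify/functions')

async function handler(event, context) {
  // Render the HTML for this path the first time it is requested.
  // Netlify caches the result and persists it for the lifetime of the deploy.
  return {
    statusCode: 200,
    headers: { 'Content-Type': 'text/html' },
    body: '<html><body><h1>Rendered on first request</h1></body></html>'
  }
}

// Wrapping the handler with builder() is what makes it an On-demand Builder
exports.handler = builder(handler)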

Deferred Static Generation

Perhaps not wanting to be left out of the acronym party, Gatsby introduced Deferred Static Generation (DSG) in Gatsby 4. While I think it’s a better name than the prior ones, it is essentially Gatsby’s implementation of Netlify’s DPR proposal.

Can we just call it deferred rendering?

Personally, I prefer to call all three types of rendering “deferred rendering” because I think it is simpler (no acronym!) and more accurately describes what’s going on. The rendering of the page is deferred until it is first requested by a user. Yes, it papers over the difference in implementation between ISR and DPR/DSG, but I believe that it conveys the broader pattern that all three share while not getting lost in the implementation details.

When to use what?

For any technology, the hardest part is not establishing simplicity, but protecting it over time.

Matt Biilmann, CEO of Netlify

Indeed, this has gotten complex and perhaps confusing, so let’s quickly review the types of rendering and the differences:

  • static rendering – rendered once during the build.
  • server-side rendering – rendered on every user request.
  • deferred rendering – rendered once upon first user request (or, in the case of ISR, re-rendered after the revalidation time has passed).

So when should you use each one? Here are my recommendations:

Begin with a “static first” philosophy when building a Jamstack site. As Jamstack developers, we believe in the speed and security benefits of prerendering assets. So, ask yourself: “Can this content be prerendered?” If it can, then it probably should be, even if it requires a bit more effort than the alternative SSR implementation.

Remember that it doesn’t have to be all or nothing: you can prerender portions of your site and not others, or even portions of a page and not others. For example, I could have a component on an otherwise static page that is populated client-side by the results of a server-side API call. Next.js has a more advanced form of this using React hooks via the SWR library.
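
Here’s a rough sketch of the simpler client-side version of that pattern using SWR; the component name and the API endpoint are placeholders for illustration:

import useSWR from 'swr'

const fetcher = (url) => fetch(url).then((res) => res.json())

export default function LatestPosts() {
  // The surrounding page can be prerendered; this component fills in
  // fresh data on the client after the static HTML has loaded.
  const { data, error } = useSWR('https://dev.to/api/articles?username=remotesynth', fetcher)

  if (error) return <div>Failed to load posts</div>
  if (!data) return <div>Loading…</div>

  return (
    <ul>
      {data.map((post) => (
        <li key={post.id}>{post.title}</li>
      ))}
    </ul>
  )
}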

Use deferred rendering when you have a lot of pages. While it can dramatically improve your build time, you probably won’t need it unless your page count reaches into the thousands at a minimum, and there is a slight penalty for the first user to request a deferred page while it renders.

You can also use deferred rendering for personalized or user-generated content in some cases. For example, Phil Hawksworth of Netlify discussed how this would work using Netlify’s On-demand Builders. In a scenario like this, it may offer performance benefits similar to straight static rendering (with the exception of the first user to hit each page), while still allowing user-generated content.

Use SSR for everything else that can’t be prerendered in some manner via SSG or deferred rendering – content that is personalized, changes frequently, or otherwise requires some kind of processing on each request. Yes, that’s a bit vague, but the idea is essentially to fall back on SSR when none of the above static strategies is workable for a particular route.

OH: “Edge Rendering” is the new hotness!

As if things weren’t complicated enough, we now have the ability to render pages at the CDN level (i.e. the “edge”). This isn’t available everywhere yet but many of the major deployment providers like Cloudflare, Netlify, and Vercel have introduced it in some form.

  • In Cloudflare, every Cloudflare Worker is deployed across its CDN network, meaning that every function you deploy to Cloudflare is basically an edge function. Cloudflare allows you to render HTML within this function – meaning you could technically serve the entire site this way – or even use their HTMLRewriter to modify an existing page at the CDN level before it is ever returned to the user.
  • Vercel recently released their Edge Functions beta which work with any site deployed to Vercel, but also integrate directly with Next.js via Next.js Middleware, which was released in Next.js 12.
  • Netlify has Edge Handlers, which are currently in a private beta.

In most cases, you will probably not be rendering the entire request at the CDN level, but rather doing things like checking for authentication and updating the UI, changing the text for A/B testing, or modifying the page response with personalization.
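
As one hedged example of the A/B testing case, here’s a minimal sketch using Next.js Middleware (Next.js 12); the cookie name and the variant routes are hypothetical:

// pages/_middleware.js – runs at the edge before the request reaches a page
import { NextResponse } from 'next/server'

export function middleware(req) {
  // Only intervene on the home page; let everything else pass through
  if (req.nextUrl.pathname !== '/') {
    return NextResponse.next()
  }

  // Route the visitor to one of two prerendered variants based on a cookie
  const bucket = req.cookies.bucket === 'b' ? 'b' : 'a'
  return NextResponse.rewrite(new URL(`/home-${bucket}`, req.url))
}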

What works where?

You may be asking, do I have to use a particular vendor to use certain types of rendering in Jamstack?

Well, kinda, though not really. Most vendors support both static rendering and SSR out-of-the-box for the major frameworks that offer them (Next.js, Nuxt.js and Gatsby), though Eleventy’s new Eleventy Serverless plugin for SSR only supports Netlify as of this writing. ISR works on both Vercel and Netlify.

There is some degree of vendor lock-in in some cases, but most of this really is just about framework support. By this, I mean that if the platform says it supports the version of whatever framework you choose, then it likely supports all the different rendering options. Here’s a quick overview of support across Netlify, Vercel, Cloudflare and Gatsby Cloud.

  • static rendering – Everywhere.
  • server-side rendering – Most platforms have support. Check support based upon your specific framework (i.e. Next.js, Nuxt 3, Gatsby 4, Eleventy Serverless).
  • deferred rendering – Netlify, Vercel, Gatsby Cloud, AWS Amplify, Layer0.
  • edge rendering – Netlify in private beta only, Vercel in beta, all Cloudflare Workers. AWS via Lambda@Edge.

It’s not as confusing as it may seem

I know I’ve thrown a lot of information at you here, so let’s do a quick review. Below is an overview of the different rendering options we discussed that are currently available in Jamstack applications, along with a rule of thumb on when to use each:

  • static rendering – Use this as your default.
  • deferred rendering – Use this when you have to render a lot of pages.
  • server-side rendering – Use this judiciously whenever the content cannot be statically rendered with one of the above.
  • edge rendering – Use this in specific scenarios when you need to modify pages that have already been rendered using one of the above methods.

While having so many different rendering options can be tough to get your head around at first, it also offers Jamstack developers a ton of power, allowing us to focus on delivering every page in the most effective – and fastest – way possible.

Written by

Brian Rinaldi, Writing Program Member

Brian Rinaldi is a Developer Experience Engineer at LaunchDarkly with over 20 years experience as a developer for the web. Brian is actively involved in the community running developer meetups via CFE.dev and Orlando Devs. He's the editor of the Jamstacked newsletter and co-author of The Jamstack Book from Manning.
