Since I published my 2019 in Review, detailing the industry trend towards static hosting and the rise of Jamstack CDNs like Vercel, a question has persistently come up: how is this different from a server and a traditional CDN on top? Why not then just use servers with containers (even serverless ones)?

The answer boils down to our ability to not just host static assets and cache them, but also hoist them.

The word hoisting is used quite frequently in JavaScript to describe how the compiler extracts and “moves” declarations to the “top” of a scope, which makes the following code actually work:

function a () {
  return b()  // this works with no errors!
  function b() {
    return "hello world"
  }
}
Calling b() before it was defined works because the declaration was hoisted

The term is also used in compiler optimization lingo to describe a class of optimization where the code is analyzed, and parts that look “static” or invariant are moved (hoisted) outside of loops.

function a (b, c) {
  let sum = []
  for (let i = 0; i < 100000; i++) {
    sum[i] = i + (b + c) // hoist ↗
  }
}
a(314, 159)

Notice that the sum b + c has nothing to do with the context of the loop: it’s loop-invariant. An optimizing compiler can recognize this and hoist it automatically, so that the computation behaves as if you had written it outside the loop yourself.
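For illustration, here is what that optimization effectively produces once the invariant expression is moved out of the loop (a hand-written sketch, with a return added so the result is observable):

```javascript
function a (b, c) {
  let sum = []
  const invariant = b + c // hoisted: computed once, not 100,000 times
  for (let i = 0; i < 100000; i++) {
    sum[i] = i + invariant
  }
  return sum
}

a(314, 159) // each element is i + 473
```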

Hoisting computation within a program is great, and odds are the compilers and VMs you use every day have plenty of optimizations like it.

What Jamstack as a software architecture has now made possible, however, is hoisting the results of computation to the edge, right next to where your visitors are.

A core tenet of Jamstack has been to pre-render (pre-compute) as much as possible, which has given prominence to static site generation. The key idea is that computation that would have happened later on, in the request’s timeline, has now been shifted to the build phase, performed once and made available for all users to share.
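As a minimal sketch of that shift (the names here are illustrative, not Next.js APIs): rendering work that a server would otherwise repeat on every request runs once at build time, producing static pages that every edge location can share.

```javascript
// Per-request rendering: this work would run on every single request.
function renderPost (post) {
  return `<article><h1>${post.title}</h1></article>`
}

// Build-time rendering: the same work runs once, during the build,
// and the resulting pages are shared by all users via the edge network.
function buildSite (posts) {
  const pages = {}
  for (const post of posts) {
    pages[`/blog/${post.slug}`] = renderPost(post)
  }
  return pages
}

const pages = buildSite([
  { slug: 'hello-world', title: 'Hello World' }
])
// pages['/blog/hello-world'] === '<article><h1>Hello World</h1></article>'
```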

[Figure: a user request served from an edge server, comparing Static Hoisting (Jamstack, AOT) against proxying to a server (Legacy CDN, JIT) on performance, availability, and cost.]

Jamstack (AOT) — computation done ahead-of-time and always shared by all edges:

  • ✓ Optimal performance
  • ✓ Faster cache misses
  • ✓ Always online
  • ✓ Automatic global failover
  • ✓ Optimally inexpensive
  • ✓ Zero maintenance overhead

Legacy CDN (JIT) — proxy to a server¹, computation done just-in-time and partially² shared upon cache HIT:

  • ⨯ Slower cache misses
  • ⨯ Impacted by cold boots³
  • ⨯ Dependent on DevOps / SRE
  • ⨯ Expensive HA (multi-AZ)³
  • ⨯ Servers constantly running³⁴
  • ⨯ DevOps / Monitoring / SRE

¹ The downsides of this approach apply equally to server-rendering and operating your own static file server
² Cache hits will be rarer for less-trafficked pages or sites, and will be highly region-dependent
³ Cold boots can be attenuated by Lambda Provisioning, which drives costs up quite significantly
⁴ Functions and serverless containers provide natural multi-AZ, but are subject to higher costs due to [3]

Since Next.js 9, the next build process has automatically output the optimal asset on a per-page (entrypoint) basis. Further, with Next.js 9.3 the hooks for static-site generation were refined, and incremental static generation was introduced for appending (and soon updating) pages.

Next.js makes hoisting the static parts of your site or app to the edge a breeze. Let’s look at the build output of a complex site as an example:

$ next build
  Page                                                           Size     First Load JS
  ┌ ○ /                                                          215 B           191 kB
  ├ ○ /about                                                     4.53 kB         171 kB
  ├ λ /api/sso
  ├ ○ /dashboard                                                 228 B           192 kB
  ├ ○ /bitbucket                                                 11 kB           164 kB
  ├ ● /blog                                                      54.6 kB         224 kB
  ├   └ css/a73431369cd0a9ce449f.css                             960 B
  ├ ● /blog/[post]                                               3.54 kB         206 kB
  ├   ├ /blog/environment-variables-ui
  ├   ├ /blog/simpler-pricing
  ├   ├ /blog/canceling-ongoing-deployments
  └   └ [+95 more paths]
λ  (Lambda)  server-side renders at runtime (uses getInitialProps or getServerSideProps)
○  (Static)  automatically rendered as static HTML (uses no initial props)
●  (SSG)     automatically generated as static HTML + JSON (uses getStaticProps)

The outputs of the Next.js build process vary by the data fetching strategy used by the developer

As you can see from the symbols (○ ● λ) that represent the output types:

  • We build the homepage statically. Our company’s homepage, our “cover letter”, gets built as index.html and pushed to the edge for maximum speed and reliability.
  • The /api functions are exported as serverless (lambda) functions. These are created by placing files inside ./pages/api (more).
  • At build time, we create blog posts by querying an API using Static Site Generation hooks. Blog posts, like our homepage, are thus statically hoisted (optimized) to the edge network.
  • Our dashboard, despite being super dynamic in nature, is a static HTML page that queries data securely using React Hooks from the client side.
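That last point can be sketched as follows: the HTML shell is static and served from the edge, while the user-specific data is fetched from the browser after load. In the real dashboard this would live inside a React Hook; the /api/user endpoint and the response fields here are hypothetical.

```javascript
// Runs in the browser after the static shell loads. `fetchImpl` is
// injectable so this sketch can be exercised without a network.
async function loadDashboard (fetchImpl = fetch) {
  const res = await fetchImpl('/api/user') // hypothetical endpoint
  const user = await res.json()
  return `Welcome back, ${user.name}`
}
```

The static page is identical for every visitor (so it can be hoisted to the edge), and only the data request is personalized.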

CDNs are great and have been around for a long time, and so have static hosts. However, for the longest time, CDNs have been treating the “origin” as an opaque black box.

It’s now possible, instead, to push content directly to the network and design frameworks that optimize for this capability. As a result, with optimizations like static asset hoisting, websites are now becoming faster and more reliable than ever before.

To start developing a Jamstack site or app, check out Next.js, which you can deploy to the Vercel edge network with a couple of clicks.
