chore(content): improved slides for suspense ssr lesson
ywarezk committed Jan 21, 2024
1 parent 491ab20 commit 4137b96
Showing 5 changed files with 90 additions and 3 deletions.
81 changes: 80 additions & 1 deletion content/en/course/react/suspense/nextjs-streaming/index.mdx
@@ -4,6 +4,23 @@
description: How to use React Suspense to stream data in NextJS
publishDate: "2024-01-14"
---

In this lesson we will learn how the backend server deals with `<Suspense>` and how `<Suspense>` affects Server-Side Rendering (SSR).

## Server-Side-Rendering (SSR)

In server-side rendering, a backend Node.js server evaluates our React components to create the initial HTML page. This HTML is sent to the browser, which renders it.
The generated HTML already contains the data fetched from the API, so the user sees actual content instead of a loading screen. The site loads fast even on a slow internet connection, because the data fetching happens on the server, where it is fast.
We can generate the HTML at build time - this is called Static Site Generation (SSG) - or at runtime - this is called Server-Side Rendering (SSR). SSG fits specific sites whose HTML depends only on the build and not on the user entering the site. We will not touch SSG and will focus on SSR, where the backend generates the HTML on every request.

With SSR, when the server receives a request it evaluates the React components, grabs the data from the API, generates the HTML, and sends it to the browser. The client renders the HTML, downloads the JS files, and then hydrates the HTML. Hydration means the client attaches event listeners to the HTML elements, making the page interactive. After hydration the site behaves like a regular React app, with the logic arriving from the JS files.
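The flow above can be sketched without React at all: the server fetches the data, interpolates it into a complete HTML document, and sends the whole page in one response. This is a minimal model, not Next.js internals, and the function names are made up for illustration:

```javascript
// stands in for a real API call made on the server
async function fetchUser() {
  return {name: 'Ada'};
}

// turn the fetched data into the initial HTML the browser will render;
// the user sees real content, not a loading screen
function renderPage(user) {
  return `<html><body><h1>Hello ${user.name}</h1></body></html>`;
}

// per-request handler: fetch, render, respond with one complete document
async function handleRequest() {
  const user = await fetchUser();
  return renderPage(user);
}
```

The JS bundle then hydrates this markup on the client to make it interactive.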

![SSR](https://github.com/ywarezk/academeez/blob/main/content/en/course/react/suspense/nextjs-streaming/ssr.png?raw=true)

A few definitions we should be acquainted with:
- TTFB - Time To First Byte - the time until the browser receives the first byte of the server's response.
- FCP - First Contentful Paint - the time until the browser renders the first piece of content.
- TTI - Time To Interactive - the time until hydration finishes and the site is interactive.
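To make the relationship between the three metrics concrete, here is a toy timeline with made-up numbers; for an SSR page the first byte always precedes the first paint, which precedes interactivity:

```javascript
// Hypothetical page load, in milliseconds since navigation start
// (the numbers are illustrative, not measurements).
const timeline = {
  navigationStart: 0,
  firstByteReceived: 800,  // TTFB: first byte of HTML arrives
  firstPaint: 1200,        // FCP: first content rendered
  hydrationDone: 2500,     // TTI: page is interactive
};

const ttfb = timeline.firstByteReceived - timeline.navigationStart;
const fcp = timeline.firstPaint - timeline.navigationStart;
const tti = timeline.hydrationDone - timeline.navigationStart;

// the three always come in this order for an SSR page
console.log(ttfb <= fcp && fcp <= tti); // true
```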

## SSR and TTFB problem

Let's demonstrate a problem SSR had in the past. The easiest way to create an SSR React website is with [NextJS](https://nextjs.org/), a very popular and easy-to-use framework that gives you SSR out of the box. Let's create a simple NextJS app:
```bash
npx create-next-app@latest
```

You will be asked a few questions:

![create-next-app](https://github.com/ywarezk/academeez/blob/main/content/en/course/react/suspense/nextjs-streaming/questions.png?raw=true)

Open the created project in your IDE.
You can run the project using:

```bash
npm run dev
```

Let's imagine that our homepage (file: `app/page.js`) makes 2 server requests (we will mimic them using timers).

```js title="app/page.js"
'use client';

import {use} from 'react';

// generates a promise that resolves after the given time
function createTimerPromise(time, message) {
  return new Promise(resolve => {
    setTimeout(() => {
      resolve(message);
    }, time);
  });
}

// promise that resolves after 2 seconds
let shortQuery = createTimerPromise(2000, 'short query');

// promise that resolves after 5 seconds
let longQuery = createTimerPromise(5000, 'long query');

export default function Home() {
  const short = use(shortQuery);
  const long = use(longQuery);

  return (
    <>
      <div>{short}</div>
      <div>{long}</div>
    </>
  );
}
```

If you look at the network tab you will see that the page arrives only after 5 seconds. This waiting time, during which the server gathers the data and the user sees nothing, is the TTFB (Time To First Byte). A long TTFB is a problem because the user stares at a blank screen for a long time.
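We can model why the TTFB equals the slowest query: without streaming, the server cannot send a single byte before every promise the page `use()`s has resolved, even though the queries run in parallel. A simplified sketch with the numbers from the example:

```javascript
// Durations (ms) of the two server-side queries from the example above.
const queryDurations = [2000, 5000]; // shortQuery, longQuery

// blocking SSR: the first byte waits for the slowest query
const blockingTTFB = Math.max(...queryDurations);

// streaming SSR (next section): the shell with fallbacks is sent
// immediately, so the first byte does not wait on any query
const streamingTTFB = 0; // plus some network and render overhead

console.log(blockingTTFB); // 5000
```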

## Streaming HTML

To solve the TTFB problem we can stream the HTML. Streaming means the server sends the HTML in chunks (often referred to as "frames"), and the browser progressively renders each chunk as it arrives instead of waiting for the entire document.
The user sees content sooner: part of the content is better than nothing, and some of it arrives really fast. The server-side API calls are also faster than calls the client would make itself, so overall everything loads faster than if the client made the requests.
It is also less demanding on the server, which does not have to generate the entire HTML before sending anything to the client.
Streaming solves our TTFB problem because we can now render part of the HTML as soon as the **short query** arrives.
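A rough sketch of what a streaming server does, using a mock `write` callback instead of a real HTTP response object (illustrative only - Next.js handles all of this for us):

```javascript
// Streams a page in chunks: the shell goes out immediately, and later
// chunks deliver each query's content as soon as it resolves.
function streamPage(write, shortQuery, longQuery) {
  // chunk 1: the shell with fallbacks, sent before any query resolves
  write('<div id="short">loading...</div><div id="long">loading...</div>');

  // later chunks: sent in resolution order, fastest query first
  return Promise.all([
    shortQuery.then(text => write(`<template data-for="short">${text}</template>`)),
    longQuery.then(text => write(`<template data-for="long">${text}</template>`)),
  ]);
}
```

A small inline script in the real protocol then moves each `<template>` chunk into place, replacing its fallback.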

## Suspense and Streaming

The backend treats `<Suspense>` as a streaming frame: it evaluates the promises under the `<Suspense>`, and if at least one is pending it streams the fallback; once all the promises resolve it streams the children; and if at least one promise is rejected it streams the content of the error boundary (or an uncaught exception if there is no error boundary).
Now that we know that `<Suspense>` is a streaming frame, we can wrap our calls in `<Suspense>` to keep sending HTML content after the first chunk.
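One possible refactor of the earlier `app/page.js` along these lines (a sketch - the exact code lives in the linked repo; it assumes the same `createTimerPromise`, `shortQuery`, and `longQuery` as before): each query moves into its own component, and each component gets its own `<Suspense>` streaming frame with its own fallback:

```js title="app/page.js"
'use client';

import {Suspense, use} from 'react';

function Short() {
  return <div>{use(shortQuery)}</div>;
}

function Long() {
  return <div>{use(longQuery)}</div>;
}

export default function Home() {
  return (
    <>
      {/* streams after ~2 seconds, without waiting for the long query */}
      <Suspense fallback={<div>loading short...</div>}>
        <Short />
      </Suspense>
      {/* streams after ~5 seconds */}
      <Suspense fallback={<div>loading long...</div>}>
        <Long />
      </Suspense>
    </>
  );
}
```

The fallbacks go out in the first chunk, so the TTFB no longer depends on the queries at all.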

The source code is located here: [source code](https://github.com/ywarezk/demo-streaming-suspense)

8 changes: 7 additions & 1 deletion content/en/course/react/toc.ts
@@ -28,7 +28,13 @@ const toc: NavItem = {
{
  href: '/en/course/react/suspense',
  title: 'Suspense',
  items: [
    {
      href: '/en/course/react/suspense/nextjs-streaming',
      title: 'Next.js Streaming',
      items: [],
    },
  ],
},
{
  href: '/en/course/react/trpc',
4 changes: 3 additions & 1 deletion content/en/course/react/trpc/index.mdx
@@ -480,4 +480,6 @@ export default function Page() {
## Summary

I've been using trpc for a while now, and I have to say that it's my favorite way to communicate between my frontend and backend. On my large projects with large teams, that extra static typing in the frontend-backend communication really reduces the number of bugs. Before trpc, the communication layer between backend and frontend was always prone to bugs: the client might expect a certain response and get something different, while the server expects a certain input and gets something else. The number of those bugs is drastically reduced with trpc, now that TypeScript makes it clear and enforced that the backend-frontend communication is done properly.
There is a caveat: it works better if the frontend and backend projects sit together in a monorepo. The alternative would require the server project to publish the app router type (with `npm publish`, for example), which would make things a bit more clumsy in my opinion. But if you are working in a monorepo, this static layer on your back-front communication will do wonders for your project.

The full source code is available [here](https://github.com/ywarezk/lecture-trpc)
