Why does fetch make you wait twice?

The two awaits


const response = await fetch('/api/pokemon');
const data = await response.json();

Why the second one?
We already have our response

Right?

Is parsing JSON just that slow?

const data = await response.json();
Nope. JSON.parse does ~170 MB/s.
Parsing isn't the bottleneck.

const data = JSON.parse(myJsonString);

Source: GoogleChromeLabs/json-parse-benchmark
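You can sanity-check that claim yourself with a rough micro-benchmark (a sketch; the exact throughput will vary by machine and engine):

```javascript
// Build a multi-megabyte JSON string, then time JSON.parse on it.
// This only shows the order of magnitude, not a precise number.
const items = Array.from({ length: 20000 }, (_, i) => ({
  id: i,
  name: `pokemon-${i}`,
  type: ["Grass", "Poison"],
}));
const myJsonString = JSON.stringify(items);

const start = performance.now();
const parsed = JSON.parse(myJsonString);
const elapsedMs = performance.now() - start;

console.log(`parsed ${(myJsonString.length / 1e6).toFixed(1)} MB in ${elapsedMs.toFixed(1)} ms`);
```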

We also have .text() 🏃 and .blob() 🚀

const text = await response.text();
const blob = await response.blob();
Ok, so what does fetch actually return?
The ✨Response✨ Object
We have methods to read the body:

const response = await fetch("/api/pokemon");

// (pick one per response — the body can only be read once)
const text = await response.text();
const json = await response.json();
const blob = await response.blob();
But what if we just log it?

const response = await fetch("/api/pokemon");
console.log(response);

ok: true
status: 200
statusText: ""
headers: Headers {}
url: "/api/pokemon"
type: "basic"
redirected: false
body: ReadableStream
bodyUsed: false
What do we actually send back and forth?

HTTP Request


const response = await fetch("/api/pokemon");

GET /api/pokemon HTTP/1.1
Host: fetch-server.fly.dev
Accept: application/json

HTTP Response


HTTP/1.1 200 OK
Content-Type: application/json
Content-Length: 214016

[{
  "id": 1,
  "name": {
    "english": "Bulbasaur",
    "japanese": "フシギダネ"
  },
  "type": ["Grass", "Poison"],
...
... much more data ...

The important bit!

fetch returns as soon as headers have arrived

The body could still be on its way

This is gonna take a while

Bytes still have to travel
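You can see this headers-first behaviour without a slow network by wrapping a trickling `ReadableStream` in a `Response` (a sketch; real `fetch` behaves the same way):

```javascript
// The Response (status, headers) exists immediately;
// awaiting the body is what actually waits for the bytes.
const encoder = new TextEncoder();
const body = new ReadableStream({
  async start(controller) {
    controller.enqueue(encoder.encode("hello "));
    await new Promise((resolve) => setTimeout(resolve, 100)); // body still in flight
    controller.enqueue(encoder.encode("world"));
    controller.close();
  },
});

const response = new Response(body);
console.log(response.status); // available right away: 200
const text = await response.text(); // this await waits for the whole body
console.log(text); // "hello world"
```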

Back to the response


ok: true
status: 200
statusText: ""
headers: Headers {}
url: "/api/pokemon"
type: "basic"
redirected: false
body: ReadableStream
bodyUsed: false

What is a stream?

Credit: juancajuarez - stock.adobe.com

A continuous stream of bytes


Credit: Komarov Andrey - Fotolia

Buffer up some data, then handle it


Simulated Slow Connection


import express from "express";
import path from "node:path";
import fs from "node:fs";
import { fileURLToPath } from "node:url";

const app = express();
const port = 3000;

// __dirname doesn't exist in ES modules; derive it from import.meta.url
const __dirname = path.dirname(fileURLToPath(import.meta.url));

app.get("/api/pokemon", (req, res) => {
  res.status(200);
  res.setHeader("Access-Control-Allow-Origin", "*");
  res.setHeader("Content-Type", "application/json");

  // 209 KB
  const filePath = path.join(__dirname, "pokemon.json");
  const stream = fs.createReadStream(filePath, { encoding: "utf8" });

  stream.on("readable", () => {
    const intervalId = setInterval(() => {
      const chunk = stream.read(1);
      if (chunk !== null) {
        res.write(chunk);
      } else {
        clearInterval(intervalId);
        res.end();
      }
    }, 5); // 1 byte every 5 ms = 0.2 KB/s
  });
});

app.listen(port);

Is this too slow?

Dial-up in the 1970s was 50% faster than this (see: Dial-Up Modems)
However, this is a small file at 209KB
pokemon.json
And 1080p video today is
$3.5\text{x}$ this file every second

Fetching The Whole Thing


const response = await fetch("/api/pokemon");
const data = await response.json();


How long will this take?

$$ \frac{\text{filesize}}{\text{speed}} = \text{time} $$
$$ \frac{209\text{ KB}}{0.2\text{ KB/s}} = 1045\text{ s} $$

That's $17.5$ minutes!

Is there a better way?

Of course!

🌊 Streaming 🌊

Streaming

response.body is a ReadableStream

This means we can consume the data as it reaches us


const response = await fetch("/api/pokemon");
const decoder = new TextDecoder("utf-8");

let data = "";
for await (const chunk of response.body) {
  // stream: true keeps multi-byte characters intact across chunk boundaries
  data += decoder.decode(chunk, { stream: true });
}
data += decoder.decode(); // flush any buffered trailing bytes


Both methods will finish after 17-ish minutes

But the user is getting content for one of them!

streaming is awesome

Actually pretty clever

  • We have the headers, so we can act on them right away
  • The body isn't always finished (think Server-Sent Events)
  • We can show something before everything is downloaded

Should we stream everything?

Of course not!

The regular way is fine for most things


Consider streaming if you have:

  • Extra Large Payloads
  • Extra Small Speeds

…or paginate, fetch a few things at a time
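The pagination idea, sketched as an async generator. Note the `?page=`/`?limit=` query parameters are an assumption here, not something the demo server above implements:

```javascript
// Hypothetical paginated API: yields one page of results at a time.
// Assumes the server accepts ?page= and ?limit= and returns a JSON array.
async function* fetchPages(baseUrl, limit = 50) {
  for (let page = 1; ; page++) {
    const response = await fetch(`${baseUrl}?page=${page}&limit=${limit}`);
    if (!response.ok) throw new Error(`HTTP ${response.status}`);
    const items = await response.json();
    if (items.length === 0) return;
    yield items;
    if (items.length < limit) return; // short page: nothing left
  }
}

// Usage: render each page as soon as it lands.
// for await (const pokemon of fetchPages("/api/pokemon")) render(pokemon);
```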

And that's why you have to wait twice

Thank you!

My name is Truls

Slides on github:
github.com/trulshj/presentation