The body is not stored as part of the response object. The reason is that the body can contain a very large amount of data; if it were stored on the response object, the program's memory could be consumed very quickly.
Instead, Node.js uses a mechanism called a stream. A stream holds only a small part of the data at a time and lets the programmer decide whether to fully consume the data into memory or to handle each chunk as it flows through.
There are multiple ways to fully consume the data into memory. Since an HTTP response is a readable stream, all readable stream methods are available on the res object:
Listening to the "data" event and saving the chunks passed to the callback:

```javascript
const chunks = [];

res.on("data", (chunk) => {
  chunks.push(chunk);
});

res.on("end", () => {
  const body = Buffer.concat(chunks);
});
```
With this approach you do not interfere with the behavior of the stream; you simply gather the chunks as they become available to the application.
Listening to the "readable" event and calling res.read():

```javascript
const chunks = [];

res.on("readable", () => {
  let chunk;
  while (null !== (chunk = res.read())) {
    chunks.push(chunk);
  }
});

res.on("end", () => {
  const body = Buffer.concat(chunks);
});
```
With this approach you are fully in charge of the stream's flow: until res.read() is called, no more data will be emitted by the stream.
Using an async iterator:

```javascript
// Note: for await...of must run inside an async function
// (or at the top level of an ES module).
const chunks = [];

for await (const chunk of res) {
  chunks.push(chunk);
}

const body = Buffer.concat(chunks);
```
This approach is similar to the "data" event approach; it simply tightens the scoping and allows the entire process to happen in the same scope.
While it is possible to fully consume the data from a response, as described above, it is always worth asking whether it is actually necessary. In many cases the data can simply be directed to its destination without ever being fully held in memory.
Node.js readable streams, including the HTTP response, have a built-in method for doing this, called pipe. The usage is quite simple:

```javascript
readStream.pipe(writeStream);
```
For example, if the final destination of your data is the file system, you can open a write stream to a file and pipe the data to its destination:
```javascript
const { createWriteStream } = require("fs");

const writeStream = createWriteStream("someFile");
res.pipe(writeStream);
```