Streamlining File Operations in Node.js with Express and Fastify
Takashi Yamamoto
Infrastructure Engineer · Leapcell

Introduction
In modern web applications, handling file uploads and downloads is a common requirement. Whether it's users uploading profile pictures, documents, or administrators distributing large datasets, the efficiency of these operations significantly impacts user experience and server performance. Traditional approaches often involve loading entire files into memory before processing, which can be inefficient and even lead to out-of-memory errors when dealing with large files. This is where the power of Node.js Streams comes into play. By leveraging streams, we can process files chunk by chunk, drastically reducing memory footprint and improving responsiveness. This article delves into how to effectively utilize streams within popular Node.js frameworks like Express and Fastify for robust and scalable file handling.
Core Concepts
Before we dive into the practical implementations, let's establish a clear understanding of the core concepts related to this topic:
- Streams: In Node.js, Streams are an abstract interface for working with streaming data. They are instances of EventEmitter and provide a way to handle data in smaller, manageable chunks rather than loading it all at once into memory. This is crucial for handling large files or continuous data flows.
- Readable Stream: A type of stream from which data can be read. Examples include fs.createReadStream() for reading files or http.IncomingMessage (the request object in HTTP servers).
- Writable Stream: A type of stream to which data can be written. Examples include fs.createWriteStream() for writing to files or http.ServerResponse (the response object in HTTP servers).
- Duplex Stream: A stream that is both Readable and Writable. Sockets are a good example.
- Transform Stream: A type of Duplex stream that can modify or transform data as it is written and then read. Examples include zlib streams for compression/decompression.
- Piping: A fundamental stream concept where the output of a Readable Stream is connected to the input of a Writable Stream, letting data flow directly from one stream to another without buffering the entire payload in between. source.pipe(destination) is the common syntax.
- Busboy (@fastify/busboy): A high-performance multipart/form-data parser maintained by the Fastify team, used under the hood for handling file uploads in Fastify.
- Multer (Express): A middleware for Express.js that handles multipart/form-data, primarily used for uploading files. While Multer can work with streams, its default behavior buffers files entirely to disk or memory, which can be less efficient than a purely stream-based approach for very large files.
Efficient File Uploads
The traditional way of handling file uploads, especially with middleware like Multer, often involves saving the entire file to a temporary disk location first, or even buffering it in memory. While convenient for smaller files, this can become a bottleneck for larger ones. Stream-based uploads allow us to process or store the file chunk by chunk as it arrives.
Express.js with Streams for Uploads
For Express, we can combine a custom middleware with a library like busboy (the widely used standalone multipart/form-data parser on npm, from which Fastify's @fastify/busboy was forked) or handle the incoming request stream directly. Let's look at an example using busboy for a more structured approach:
const express = require('express');
const busboy = require('busboy');
const fs = require('fs');
const path = require('path');

const app = express();
const uploadDir = path.join(__dirname, 'uploads');

// Ensure the upload directory exists
if (!fs.existsSync(uploadDir)) {
  fs.mkdirSync(uploadDir);
}

app.post('/upload', (req, res) => {
  const bb = busboy({ headers: req.headers });
  let fileName = '';

  // busboy >= 1.0 passes (name, stream, info); info holds the metadata
  bb.on('file', (fieldname, file, info) => {
    const { filename, encoding, mimeType } = info;
    fileName = filename;
    // path.basename guards against path traversal in the client-supplied name
    const saveTo = path.join(uploadDir, path.basename(filename));
    console.log(`Uploading: ${saveTo}`);
    file.pipe(fs.createWriteStream(saveTo));
  });

  bb.on('field', (fieldname, val) => {
    console.log(`Field [${fieldname}]: value: %j`, val);
  });

  // 'close' replaced 'finish' in busboy >= 1.0
  bb.on('close', () => {
    console.log('Upload complete');
    res.status(200).send(`File '${fileName}' uploaded successfully.`);
  });

  bb.on('error', (err) => {
    console.error('Busboy error:', err);
    res.status(500).send('File upload failed.');
  });

  req.pipe(bb);
});

app.listen(3000, () => {
  console.log('Express Upload Server listening on port 3000');
});
In this Express example, req.pipe(busboy) is the key. The incoming HTTP request (which is a Readable Stream) is piped directly into busboy. As busboy parses the multipart data, it emits a file event, providing a file stream for the uploaded file. This file stream is then directly piped to a fs.createWriteStream, saving the file to disk chunk by chunk without buffering the entire file in memory.
Fastify with Stream for Uploads
Fastify has excellent native support for streams, and its ecosystem thrives on performance. The @fastify/multipart plugin internally uses @fastify/busboy to efficiently handle file uploads.
const fastify = require('fastify');
const fs = require('fs');
const path = require('path');
const util = require('util');
// pump pipes streams and cleans up on error; promisified so we can await it
const pump = util.promisify(require('pump'));

const app = fastify({ logger: true });
const uploadDir = path.join(__dirname, 'uploads');

// Ensure the upload directory exists
if (!fs.existsSync(uploadDir)) {
  fs.mkdirSync(uploadDir);
}

app.register(require('@fastify/multipart'), {
  limits: {
    fileSize: 10 * 1024 * 1024 // 10 MB limit, for example
  }
});

app.post('/upload', async (request, reply) => {
  const data = await request.file(); // Get the first uploaded file
  if (!data) {
    return reply.code(400).send('No file uploaded.');
  }

  const { filename, file } = data;
  // path.basename guards against path traversal in the client-supplied name
  const saveTo = path.join(uploadDir, path.basename(filename));

  try {
    await pump(file, fs.createWriteStream(saveTo));
    reply.code(200).send(`File '${filename}' uploaded successfully.`);
  } catch (err) {
    request.log.error(err, 'File upload error');
    reply.code(500).send('File upload failed.');
  }
});

app.listen({ port: 3000 }, (err) => {
  if (err) {
    app.log.error(err);
    process.exit(1);
  }
  app.log.info(`Fastify Upload Server listening on ${app.server.address().port}`);
});
In the Fastify example, request.file() asynchronously retrieves the file data, which itself includes a readable file stream. pump is then used to safely pipe this incoming file stream to a fs.createWriteStream. pump is particularly useful because it handles closing streams and propagating errors correctly, making stream piping more robust. This approach ensures that the file is processed and written to disk incrementally.
Efficient File Downloads
Serving large files for download also benefits immensely from streams. Instead of loading the entire file into server memory and then sending it, we can create a readable stream from the file and pipe it directly to the HTTP response.
Express.js with Streams for Downloads
const express = require('express');
const fs = require('fs');
const path = require('path');

const app = express();
const downloadsDir = path.join(__dirname, 'downloads');
const sampleFilePath = path.join(downloadsDir, 'sample-large-file.txt');

// Create a dummy large file for testing downloads
if (!fs.existsSync(downloadsDir)) {
  fs.mkdirSync(downloadsDir);
}
if (!fs.existsSync(sampleFilePath)) {
  const dummyContent = 'This is a sample line for a large file.\n'.repeat(100000); // ~4 MB file
  fs.writeFileSync(sampleFilePath, dummyContent);
  console.log('Created a sample large file:', sampleFilePath);
}

app.get('/download/:filename', (req, res) => {
  // path.basename guards against path traversal in the requested name
  const filename = path.basename(req.params.filename);
  const filePath = path.join(downloadsDir, filename);

  if (!fs.existsSync(filePath)) {
    return res.status(404).send('File not found.');
  }

  // Set appropriate headers for download
  res.setHeader('Content-Type', 'application/octet-stream');
  res.setHeader('Content-Disposition', `attachment; filename="${filename}"`);

  const fileStream = fs.createReadStream(filePath);

  // Error handling on the stream
  fileStream.on('error', (err) => {
    console.error('Error reading file for download:', err);
    // If streaming already started, the status line is gone; just end it
    if (!res.headersSent) {
      res.status(500).send('Could not retrieve file.');
    } else {
      res.end();
    }
  });

  fileStream.pipe(res); // Pipe the file stream directly to the response
});

app.listen(3000, () => {
  console.log('Express Download Server listening on port 3000');
});
Here, fs.createReadStream(filePath) creates a readable stream from the file on disk. This stream is then directly piped to res (the HTTP response object, which is a Writable Stream). This means as chunks of the file are read from disk, they are immediately sent to the client, without buffering the entire file in the server's memory. This is highly efficient for large files and works well with progress indicators on the client side.
Fastify with Streams for Downloads
Fastify exposes the underlying writable response stream via reply.raw, making stream-based downloads straightforward.
const fastify = require('fastify');
const fs = require('fs');
const path = require('path');
const pump = require('pump');

const app = fastify({ logger: true });
const downloadsDir = path.join(__dirname, 'downloads');
const sampleFilePath = path.join(downloadsDir, 'sample-large-file.txt');

// Create a dummy large file for testing downloads
if (!fs.existsSync(downloadsDir)) {
  fs.mkdirSync(downloadsDir);
}
if (!fs.existsSync(sampleFilePath)) {
  const dummyContent = 'This is a sample line for a large file.\n'.repeat(100000); // ~4 MB file
  fs.writeFileSync(sampleFilePath, dummyContent);
  app.log.info(`Created a sample large file: ${sampleFilePath}`);
}

app.get('/download/:filename', (request, reply) => {
  // path.basename guards against path traversal in the requested name
  const filename = path.basename(request.params.filename);
  const filePath = path.join(downloadsDir, filename);

  if (!fs.existsSync(filePath)) {
    return reply.code(404).send('File not found.');
  }

  // We are writing to the raw response ourselves, so tell Fastify to step aside
  reply.hijack();

  // Set appropriate headers for download on the raw response
  reply.raw.setHeader('Content-Type', 'application/octet-stream');
  reply.raw.setHeader('Content-Disposition', `attachment; filename="${filename}"`);

  const fileStream = fs.createReadStream(filePath);

  // Use pump for robust piping and error handling
  pump(fileStream, reply.raw, (err) => {
    if (err) {
      request.log.error(err, 'Error during file download');
      // Headers may already be sent, so there is no clean way to report a
      // status; log the error and let the connection close.
    } else {
      request.log.info(`File '${filename}' sent successfully.`);
    }
  });
});

app.listen({ port: 3000 }, (err) => {
  if (err) {
    app.log.error(err);
    process.exit(1);
  }
  app.log.info(`Fastify Download Server listening on ${app.server.address().port}`);
});
Similar to Express, fs.createReadStream(filePath) creates a readable stream. Fastify's reply.raw provides access to the underlying Node.js http.ServerResponse object, which is a writable stream. We then use pump to pipe the file stream to reply.raw, ensuring efficient data transfer and robust error handling.
Conclusion
Leveraging Node.js Streams for file uploads and downloads in Express and Fastify provides a highly efficient and scalable solution, particularly for large files. By processing data in chunks rather than buffering entire files in memory, applications can significantly reduce their memory footprint, improve performance, and enhance user experience. Adopting stream-based approaches is a crucial step toward building performant, resilient file handling in your Node.js web applications.

