NodeJS: File Streams, Reading, and Piping

Lately I’ve been working with Node.js to try to understand the back-end environment with JavaScript. I think it is fair to say that Node.js offers a smooth transition from front-end JavaScript to back-end. Concepts like event loops, closures and variable scoping provide a helpful mindset for working with server management. While I have worked with Python and PHP back-ends, using Node.js was the first time I felt like I had a clue about what was going on. To prove how smooth the transition was, I managed to debug an error in less than 30 seconds without any prior experience with file reading, streaming or piping.


Node.js, like any back-end language, offers a simple file reading and writing module. And like most Node data modules, it uses a streams abstraction to communicate between services. Streams are one of those core Node ideas found in most data services. There are two types of streams: readable and writable. Each has its own events and behaviours, and can be controlled with event listeners to suit your own needs. You can listen for events like connections, data, the stream ending and various others (depending on the module).

Stream Problems:

However, data transfer usually comes with some problems. Say you have a client to whom the server is sending data. If you happen to be reading that data from an outside source, you are doing two things: reading the data and sending it to the client. At certain points, you will find the client consuming the data more slowly than you are sending it. If this behaviour isn’t checked, you will end up with a slow-client problem. Fortunately, Node allows you to pause a read stream and listen for drain events from the client’s write stream.

  var fs = require("fs"),
      http = require("http");
  var readFile = fs.createReadStream("my_file.txt");

  http.createServer(function(request, response){
    readFile.on("data", function(data){
      if (!response.write(data)) {
        readFile.pause(); //if we aren't writing to the client, don't read anything!
      }
    });
    response.on("drain", function(){
      readFile.resume(); //the client has caught up, keep reading
    });
    readFile.on("end", function(){
      response.end();
    });
  }).listen(8080);

Using stream.pipe():

The pause-and-resume pattern shows up throughout Node wherever data transfer is concerned, and it has a solution that is simple and cleanly abstracted for most streams: a single stream.pipe() call magically takes care of it. In short, you pipe a readable stream into a writable stream (i.e. readStream.pipe(writeStream)). The previous example shortens to:

  var fs = require("fs"),
      http = require("http");
  var readFile = fs.createReadStream("my_file.txt");
  http.createServer(function(request, response){
    readFile.pipe(response); //pause and resume handled for us
  }).listen(8080);

That is it, except for the simple error I mentioned in the first paragraph. Can you spot it?

Hint #1: Try refreshing the page.

Hint #2: Your node service needs to create a new read stream for each request.
