Understanding Node's Event Driven Architecture

One thing that makes Node.js particularly interesting is its event-driven architecture. A Node server can handle a large number of concurrent connections efficiently because its I/O is non-blocking and driven by events rather than by threads waiting on each operation. The original, and still officially supported, way of handling events is with callbacks. However, callbacks can be difficult to read once they become deeply nested. For cleaner, more readable, and more maintainable code, we have promises and async/await. These work just fine for smaller applications, but in larger applications, where a multitude of objects might need to respond to certain events, we can use event emitters and event listeners. We'll take a look at all of these methods and how they work in this article.

Node runs on what is called an event loop. This is a loop that runs in the background and constantly checks for pending events and completed operations. When an event occurs, the event loop runs the appropriate code. With a Node function that takes a callback, the callback is queued and runs on a later turn of the event loop, once the underlying operation has completed.
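
To see the loop in action, here is a minimal sketch (using a hypothetical './users' file) showing that the current synchronous code always finishes before a queued I/O callback runs:

const fs = require('fs');

// The callback is queued; it runs only after the current synchronous code finishes
fs.readFile('./users', 'utf8', (err, data) => {
  console.log('2. readFile callback runs later');
});

console.log('1. synchronous code runs first');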

Here's a simple example of a callback-style function that reads a list of users from a file, splits them into an array, and then passes that array to a callback function.

const fs = require('fs');
const readUsersFileToArray = function(file, cb) {
  fs.readFile(file, 'utf8', (err, data) => {
    if (err) {
      // Error-first convention: hand the error straight to the callback
      return cb(err);
    }
    // One user per line in the file
    const users = data.split('\n');
    cb(null, users);
  });
};
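
Calling it looks like any other Node-style callback API; the './users' path here is just an example:

readUsersFileToArray('./users', (err, users) => {
  if (err) {
    return console.error('Failed to read users:', err);
  }
  console.log(users);
});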

Imagine an Olympic team running a relay race, but the baton is actually a capsule filled with code. When the race starts, the capsule is filled with startup code. Racer 1 goes around the track with the startup config, then empties the capsule before passing the baton, starting the event loop. At this point the baton contains our readUsersFileToArray function along with the file to be processed. Racer 2 sprints around the track while processing the file, then stuffs the array into the baton capsule and passes the baton to the next racer, who runs the callback function. If the callback function has its own callback, that code will run when racer 4 has the baton. When there are no operations to execute, the team keeps racing around the track, waiting for new code to appear in the baton capsule for processing.

Events with Promises

Here is the same example using promises.

const fs = require('fs');
const readUsersFileToArray = function(file) {
  return new Promise((resolve, reject) => {
    fs.readFile(file, 'utf8', (err, data) => {
      if (err) {
        reject(err);
      } else {
        const users = data.split('\n');
        resolve(users);
      }
    });
  });
}
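
Consumers can then chain on the returned promise, again assuming an example './users' file:

readUsersFileToArray('./users')
  .then((users) => console.log(users))
  .catch((err) => console.error('Failed to read users:', err));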

This code is a little easier to understand and work with, but keep in mind that callbacks are the original, officially supported convention for async code in Node. Developers using your code might therefore assume it exposes a callback interface.

A popular pattern currently found in many Node packages is to keep the callback interface and also return a promise, giving developers the option of how they want to work with the code. For this, we take the original code for readUsersFileToArray and simply add a promise to it.

const fs = require('fs');
const readUsersFileToArray = function(file, cb = () => {}) {
  return new Promise((resolve, reject) => {
    fs.readFile(file, 'utf8', (err, data) => {
      if (err) {
        // Signal the error on both interfaces
        reject(err);
        return cb(err);
      }
      const users = data.split('\n');
      // Settle the promise and invoke the callback with the same result
      resolve(users);
      cb(null, users);
    });
  });
};

Note that when using this method, we have to give the callback a default empty function. The callback is optional, so if a caller uses only the promise interface and no callback is passed, invoking an undefined cb would throw an error; the default no-op prevents that.
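
With the default in place, the same function can be consumed either way; a quick sketch of both styles:

// Callback style
readUsersFileToArray('./users', (err, users) => {
  if (err) return console.error(err);
  console.log('callback:', users);
});

// Promise style
readUsersFileToArray('./users')
  .then((users) => console.log('promise:', users))
  .catch((err) => console.error(err));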

Events with Async/Await

Callbacks are messy and a bit of a pain to work with. Promises improve on that, but async/await is the preferred alternative. It lets us treat async code as if it were linear, and it is more readable when we need to process things in loops. The following function calls readUsersFileToArray, waits for the promise to resolve, and then filters out and returns every other user.

async function everyOtherUser() {
  try {
    const users = await readUsersFileToArray('./users');
    // Keep only the users at odd indexes, i.e. every other user
    const everyOther = users.filter((user, index) => index % 2 === 1);
    return everyOther;
  } catch (err) {
    // A rejected promise lands here as a thrown error
    return err;
  }
}

Note that when working with async/await, a rejected promise surfaces as a thrown exception, so we wrap the awaited operation in a try/catch.
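
Because everyOtherUser is itself an async function, it returns a promise, so a caller awaits it (or chains .then) in turn; a minimal sketch:

(async () => {
  const everyOther = await everyOtherUser();
  console.log(everyOther);
})();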

Working with Event Emitters

Node's EventEmitter is a simple but very powerful way of working with events. It is a class that allows us to fire off events and then listen for them. Think of EventEmitter as the facilitator of communication between objects in Node. This class is at the core of Node's async, event-driven architecture; in fact, many of Node's core modules are built on top of EventEmitter.

Emitter objects simply emit named events that cause any object listening for those events to run its event-handling code.

An emitter object has two main features: emitting named events and registering listener functions. To work with the event emitter, create a class that extends EventEmitter and then use the on method to register a listener function. The following code creates a new emitter object and adds a listener for the 'newUser' event.

const EventEmitter = require('events');

class UserEmitter extends EventEmitter {}
const userEmitter = new UserEmitter();

userEmitter.on('newUser', (user) => {
  console.log(`New user: ${user}`);
});
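
Nothing is logged until the event is actually emitted. The emit method takes the event name followed by any arguments to pass to the listeners; the 'alice' value here is just an example:

userEmitter.emit('newUser', 'alice');
// => New user: alice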

An emitted event is a signal that some condition occurred, usually a state change in the emitting object. We can add listeners that do something any time the emitter object emits their associated named event. To understand the sequence of events, we can use a LogEvents class that logs the events it emits while executing a task.

class LogEvents extends EventEmitter {
  execute(taskFunc) {
    console.log('Before executing');
    this.emit('start');
    taskFunc();
    this.emit('finish');
    console.log('After executing');
  }
}

const logEvents = new LogEvents();
logEvents.on('start', () => {
  console.log('About to execute');
}).on('finish', () => {
  console.log('Finished executing');
}).execute(() => {
  console.log('Executing');
});

This will produce the following output:

Before executing
About to execute
Executing
Finished executing
After executing

It is important to understand that events do not imply sync or async code. In fact, this example is entirely synchronous. In order to emit an event after an async function is done, we need to combine callbacks or promises with this event-based communication.

Here is how we can convert this example to be async, but instead of logging the order of events, we'll just log the start and finish along with the amount of time taken to execute:

class LogEvents extends EventEmitter {
  execute(asyncFunc, ...args) {
    console.time('execute');
    this.emit('start');
    asyncFunc(...args, (err, data) => {
      if (err) {
        return this.emit('error', err);
      }
      this.emit('data', data);
      console.timeEnd('execute');
      this.emit('finish');
    });
  }
}

const logEvents = new LogEvents();
logEvents.on('start', () => {
  console.log('About to execute');
}).on('finish', () => {
  console.log('Finished executing');
});

logEvents.execute(fs.readFile, __filename);

This will produce output similar to the following (the exact timing will vary):

About to execute
execute: 2.141ms
Finished executing

One benefit of using events instead of callbacks is that we can react to the same signal multiple times by defining multiple listeners. To accomplish the same thing with callbacks, we'd have to write more logic inside the single available callback.
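
For example, we could register two independent listeners for the 'data' event before calling execute, and both would run:

logEvents.on('data', (data) => {
  console.log(`Read ${data.length} bytes`);
});

logEvents.on('data', (data) => {
  // A second listener reacting to the same 'data' event
  console.log('First line:', data.toString().split('\n')[0]);
});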

When emitting an event, additional arguments can be passed after the event name, like the data in this.emit('data', data), and all of those arguments will be handed to the listener functions. IMPORTANT: if we don't handle the error event with a listener function, the Node process will crash when one is emitted. Thus, a logEvents.on('error', errorHandler); should be added to the above scenario.
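
A minimal error handler might look like this:

logEvents.on('error', (err) => {
  console.error('execute failed:', err.message);
});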

Node is a fascinating platform to work with, and much of what makes it unique is the ability to work with events. Many of Node's built-in modules expose event emitters. For example, the streams created by the fs module, such as those returned by fs.createReadStream, emit events as a file is read, and the http module's server emits an event for every request made to it. With this in mind, understanding this event architecture is crucial for working effectively with Node.
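
As a quick illustration, an http server is itself an event emitter, and we can listen for its 'request' event directly (the port number here is arbitrary):

const http = require('http');

const server = http.createServer();

// http.Server is an EventEmitter; each incoming request emits a 'request' event
server.on('request', (req, res) => {
  res.end(`You asked for ${req.url}\n`);
});

server.listen(3000);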