This post is part of a series of ES2015 posts. We'll be covering new JavaScript functionality every week!

In several previous posts we've been using Babel to actually run our (transpiled) code. But how far can we get if we only use natively supported features? Let's dig into the stable features of Node.js 4 and see how much of our code can be written without a transpiler dependency.

The setup for the prototype is a really simple HTTP server that serves different content under three paths:

  • The root path / serves a simple message as body
  • The async path /async serves a different message and simulates an asynchronous call to another system
  • Any other path returns a 404 status code.

For clarity, we're not going to add third-party HTTP frameworks to our code; we'll rely on the http server implementation that Node.js provides.

Let's write a server implementation using the current Node.js. Based on the list of supported features, we can safely use the following native ES2015 constructs:

  • const and let
  • arrow functions
  • generator functions
  • Promise
  • Map
  • template strings
  • ... and a 'few' others, accounting for an implementation of 53% of the official ECMA spec
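As a quick smoke test, most of these features fit in a few lines that run directly on Node.js 4 with no transpiler. This is a standalone sketch, not part of the server code below:

```javascript
'use strict';

// const, Map, arrow functions, template strings and Promise in one snippet.
const greetings = new Map([['en', 'Hello'], ['nl', 'Hallo']]);

const greet = (lang, name) =>
  Promise.resolve(`${greetings.get(lang)}, ${name}!`);

greet('en', 'world').then((msg) => console.log(msg)); // logs "Hello, world!"
```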

Server

We'll start with the initialisation of our server implementation.

'use strict';

const http = require('http'),
  co = require('co'),
  sleep = require('./sleep');

The first gotcha: to be able to use let and const in our code, we need to add the 'use strict' directive prologue at the top of our files. Otherwise we're greeted with a SyntaxError when we use them.
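To see the gotcha in isolation (the exact error text may vary between Node.js versions):

```javascript
// Without the prologue, Node.js 4 refuses to parse the file, with an
// error along the lines of: "SyntaxError: Block-scoped declarations
// (let, const, function, class) not yet supported outside strict mode".

'use strict';

const answer = 40; // fine once the prologue is in place
let extra = 2;
console.log(answer + extra); // logs 42
```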

We have to fall back on require for module imports; Node.js 4 doesn't natively support the standardised import/export module syntax yet.

Here we require the co lib to write nicer async code with generators, we'll come back to that. Also, for this demonstration, we've written an asynchronous sleep function which returns a Promise that sleeps for a specified time and then resolves with the specified result:

'use strict';

module.exports = (ms, result) => {
  return new Promise((resolve) => {
    setTimeout(() => {
      resolve(result);
    }, ms);
  });
};

With this sleep function, we'll simulate an asynchronous call to, for instance, a database and its response. As expected, we can export an arrow function directly. Also, we don't have to require a promise library; we can use the native Promise class immediately.

Let's return to our server implementation.

const routeHandlers = new Map();

routeHandlers.set('/', (req, res) => {
  res.writeHead(200);
  res.end('okay');
});

For our route handling, we're using a Map. The advantage over using an object literal is that there are no inherited properties that may influence our routing.
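The difference is easy to demonstrate: a plain object literal appears to 'contain' keys it inherits from Object.prototype, while a Map only contains what we put in it (a standalone illustration, not part of the server):

```javascript
'use strict';

const routesAsObject = {};
const routesAsMap = new Map();

// The object inherits keys it never defined itself:
console.log('constructor' in routesAsObject); // true (from Object.prototype)

// The Map starts truly empty:
console.log(routesAsMap.has('constructor')); // false
```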

Setting up route handling is again really basic. This particular handler simply sends a 200 response with body 'okay'. The arrow function gives us improved readability. Of course, you shouldn't build your own router; use one of the many HTTP frameworks with feature-packed routers if the need arises.

Now for the async call handling. We imagine a fake underlying system that requires us to do two calls: one to retrieve a certain value and a second call that uses the first output and provides the final output. Both of these calls return a Promise object. We could write the code with then and catch as follows:

routeHandlers.set('/async', (req, res) => {
  function errorHandler() {
    res.writeHead(500);
    res.end('internal error');
  }

  sleep(1000, 123).then((data) => {
    const a = data;
    sleep(2000, a + 456).then((data) => {
      const b = data;
      res.end(`asynchronous result: ${a} ${b}`);
    }).catch(errorHandler);
  }).catch(errorHandler);
});

We need to provide a generic error function to both calls to catch errors, which is boilerplate-ish. And the nesting of the two then calls especially makes the flow of the code harder to follow.

If we use the co helper function, we can use a generator function and yield as a 'waiting' construct and the code is more clear:

routeHandlers.set('/async', (req, res) => {
  co(function *() {
    const a = yield sleep(1000, 123);
    const b = yield sleep(2000, a + 456);
    res.end(`asynchronous result: ${a} ${b}`);

  }).catch(() => {
    res.writeHead(500);
    res.end('internal error');
  });
});

An extra benefit is that we only have to write one catch function to catch both promise failure paths. This idea of using generators and yield is used extensively in the koa http framework. It certainly seems a good combination with Node.js 4.
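To demystify co a little: at its core, it pumps the generator, waiting for each yielded promise to settle before resuming the generator with the value (or throwing the rejection reason back into it). A much-simplified sketch of that idea — the real co library handles more yieldable types and many edge cases:

```javascript
'use strict';

// Minimal co-style runner: resolves when the generator returns,
// rejects when it throws or a yielded promise rejects.
function run(genFn) {
  return new Promise((resolve, reject) => {
    const gen = genFn();

    function step(advance) {
      let result;
      try {
        result = advance(); // gen.next(value) or gen.throw(err)
      } catch (err) {
        return reject(err); // the generator threw
      }
      if (result.done) return resolve(result.value);
      // Wait for the yielded promise, then resume the generator with its
      // value — or throw the rejection reason back into the generator.
      Promise.resolve(result.value).then(
        (value) => step(() => gen.next(value)),
        (err) => step(() => gen.throw(err))
      );
    }

    step(() => gen.next());
  });
}
```

The single .catch on the returned promise then covers both rejected promises and exceptions thrown inside the generator.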

(Did you see how we used a template string to render our page result? Such wow.)

We continue with the creation of the server:

const server = http.createServer((req, res) => {
  if (routeHandlers.has(req.url)) {
    return routeHandlers.get(req.url)(req, res);
  }

  res.writeHead(404);
  res.end('not found');
});

This basically checks whether a route exactly matches one of the routes defined in the Map and delegates the handling to the specified function.

We want to expose the server's listen and close but no other functions:

exports.listen = function () {
  server.listen.apply(server, arguments);
};

exports.close = function () {
  server.close.apply(server, arguments);
};

Another gotcha: we can't use the arrow function syntax in this case. We want to apply the arguments object, and arrow functions don't have their own arguments binding.
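The behaviour is easy to see in isolation: an arrow function has no arguments of its own and instead sees the arguments of the nearest enclosing regular function (rest parameters, the cleaner alternative, weren't natively available in Node.js 4 yet):

```javascript
'use strict';

function outer() {
  // The arrow doesn't get its own arguments object; it closes over outer's.
  const inner = () => arguments[0];
  return inner('ignored');
}

console.log(outer('hello')); // logs "hello", not "ignored"
```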

So, when we start our server we can see that this all works as expected. Great! But what about a test?

Test

For testing, we use the Mocha test framework and the Chai assertion library. Let's initialise the server test:

'use strict';

const server = require('../src/server'),
  getBody = require('./getBody'),
  co = require('co'),
  assert = require('assert'),
  port = 6000;

describe('server', () => {
  before(() => {
    server.listen(port);
  });

We can use the arrow function syntax for our describe and before calls, which is nice. The getBody function returns a Promise that resolves with the body contents (and the original response) of an HTTP GET call:

'use strict';

const http = require('http');

module.exports = (uri) => {
  return new Promise((resolve, reject) => {
    const bodyChunks = [];
    http.get(uri, (res) => {
      res.on('data', (chunk) => {
        bodyChunks.push(chunk);
      }).on('end', () => {
        resolve({ res, body: bodyChunks.join() });
      });
    }).on('error', reject);
  });
};

We can declare bodyChunks with const because the binding is constant; we can still manipulate the array's contents. We also reject the promise on connection errors, so failures surface instead of hanging. Finally, we can add res to the object literal using the shorthand property syntax to write a bit less code.
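The const-versus-mutation distinction in a nutshell: const prevents rebinding the name, not mutating the value it points to:

```javascript
'use strict';

const chunks = [];
chunks.push('first', 'second'); // fine: we're mutating the array's contents

// chunks = [];                 // TypeError: Assignment to constant variable.

console.log(chunks.join());    // logs "first,second"
```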

Let's return to our test:

  describe('/', () => {
    it('should return 200 and respond with okay', (done) => {
      co(function *() {
        const ret = yield getBody(`http://localhost:${port}`);
        assert.equal(200, ret.res.statusCode);
        assert.equal('okay', ret.body);
        done();
      }).catch(done);
    });
  });

Here we use co again to write easy to follow asynchronous code. We can use a library like co-mocha to improve the readability of our tests even more, but for clarity we'll keep it like this. It would be nice if Mocha supported generator functions out-of-the-box!

Node.js 4 doesn't support destructuring yet; otherwise we could destructure the returned object directly into res and body constants. Note that we also pass done to a catch handler, so a failed assertion rejects the co promise and fails the test immediately instead of timing out.
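For reference, here's what that would look like, demonstrated against a plain literal shaped like getBody's result (hypothetical values; the destructuring itself is shown only in a comment, since Node.js 4 would reject the syntax outright):

```javascript
'use strict';

// A literal shaped like the object getBody resolves with.
const ret = { res: { statusCode: 200 }, body: 'okay' };

// Node.js 4 style: one constant at a time.
const res = ret.res;
const body = ret.body;

// On Node.js 6+, the two lines above collapse to:
//   const { res, body } = ret;

console.log(res.statusCode, body); // logs: 200 okay
```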

Now to test our longer-running asynchronous call:

  describe('/async', () => {
    it('should return 200 and respond with message', function (done) {
      this.timeout(4000);
      co(function *() {
        const ret = yield getBody(`http://localhost:${port}/async`);
        assert.equal(200, ret.res.statusCode);
        assert.equal('asynchronous result: 123 579', ret.body);
        done();
      }).catch(done);
    });
  });

Here, we have to raise the default timeout for tests (2000ms). This can be done for a single it using this.timeout. But arrow functions don't bind their own this, so Mocha's test context wouldn't be available; we have to write the full function syntax.

The rest of the test is left out as it is trivial.

Conclusion

We've seen that it's already possible to write quite sophisticated ES2015 code without any transpiler once we upgrade Node.js to 4+, give or take a few gotchas and some integration issues. No transpiler also means easier debugging and no more source maps, so that can be a convincing argument for your next project.

Keep an eye on Chrome Status to track the development of new ECMAScript features, as they will eventually pop up in Node.js.

For reference, you can find the working project code at this repo. You can start the server with npm start and run the Mocha test suite with npm test. Of course, make sure you have Node.js 4 set as your default.