Should we start using Node 8?

Node 8 has arrived, and it’s here to stay! With real-world usage demonstrating its reliability, this version boasts a speedier V8 engine and innovative features like async/await, HTTP/2, and async hooks. But the question remains: Is it the right fit for your project? Let’s delve in and find out!

Editor’s note: While you might be aware of Node 10 (Dubnium), we’re focusing on Node 8 (Carbon) for a couple of key reasons: (1) Node 10 has recently entered its long-term support (LTS) phase, and (2) Node 8 brought about more substantial changes compared to Node 10.

Performance Enhancements in Node 8 LTS

Let’s kick things off by examining the performance upgrades and fresh features packed into this noteworthy release. A significant area of enhancement lies within Node’s JavaScript engine.

But first, a quick refresher: What exactly is a JavaScript engine?

In essence, a JavaScript engine is responsible for executing and optimizing code. It could take the form of a plain interpreter or a just-in-time (JIT) compiler that translates JavaScript into lower-level bytecode or machine code at runtime. The JS engines employed by Node.js are all JIT compilers, not plain interpreters.

The V8 Engine

Since its inception, Node.js has relied on Google’s Chrome V8 JavaScript engine, or V8 for short. Certain Node releases are specifically timed to align with newer V8 versions. However, it’s crucial not to confuse V8 with Node 8 as we delve into V8 version comparisons.

It’s easy to get tripped up, as “v8” is often used casually or even officially as shorthand for “version 8.” This might lead some to conflate “Node V8” or “Node.js V8” with “NodeJS 8.” To maintain clarity, we’ve steered clear of such ambiguity in this article: V8 will consistently refer to the engine, not the Node version.

V8 Release 5

Node 6 relies on V8 release 5 as its JavaScript engine. (It’s worth noting that the initial point releases of Node 8 also utilize V8 release 5, albeit a more recent V8 point release compared to Node 6.)

Compilers

V8 releases 5 and earlier incorporate two compilers:

  • Full-codegen: This JIT compiler prioritizes simplicity and speed but generates slower machine code.
  • Crankshaft: A more intricate JIT compiler that produces optimized machine code.

Threads

Behind the scenes, V8 employs multiple thread types:

  • The main thread handles code fetching, compilation, and execution.
  • Secondary threads handle code optimization while the main thread continues executing.
  • The profiler thread identifies the methods where the most time is spent, so the engine knows which methods Crankshaft should optimize.
  • Dedicated threads manage garbage collection.

Compilation Process

The process begins with the Full-codegen compiler executing the JavaScript code. During execution, the profiler thread gathers data to pinpoint methods for optimization by the engine. Concurrently, Crankshaft optimizes these methods on a separate thread.

Issues

This approach presents two primary challenges. Firstly, it introduces architectural complexity. Secondly, the compiled machine code demands a significant amount of memory—a memory footprint that remains constant regardless of code execution frequency. Even code executed only once consumes a substantial amount of memory.

V8 Release 6

Node 8.3 marks the debut of the V8 release 6 engine within the Node.js ecosystem.

In this release, the V8 team introduced Ignition and TurboFan to address the aforementioned issues. Ignition and TurboFan effectively replace Full-codegen and Crankshaft, respectively.

This revamped architecture boasts simplicity and reduced memory consumption.

Ignition takes on the role of compiling JavaScript code to bytecode rather than machine code, leading to substantial memory savings. Subsequently, TurboFan, the optimizing compiler, steps in to generate optimized machine code from this bytecode.

Performance Improvements Up Close

Let’s dive into specific areas where performance in Node 8.3+ has seen improvements compared to earlier Node versions.

Object Creation

Object creation in Node 8.3+ is about five times faster than in Node 6.

Function Size

The V8 engine considers various factors when determining whether to optimize a function, one of which is function size. Smaller functions are optimized, while longer functions are not.

Function Size: The Calculation Method

In the older V8 engine, Crankshaft relied on “character count” to assess function size. Consequently, whitespace and comments within a function could hinder its optimization prospects. Surprisingly, a comment could lead to a performance dip of around 10%.

Node 8.3+ changes the game. Irrelevant characters like whitespace and comments no longer impact function performance. Why?

Because the new TurboFan disregards character count when determining function size. Instead, it focuses on counting abstract syntax tree (AST) nodes, effectively considering only the actual function instructions. With Node 8.3+, you’re free to add comments and whitespace liberally without performance anxiety.

Array-ifying Arguments

In JavaScript, regular functions come equipped with an implicit, array-like arguments object.

Array-like: Decoding the Term

The arguments object exhibits some array-like behavior. It possesses the length property but lacks the built-in methods found in Array, such as forEach and map.

Here’s how the arguments object operates:

function foo() {
    console.log(arguments[0]);
    // Expected output: a

    console.log(arguments[1]);
    // Expected output: b

    console.log(arguments[2]);
    // Expected output: c
}

foo("a", "b", "c");

So, how can we transform the arguments object into a true array? One classic approach is Array.prototype.slice.call(arguments).

function test() {
    const r = Array.prototype.slice.call(arguments);
    console.log(r.map(num => num * 2));
}
test(1, 2, 3); // Expected output: [2, 4, 6]

However, Array.prototype.slice.call(arguments) negatively impacts performance across all Node versions. Consequently, copying the keys using a for loop proves more performant:

function test() {
    const r = [];
    for (const index in arguments) {
        r.push(arguments[index]);
    }
    console.log(r.map(num => num * 2));
}

test(1, 2, 3); // Expected output [2, 4, 6]

Admittedly, the for loop can feel a bit clunky. While the spread operator is an option, it suffers from slow performance in Node 8.2 and earlier:

function test() {
    const r = [...arguments];
    console.log(r.map(num => num * 2));
}

test(1, 2, 3); // Expected output [2, 4, 6]

Node 8.3+ changes the game once again. The spread operator now executes significantly faster, even surpassing the for-loop in performance.

Partial Application (Currying) and Binding

Currying is a technique where a function taking multiple arguments is broken down into a sequence of functions, each accepting only one argument.

Let’s consider a basic add function. Its curried counterpart takes one argument, num1, and returns another function. This returned function accepts num2 as an argument and produces the sum of num1 and num2:

function add(num1, num2) {
    return num1 + num2;
}
add(4, 6); // returns 10

function curriedAdd(num1) {
    return function(num2) {
        return num1 + num2;
    };
}
const add5 = curriedAdd(5);

add5(3); // returns 8

The bind method offers a more concise way to partially apply a function.

function add(num1, num2) {
    return num1 + num2;
}
const add5 = add.bind(null, 5);
add5(3); // returns 8

While bind is undeniably useful, it suffers from slow performance in older Node versions. Thankfully, Node 8.3+ significantly boosts bind performance, allowing you to use it without performance concerns.

Experimental Observations

Several experiments have been conducted to provide a high-level comparison of performance between Node 6 and Node 8. It’s important to note that these experiments were performed on Node 8.0, meaning they don’t reflect the aforementioned improvements specific to Node 8.3+ resulting from the V8 release 6 upgrade.

The results revealed a 25% reduction in server-rendering time with Node 8 compared to Node 6. In large-scale projects, this translates to a potential reduction in server instances from 100 to 75—an impressive feat. Testing a suite of 500 tests also exhibited a 10% speed increase with Node 8. Similarly, Webpack builds were 7% faster. Overall, the findings consistently demonstrated a noticeable performance boost in Node 8.

Exploring Node 8 Features

Node 8’s enhancements extend beyond raw speed. It introduces a suite of handy new features, perhaps most importantly being async/await.

Async/Await: Node 8 Embraces a Modern Approach

Traditionally, callbacks and promises have been the go-to tools for managing asynchronous code in JavaScript. However, callbacks are infamous for leading to code that’s difficult to maintain, often resulting in the dreaded callback hell that has plagued the JavaScript community. Promises provided a much-needed respite from callback hell, but they still lacked the elegance of synchronous code.

Enter async/await, a modern approach that empowers you to write asynchronous code that reads like its synchronous counterpart. While async/await could be utilized in previous Node versions, it required external libraries and tools, such as Babel for preprocessing. Now, it’s readily available natively, right out of the box.

Let’s explore scenarios where async/await outshines conventional promises.

Conditionals: Taming Asynchronous Logic

Imagine fetching data and needing to determine whether an additional API call is necessary based on the received payload. The code snippet below demonstrates how this would be handled using the traditional “promises” approach.

const request = () => {
    return getData().then(data => {
        if (!data.car) {
            return fetchForCar(data.id).then(carData => {
                console.log(carData);
                return carData;
            });
        } else {
            console.log(data);
            return data;
        }
    });
};

Even with just one extra conditional, the code already appears somewhat cluttered. Async/await comes to the rescue with reduced nesting:

const request = async () => {
    const data = await getData();
    if (!data.car) {
        const carData = await fetchForCar(data.id);
        console.log(carData);
        return carData;
    } else {
        console.log(data);
        return data;
    }
};

Error Handling: A Unified Approach

Async/await provides the ability to handle both synchronous and asynchronous errors seamlessly within a try/catch block. Consider a scenario where you need to parse JSON data retrieved from an asynchronous API call. A single try/catch can gracefully handle both parsing errors and API errors.

const request = async () => {
    try {
        console.log(await getData());
    } catch (err) {
        console.log(err);
    }
};

Intermediate Values: Simplifying Chained Calls

What happens when a promise requires an argument that should be resolved from another promise? This situation necessitates the sequential execution of asynchronous calls.

Using traditional promises, you might encounter code like this:

const request = () => {
    return fetchUserData().then(userData => {
        // userData must remain in scope for the final call, forcing extra nesting
        return fetchCompanyData(userData).then(companyData => {
            return fetchRetiringPlan(userData, companyData);
        });
    });
};

Async/await truly shines in such cases where chained asynchronous calls are involved:

const request = async () => {
    const userData = await fetchUserData();
    const companyData = await fetchCompanyData(userData);
    const retiringPlan = await fetchRetiringPlan(userData, companyData);
    return retiringPlan;
};

Parallel Async: Embracing Concurrency

Let’s say you need to execute multiple asynchronous functions concurrently. In the following code, fetchCarData is called only after fetchHouseData resolves. While these functions are independent, they are processed sequentially, resulting in a two-second wait for both APIs to resolve—not ideal.

function fetchHouseData() {
    return new Promise(resolve => setTimeout(() => resolve("Mansion"), 1000));
}

function fetchCarData() {
    return new Promise(resolve => setTimeout(() => resolve("Ferrari"), 1000));
}

async function action() {
    const house = await fetchHouseData(); // Wait one second
    const car = await fetchCarData(); // ...then wait another second.
    console.log(house, car, " in series");
}

action();

A superior approach involves processing these asynchronous calls in parallel. The code below illustrates how this can be achieved using async/await.

async function parallel() {
    const houseDataPromise = fetchHouseData();
    const carDataPromise = fetchCarData();
    const house = await houseDataPromise; // Wait one second for both
    const car = await carDataPromise;
    console.log(house, car, " in parallel");
}

parallel();

By embracing parallelism, the total wait time for both calls is reduced to just one second.

New Additions to the Core Library

Node 8 introduces a set of new functions to its core library.

File Copying Made Easy

Prior to Node 8, copying files involved creating two streams and piping data between them. The code snippet below demonstrates how data is piped from the read stream to the write stream. As you can see, the code is unnecessarily verbose for a basic operation like file copying.

const fs = require('fs');
const rd = fs.createReadStream('sourceFile.txt');
rd.on('error', err => {
    console.log(err);
});
const wr = fs.createWriteStream('target.txt');
wr.on('error', err => {
    console.log(err);
});
wr.on('close', () => {
    console.log('File Copied');
});
rd.pipe(wr);

Node 8 simplifies this process with the introduction of fs.copyFile and fs.copyFileSync, providing a much more straightforward way to copy files.

const fs = require("fs");
fs.copyFile("firstFile.txt", "secondFile.txt", err => {
    if (err) {
        console.log(err);
    } else {
        console.log("File copied");
    }
});

promisify and callbackify: Bridging Different Styles

util.promisify serves as a bridge between traditional callback-based functions and promises. It transforms a regular function into an async function. It’s important to note that the input function should adhere to the common Node.js callback style, accepting a callback as the last argument, typically in the form of (error, payload) => { ... }.

const { promisify } = require('util');
const fs = require('fs');
const readFilePromisified = promisify(fs.readFile);

const file_path = process.argv[2];

readFilePromisified(file_path, 'utf8')
    .then((text) => console.log(text))
    .catch((err) => console.log(err));

As demonstrated, util.promisify successfully converts fs.readFile into an async function.

On the flip side, Node.js also provides util.callbackify. As its name suggests, util.callbackify performs the opposite operation of util.promisify. It converts an async function into a function following the Node.js callback style.

destroy: Taking Control of Readable and Writable Streams

Node 8 introduces the destroy function, a documented method for destroying/closing/aborting both readable and writable streams:

const fs = require('fs');
const file = fs.createWriteStream('./big.txt');

file.on('error', errors => {
    console.log(errors);
});

file.write(`New text.\n`);

file.destroy(new Error('Manually destroying the stream'));

Executing this code creates a new file named big.txt (or overwrites an existing one) containing the text “New text.”

In Node 8, the Readable.destroy and Writable.destroy functions emit a close event and an optional error event. It’s crucial to understand that calling destroy doesn’t necessarily indicate an error occurred.

Spread Operator: Expanding Its Reach

While the spread operator (represented by ...) was present in Node 6, its functionality was limited to arrays and other iterables:

const arr1 = [1,2,3,4,5,6]
const arr2 = [...arr1, 9]
console.log(arr2) // expected output: [1,2,3,4,5,6,9]

Node 8 expands the spread operator’s capabilities, allowing its use with objects as well:

const userCarData = {
    type: 'ferrari',
    color: 'red'
};

const userSettingsData = {
    lastLoggedIn: '12/03/2019',
    featuresPlan: 'premium'
};

const userData = {
    ...userCarData,
    name: 'Youssef',
    ...userSettingsData
};
console.log(userData);
/* Expected output:
    {
        type: 'ferrari',
        color: 'red',
        name: 'Youssef',
        lastLoggedIn: '12/03/2019',
        featuresPlan: 'premium'
    }
*/

Experimental Territory: Features in Node 8 LTS

It’s important to approach experimental features with caution. These features lack stability, might face deprecation, and could undergo updates over time. Using them in production environments is strongly discouraged until they transition to a stable state.

Async Hooks: Unveiling Asynchronous Insights

Async hooks provide an API for tracking the lifecycle of asynchronous resources created within Node.

Before diving into async hooks, it’s crucial to have a solid grasp of the event loop. This video can be a valuable resource in this regard. Async hooks prove particularly useful for debugging asynchronous functions, offering several applications, including enhanced error stack traces for async functions.

Consider the code snippet below. Note that console.log itself performs asynchronous operations, so calling it inside an async hook callback would trigger the hooks again and cause infinite recursion. The synchronous fs.writeSync is employed instead.

const asyncHooks = require('async_hooks');
const fs = require('fs');
const init = (asyncId, type, triggerId) => fs.writeSync(1, `${type} \n`);
const asyncHook = asyncHooks.createHook({ init });
asyncHook.enable();

To gain a deeper understanding of async hooks, this video provides valuable insights. For a Node.js-specific guide, this article effectively demystifies async hooks through a practical application example.

ES6 Modules: A New Era for Node 8

Node 8 embraces ES6 modules, allowing you to write code using this modern syntax:

import { UtilityService } from './utility_service';

To leverage ES6 modules in Node 8, a couple of steps are required.

  1. The --experimental-modules flag must be added to the command line.
  2. File extensions need to be renamed from .js to .mjs.

HTTP/2: The Evolution of Web Communication

HTTP/2, supported natively in experimental mode from Node 8.4 onward, marks a significant update to the long-static HTTP protocol. It surpasses its predecessor, HTTP/1.1, in speed, security, and efficiency, and Google recommends adopting it. Let’s delve into its key features.

Multiplexing: Concurrent Responses

In HTTP/1.1, servers were limited to sending one response per connection at a time. HTTP/2 breaks free from this constraint, enabling servers to send multiple responses concurrently over a single connection.

Server Push: Anticipating Client Needs

HTTP/2 empowers servers to push multiple responses for a single client request—a feature with several benefits. Let’s consider a web application as an example.

Traditionally:

  1. The client initiates a request for an HTML document.
  2. Upon receiving the HTML, the client identifies required resources.
  3. The client then sends separate HTTP requests for each resource. For instance, individual requests for each JS and CSS file mentioned in the document.

Server push leverages the server’s knowledge of these required resources. Instead of waiting for individual requests, the server proactively pushes these resources to the client. In our web application example, the server would push all necessary resources immediately after the client requests the initial HTML document. This proactive approach significantly reduces latency.

Prioritization: Optimizing Resource Allocation

Clients gain the ability to define a prioritization scheme, indicating the relative importance of each requested response. Servers can then leverage this scheme to prioritize resource allocation—memory, CPU, bandwidth, etc.—accordingly.

Breaking Old Habits: HTTP/2 Renders Workarounds Obsolete

The limitations of HTTP/1.1, particularly the lack of multiplexing, led to various optimizations and workarounds aimed at masking slow speeds and file loading. Unfortunately, these techniques often resulted in increased RAM consumption and delayed rendering:

  • Domain sharding: Utilizing multiple subdomains to distribute connections and achieve parallel processing.
  • File concatenation: Combining CSS and JavaScript files to minimize the number of requests.
  • Sprite maps: Merging multiple image files to reduce HTTP requests.
  • Inlining: Directly embedding CSS and JavaScript within HTML to eliminate additional connections.

HTTP/2 renders these workarounds obsolete. You can now focus on writing clean code without resorting to such optimizations.

Putting HTTP/2 into Practice

Most modern browsers only support HTTP/2 over secure SSL connections. This article can guide you through setting up a self-signed certificate. Once you have the generated .crt and .key files, place them in a directory named ssl. Next, create a file named server.js and add the following code:

To enable HTTP/2, remember to pass the --expose-http2 flag when running your code. Node CLI flags go before the script name, so for our example the run command would be node --expose-http2 server.js.

const http2 = require('http2');
const path = require('path');
const fs = require('fs');

const PORT = 3000;

const secureServerOptions = {
    cert: fs.readFileSync(path.join(__dirname, './ssl/server.crt')),
    key: fs.readFileSync(path.join(__dirname, './ssl/server.key'))
};

const server = http2.createSecureServer(secureServerOptions, (req, res) => {
    res.statusCode = 200;
    res.end('Hello from Toptal');
});

server.listen(
    PORT,
    err =>
        err
            ? console.error(err)
            : console.log(`Server listening to port ${PORT}`)
);

It’s important to remember that Node 8, Node 9, Node 10, and subsequent versions continue to support HTTP/1.1. The official Node.js documentation on a standard HTTP transaction will remain relevant for the foreseeable future. If you’re ready to embrace HTTP/2, the Node.js HTTP/2 guide offers a deeper dive into its intricacies.

Node.js 8: The Verdict

Node 8 brought a wave of performance improvements and exciting features like async/await, HTTP/2, and more. End-to-end experiments showed Node 8 outperforming Node 6 by roughly 25% in server-rendering time, which translates to tangible benefits, including substantial cost savings. For new projects, Node 8 is a resounding yes!

But what about existing projects? Should you embark on a Node upgrade journey?

The answer depends on the extent of code changes required. This document provides a comprehensive list of breaking changes between Node 6 and Node 8, which can help you assess the impact. As a best practice, always reinstall all npm packages for your project using the latest Node 8 version to avoid common issues. Additionally, ensure consistency by using the same Node.js version across development machines and production servers. Good luck with your Node 8 endeavors!

Licensed under CC BY-NC-SA 4.0