Is It Time to Use Node 8?
Node 8 has brought significant performance and feature upgrades. Should you use it on new projects? Is it worth upgrading existing codebases? In this article, Toptal Freelance JavaScript Developer Youssef Sherif gives a tour of Node 8’s biggest changes and what they mean for your project.
Youssef has utilized React, Angular, NodeJS, and Python to build sophisticated web apps, API services, and machine learning applications.
Node 8 is out! In fact, Node 8 has now been out long enough to see some solid real-world usage. It came with a fast new V8 engine and with new features, including async/await, HTTP/2, and async hooks. But is it ready for your project? Let’s find out!
Editor’s note: You’re likely aware that Node 10 (code-named Dubnium) is out, too. We’re choosing to focus on Node 8 (Carbon) for two reasons: (1) Node 10 is just now entering its long-term support (LTS) phase, and (2) Node 8 marked a more significant iteration than Node 10 did.
Performance in Node 8 LTS
We’ll start by taking a look at the performance improvements and new features of this remarkable release. One major area of improvement is in Node’s JavaScript engine.
What exactly is a JavaScript engine, anyway?
A JavaScript engine executes and optimizes code. It can be a plain interpreter or a just-in-time (JIT) compiler that compiles JavaScript down to bytecode or machine code. V8, the engine used by Node.js, takes the JIT-compilation approach rather than acting as a simple interpreter.
The V8 Engine
Node.js has used Google’s Chrome V8 JavaScript engine, or simply V8, since the beginning, and some Node releases exist mainly to sync with a newer version of V8. But take care not to confuse V8 with Node 8 as we compare V8 versions here.
This is easy to trip over: In software contexts, “v8” is often slang (or even an official short form) for “version 8,” so some conflate “Node V8” or “Node.js V8” with “Node.js 8.” We’ve avoided this throughout this article to keep things clear: V8 always means the engine, never the version of Node.
V8 Release 5
Node 6 uses V8 release 5 as its JavaScript engine. (The first few point releases of Node 8 also use V8 release 5, but they use a newer V8 point release than Node 6 did.)
Compilers
V8 releases 5 and earlier have two compilers:
- Full-codegen is a simple and quick JIT compiler but produces slow machine code.
- Crankshaft is a complex JIT compiler that produces optimized machine code.
Threads
Deep down, V8 uses more than one type of thread:
- The main thread fetches code, compiles it, then executes it.
- Secondary threads execute code while the main thread is optimizing code.
- The profiler thread tells the runtime which methods consume the most time; Crankshaft then optimizes these “hot” methods.
- Other threads manage garbage collection.
Compilation Process
First, the Full-codegen compiler executes the JavaScript code. While the code is being executed, the profiler thread gathers data to determine which methods the engine will optimize. On another thread, Crankshaft optimizes these methods.
Issues
The approach described above has two main problems. First, it is architecturally complex. Second, the compiled machine code consumes much more memory, and it does so regardless of how many times the code runs: Even code that executes only once takes up a significant amount of memory.
V8 Release 6
The first Node version to use the V8 release 6 engine is Node 8.3.
In release 6, the V8 team built Ignition and TurboFan to mitigate these issues. Ignition and TurboFan replace Full-codegen and Crankshaft, respectively.
The new architecture is more straightforward and consumes less memory.
Ignition compiles JavaScript to bytecode instead of machine code, which saves a great deal of memory. TurboFan, the optimizing compiler, then generates optimized machine code from this bytecode.
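If you’re not sure which V8 release a given Node binary ships with, you can check at runtime:
// Prints the bundled V8 release: a 5.x version on Node 6, a 6.x version on Node 8.3+.
console.log(process.versions.v8);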
Specific Performance Improvements
Let’s go through the areas where the performance in Node 8.3+ changed relative to older Node versions.
Creating Objects
Creating objects is about five times faster in Node 8.3+ than in Node 6.
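If you want to see the difference on your own machine, a rough micro-benchmark sketch like the one below (with all the usual micro-benchmark caveats) is enough to compare two Node versions side by side; the iteration count and object shape are arbitrary choices:
// Rough sketch: time the creation of many small object literals.
// Run the same script under Node 6 and Node 8.3+ and compare the output.
const ITERATIONS = 1e6; // arbitrary; adjust for your machine

const objects = new Array(ITERATIONS); // keep references so the work isn't optimized away
const start = process.hrtime();

for (let i = 0; i < ITERATIONS; i++) {
  objects[i] = { index: i, label: 'item' };
}

const [seconds, nanoseconds] = process.hrtime(start);
console.log(`Created ${ITERATIONS} objects in ${seconds}s ${Math.round(nanoseconds / 1e6)}ms`);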
Function Size
The V8 engine decides whether a function should be optimized based on several factors. One factor is function size. Small functions are optimized, while long functions are not.
How Is Function Size Calculated?
Crankshaft in older V8 releases used “character count” to determine function size, so whitespace and comments in a function reduced its chances of being optimized. I know this might surprise you, but back then, a comment could reduce speed by about 10%.
In Node 8.3+, irrelevant characters such as whitespace and comments do not harm function performance. Why not?
Because the new TurboFan does not count characters to determine function size. Instead, it counts abstract syntax tree (AST) nodes, so effectively it considers only actual function instructions. With Node 8.3+, you can add as many comments and as much whitespace as you want.
Array-ifying Arguments
Regular functions in JavaScript carry an implicit Array-like arguments object.
What Does Array-like Mean?
The arguments object acts somewhat like an array. It has the length property but lacks Array’s built-in methods like forEach and map.
Here’s how the arguments object works:
function foo() {
  console.log(arguments[0]);
  // Expected output: a
  console.log(arguments[1]);
  // Expected output: b
  console.log(arguments[2]);
  // Expected output: c
}

foo("a", "b", "c");
So how could we convert the arguments object to an array? By using the terse Array.prototype.slice.call(arguments).
function test() {
  const r = Array.prototype.slice.call(arguments);
  console.log(r.map(num => num * 2));
}

test(1, 2, 3); // Expected output: [2, 4, 6]
Array.prototype.slice.call(arguments) impairs performance in all Node versions. Therefore, copying the keys via a for loop performs better:
function test() {
  const r = [];
  for (const index in arguments) {
    r.push(arguments[index]);
  }
  console.log(r.map(num => num * 2));
}

test(1, 2, 3); // Expected output: [2, 4, 6]
The for loop is a bit cumbersome, isn’t it? We could use the spread operator, but it is slow in Node 8.2 and below:
function test() {
  const r = [...arguments];
  console.log(r.map(num => num * 2));
}

test(1, 2, 3); // Expected output: [2, 4, 6]
The situation has changed in Node 8.3+: Now the spread operator executes much faster, even faster than the for loop.
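Worth noting as well: if you control the function signature, rest parameters (supported since Node 6) sidestep the arguments object entirely and give you a real array from the start:
// Rest parameters collect all arguments into a real Array,
// so no conversion step is needed at all.
function test(...nums) {
  console.log(nums.map(num => num * 2));
}

test(1, 2, 3); // Expected output: [2, 4, 6]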
Partial Application (Currying) and Binding
Currying is breaking down a function that takes multiple arguments into a series of functions where each new function takes only one argument.
Let’s say we have a simple add function. The curried version of this function takes one argument, num1. It returns a function that takes another argument, num2, and returns the sum of num1 and num2:
function add(num1, num2) {
  return num1 + num2;
}

add(4, 6); // returns 10

function curriedAdd(num1) {
  return function(num2) {
    return num1 + num2;
  };
}

const add5 = curriedAdd(5);
add5(3); // returns 8
The bind method returns a curried function with a terser syntax.
function add(num1, num2) {
  return num1 + num2;
}

const add5 = add.bind(null, 5);
add5(3); // returns 8
So bind is incredible, but it is slow in older Node versions. In Node 8.3+, bind is much faster, and you can use it without worrying about any performance hits.
Experiments
Several experiments have been conducted to compare the performance of Node 6 to Node 8 on a high level. Note that these were conducted on Node 8.0 so they don’t include the improvements mentioned above that are specific to Node 8.3+ thanks to its V8 release 6 upgrade.
Server-rendering time in Node 8 was 25% less than in Node 6. In large projects, the number of server instances could be reduced from 100 to 75. This is astonishing. Running a suite of 500 tests in Node 8 was 10% faster. Webpack builds were 7% faster. In general, the results showed a noticeable performance boost in Node 8.
Node 8 Features
Speed wasn’t the only improvement in Node 8. It also brought several handy new features—perhaps most importantly, async/await.
Async/Await in Node 8
Callbacks and promises are usually used to handle asynchronous code in JavaScript. Callbacks are notorious for producing unmaintainable code. They’ve caused mayhem (known specifically as callback hell) in the JavaScript community. Promises rescued us from callback hell for a long time, but they still lacked the cleanliness of synchronous code. Async/await is a modern approach that allows you to write asynchronous code that looks like synchronous code.
And while async/await could be used in previous Node versions, it required external libraries and tools—for example, extra preprocessing via Babel. Now it’s available natively, out of the box.
I’ll talk about some cases where async/await is superior to conventional promises.
Conditionals
Imagine you are fetching data and you will determine whether a new API call is needed based on the payload. Have a look at the code below to see how this is done via the “conventional promises” approach.
const request = () => {
  return getData().then(data => {
    if (!data.car) {
      return fetchForCar(data.id).then(carData => {
        console.log(carData);
        return carData;
      });
    } else {
      console.log(data);
      return data;
    }
  });
};
As you can see, the code above looks messy already, just from one extra conditional. Async/await involves less nesting:
const request = async () => {
  const data = await getData();
  if (!data.car) {
    const carData = await fetchForCar(data.id);
    console.log(carData);
    return carData;
  } else {
    console.log(data);
    return data;
  }
};
Error Handling
Async/await lets you handle both synchronous and asynchronous errors with a single try/catch. Let’s say you want to parse JSON coming from an asynchronous API call. One try/catch block can handle both parsing errors and API errors.
const request = async () => {
  try {
    // JSON.parse throws synchronously on invalid JSON; a rejected getData()
    // throws asynchronously. A single try/catch handles both.
    const payload = JSON.parse(await getData());
    console.log(payload);
  } catch (err) {
    console.log(err);
  }
};
Intermediate Values
What if a promise needs an argument that should be resolved from another promise? This means the asynchronous calls must be performed in series.
Using conventional promises, you might end up with code like this:
const request = () => {
  return fetchUserData()
    .then(userData => {
      // userData must stay in scope for the next call, so the chain nests.
      return fetchCompanyData(userData).then(companyData => {
        return fetchRetiringPlan(userData, companyData);
      });
    })
    .then(retiringPlan => {
      console.log(retiringPlan);
      return retiringPlan;
    });
};
Async/await shines in this case, where chained asynchronous calls are needed:
const request = async () => {
  const userData = await fetchUserData();
  const companyData = await fetchCompanyData(userData);
  const retiringPlan = await fetchRetiringPlan(userData, companyData);
  return retiringPlan;
};
Async in Parallel
What if you want to call more than one asynchronous function in parallel? In the code below, we’ll wait for fetchHouseData to resolve, then call fetchCarData. Although each of these is independent of the other, they are processed sequentially. You will wait two seconds for both APIs to resolve. This is not good.
function fetchHouseData() {
  return new Promise(resolve => setTimeout(() => resolve("Mansion"), 1000));
}

function fetchCarData() {
  return new Promise(resolve => setTimeout(() => resolve("Ferrari"), 1000));
}

async function action() {
  const house = await fetchHouseData(); // Wait one second
  const car = await fetchCarData(); // ...then wait another second.
  console.log(house, car, " in series");
}

action();
A better approach is to process the asynchronous calls in parallel. Check the code below to get an idea of how this is achieved in async/await.
async function parallel() {
  const houseDataPromise = fetchHouseData();
  const carDataPromise = fetchCarData();
  const house = await houseDataPromise; // Wait one second for both
  const car = await carDataPromise;
  console.log(house, car, " in parallel");
}

parallel();
Processing these calls in parallel means you wait only one second for both calls to resolve.
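Another common way to express the same thing is Promise.all, which resolves once every promise passed to it has resolved. Here is a minimal sketch reusing fetchHouseData and fetchCarData from above:
// Both fetches start immediately; the single await resolves when both finish.
async function parallelWithPromiseAll() {
  const [house, car] = await Promise.all([fetchHouseData(), fetchCarData()]);
  console.log(house, car, " in parallel via Promise.all");
}

parallelWithPromiseAll();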
New Core Library Functions
Node 8 also brings some new core functions.
Copy Files
Before Node 8, to copy files, we used to create two streams and pipe data from one to the other. The code below shows how the read stream pipes data to the write stream. As you can see, the code is cluttered for such a simple action as copying a file.
const fs = require('fs');

const rd = fs.createReadStream('sourceFile.txt');
rd.on('error', err => {
  console.log(err);
});

const wr = fs.createWriteStream('target.txt');
wr.on('error', err => {
  console.log(err);
});
wr.on('close', () => {
  console.log('File copied');
});

rd.pipe(wr);
In Node 8, fs.copyFile and fs.copyFileSync are new approaches to copying files with much less hassle.
const fs = require("fs");

fs.copyFile("firstFile.txt", "secondFile.txt", err => {
  if (err) {
    console.log(err);
  } else {
    console.log("File copied");
  }
});
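The synchronous variant is even shorter. A minimal sketch (the file names are placeholders):
const fs = require("fs");

// copyFileSync throws if the copy fails, so wrap it in try/catch when that matters.
try {
  fs.copyFileSync("firstFile.txt", "secondFile.txt");
  console.log("File copied");
} catch (err) {
  console.log(err);
}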
Promisify and Callbackify
util.promisify converts a regular function into an async function. Note that the function passed in should follow the common Node.js callback style: It should take a callback as the last argument, i.e., (error, payload) => { ... }.
const { promisify } = require('util');
const fs = require('fs');

const readFilePromisified = promisify(fs.readFile);
const file_path = process.argv[2];

readFilePromisified(file_path)
  .then(text => console.log(text))
  .catch(err => console.log(err));
As you can see, util.promisify has converted fs.readFile into an async function.
On the other hand, Node.js also comes with util.callbackify. It is the opposite of util.promisify: It converts an async function into a Node.js callback-style function.
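As a minimal sketch (the greeting function here is made up for illustration):
const { callbackify } = require('util');

// A hypothetical async function...
async function getGreeting(name) {
  return `Hello, ${name}`;
}

// ...converted into a Node.js error-first-callback-style function.
const getGreetingCallback = callbackify(getGreeting);

getGreetingCallback('Toptal', (err, greeting) => {
  if (err) {
    console.log(err);
  } else {
    console.log(greeting); // Expected output: Hello, Toptal
  }
});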
destroy Function for Readables and Writables
The destroy function in Node 8 is a documented way to destroy/close/abort a readable or writable stream:
const fs = require('fs');

const file = fs.createWriteStream('./big.txt');
file.on('error', errors => {
  console.log(errors);
});

file.write(`New text.\n`);
file.destroy(['First Error', 'Second Error']);
The code above creates a new file named big.txt (if it doesn’t already exist) containing the text “New text.”
The Readable.destroy and Writable.destroy functions in Node 8 emit a close event and an optional error event: destroy doesn’t necessarily mean anything went wrong.
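For example, here’s a minimal sketch of aborting a readable stream and observing the close event (the file name is a placeholder and is assumed to exist, e.g., the big.txt created above):
const fs = require('fs');

const readable = fs.createReadStream('./big.txt');

readable.on('close', () => {
  console.log('Stream closed'); // Emitted by destroy() even with no error.
});

readable.on('error', err => {
  console.log(err);
});

// Abort the stream early; no 'error' event fires because no error is passed.
readable.destroy();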
Spread Operator
The spread operator (aka ...) worked in Node 6, but only with arrays and other iterables:
const arr1 = [1, 2, 3, 4, 5, 6];
const arr2 = [...arr1, 9];
console.log(arr2); // Expected output: [1, 2, 3, 4, 5, 6, 9]
In Node 8, objects also can use the spread operator:
const userCarData = {
  type: 'ferrari',
  color: 'red'
};

const userSettingsData = {
  lastLoggedIn: '12/03/2019',
  featuresPlan: 'premium'
};

const userData = {
  ...userCarData,
  name: 'Youssef',
  ...userSettingsData
};

console.log(userData);
/* Expected output:
{
  type: 'ferrari',
  color: 'red',
  name: 'Youssef',
  lastLoggedIn: '12/03/2019',
  featuresPlan: 'premium'
}
*/
Experimental Features in Node 8 LTS
Experimental features are not stable; they may be changed, deprecated, or removed over time. Do not use any of these features in production until they become stable.
Async Hooks
Async hooks are an API for tracking the lifetime of asynchronous resources created inside Node.
Make sure you understand the event loop before going further with async hooks. This video might help. Async hooks are useful for debugging async functions. They have several applications; one of them is error stack traces for async functions.
Have a look at the code below. Notice that console.log is itself an asynchronous operation, so calling it inside an async hook callback would trigger more hooks and cause infinite recursion; fs.writeSync is used instead.
const asyncHooks = require('async_hooks');
const fs = require('fs');
const init = (asyncId, type, triggerId) => fs.writeSync(1, `${type} \n`);
const asyncHook = asyncHooks.createHook({ init });
asyncHook.enable();
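With the hook enabled, creating any asynchronous resource afterward triggers the init callback. As a quick check, you could append something like the following to the same file (the exact resource type names printed vary by Node version):
setTimeout(() => {}, 100); // Should print something like "Timeout"
process.nextTick(() => {}); // Should print something like "TickObject"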
Watch this video to learn more about async hooks. In terms of a Node.js guide specifically, this article helps demystify async hooks through an illustrative application.
ES6 Modules in Node 8
Node 8 now supports ES6 modules, enabling you to use this syntax:
import { UtilityService } from './utility_service';
To use ES6 modules in Node 8, you need to do the following (a minimal example follows this list):
- Add the --experimental-modules flag to the command line.
- Rename file extensions from .js to .mjs.
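As a minimal sketch, a file named main.mjs (the name is arbitrary) that imports a built-in module with ES module syntax would look like this and be run with node --experimental-modules main.mjs:
// main.mjs (run with: node --experimental-modules main.mjs)
import fs from 'fs';

console.log(typeof fs.readFile); // Expected output: function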
HTTP/2
HTTP/2 is the latest update to the not-often-updated HTTP protocol, and Node 8.4+ supports it natively in experimental mode. It’s faster, more secure, and more efficient than its predecessor, HTTP/1.1. And Google recommends that you use it. But what else does it do?
Multiplexing
In HTTP/1.1, the server could only send one response per connection at a time. In HTTP/2, the server can send more than one response in parallel.
Server Push
The server can push multiple responses for a single client request. Why is this beneficial? Take a web application as an example. Conventionally,
- The client requests an HTML document.
- The client discovers resources needed from the HTML document.
- The client sends an HTTP request for each required resource. For example, the client sends an HTTP request for each JS and CSS resource mentioned in the document.
The server-push feature makes use of the fact that the server already knows about all those resources. The server pushes those resources to the client. So for the web application example, the server pushes all resources after the client requests the initial document. This reduces latency.
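To make this concrete, here is a rough sketch of server push using Node’s built-in http2 module. The certificate paths are placeholders (see the setup section below), and the exact pushStream callback signature changed across early Node 8.x releases, so treat this as illustrative rather than definitive:
const http2 = require('http2');
const fs = require('fs');

const server = http2.createSecureServer({
  cert: fs.readFileSync('./ssl/server.crt'), // placeholder paths
  key: fs.readFileSync('./ssl/server.key')
});

server.on('stream', (stream, headers) => {
  if (headers[':path'] === '/') {
    // Push /style.css before the client even asks for it...
    stream.pushStream({ ':path': '/style.css' }, (err, pushStream) => {
      if (err) throw err;
      pushStream.respond({ ':status': 200, 'content-type': 'text/css' });
      pushStream.end('body { color: green; }');
    });

    // ...then respond to the original request for the HTML document.
    stream.respond({ ':status': 200, 'content-type': 'text/html' });
    stream.end('<link rel="stylesheet" href="/style.css"><h1>Pushed!</h1>');
  }
});

server.listen(3000);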
Prioritization
The client can set a prioritization scheme to determine how important each required response is. The server can then use this scheme to prioritize the allocation of memory, CPU, bandwidth, and other resources.
Shedding Old Bad Habits
Since HTTP/1.1 did not allow multiplexing, several optimizations and workarounds arose to compensate for slow page and file loading. Unfortunately, these techniques increase RAM consumption and delay rendering:
- Domain sharding: Requests were spread across multiple subdomains so that connections could be dispersed and processed in parallel.
- Concatenation: CSS and JavaScript files were combined to reduce the number of requests.
- Sprite maps: Image files were combined to reduce the number of HTTP requests.
- Inlining: CSS and JavaScript were placed directly in the HTML to reduce the number of connections.
Now with HTTP/2, you can forget about these techniques and focus on your code.
But How Do You Use HTTP/2?
Most browsers support HTTP/2 only over a secure TLS (SSL) connection. This article can help you configure a self-signed certificate.
Add the generated .crt and .key files to a directory called ssl. Then, add the code below to a file named server.js.
Remember to use the --expose-http2 flag on the command line to enable this feature. Note that Node flags go before the script name, i.e., the run command for our example is node --expose-http2 server.js.
const http2 = require('http2');
const path = require('path');
const fs = require('fs');

const PORT = 3000;

const secureServerOptions = {
  cert: fs.readFileSync(path.join(__dirname, './ssl/server.crt')),
  key: fs.readFileSync(path.join(__dirname, './ssl/server.key'))
};

const server = http2.createSecureServer(secureServerOptions, (req, res) => {
  res.statusCode = 200;
  res.end('Hello from Toptal');
});

server.listen(PORT, err =>
  err
    ? console.error(err)
    : console.log(`Server listening to port ${PORT}`)
);
Of course, Node 8, Node 9, Node 10, etc. still support the old HTTP/1.1, so the official Node.js documentation on a standard HTTP transaction won’t become stale for a long time. But if you want to use HTTP/2, you can go deeper with this Node.js guide.
So, Should I Use Node.js 8 in the End?
Node 8 arrived with performance improvements and with new features like async/await, HTTP/2, and others. End-to-end experiments have shown that Node 8 is about 25% faster than Node 6. This leads to substantial cost savings. So for greenfield projects, absolutely! But for existing projects, should you update Node?
It depends on whether you would need to change much of your existing code. This document lists all Node 8 breaking changes if you’re coming from Node 6. Remember to avoid common issues by reinstalling all of your project’s npm packages using the latest Node 8 version. Also, always use the same Node.js version on development machines as on production servers. Best of luck!
Understanding the basics
What is Node.js?
Node.js is an open-source JavaScript runtime that runs on various platforms like Windows, Linux, and Mac OS X. Node.js is built on Chrome’s V8 JavaScript engine.
What is Node.js used for?
Node.js is used for running JavaScript on the server. Like other runtimes and languages, Node.js can handle files on the server, communicate with databases, generate dynamic page content, build RESTful API services, etc.
Why use Node.js?
Node.js is superior in handling real-time two-way communication and runs asynchronous code out of the box. This helps create highly scalable, high-performance applications. Also, front-end developers who already know JavaScript do not need to learn another language.
What does LTS stand for?
LTS stands for “long-term support.” In this case, Node.js 8 is in the LTS phase, meaning it’s among the most stable version choices.