Node.js 18 Released With Improved Security, Fetch API, and Next-10 Strategic Initiatives

Node.js 18 is available now! It adds multiple key features for enterprises and for small and medium-sized businesses, including increased security support and the Fetch API, and it is part of delivering on the larger Next-10 strategic initiative within Node.js, which is pushing forward key priorities such as modernizing HTTP and keeping Node.js at the forefront of web development.

As part of increased security support, Node.js has been announced as the first pilot open source community to be supported by OpenSSF’s Alpha-Omega Project. Alpha-Omega is committing $300k to bolster the Node.js security team and vulnerability remediation efforts through the rest of 2022, with a focus on supporting better open source security standards and practices.

“The Node.js team continues to do fantastic work. The open governance structure for Node.js has led to tangible improvements in security and forward-thinking planning, and the main features of Node.js 18 will be highly valuable to enterprises of all sizes,” said Robin Ginn, OpenJS Foundation executive director. “Whether you’re a new user or already have Node.js broadly implemented, now’s a good time to install and test Node.js 18.”

Following its long-established release schedule, Node.js 18 is a Current release, which means now is the right time for enterprises to test it, before it becomes suitable for production use when it is promoted to long-term support (LTS) in October 2022.

“The Node.js project contributors and collaborators continue to do an excellent job, and I want to thank them all. We continue to improve and grow, and I believe Node.js is a real open source success story,” said Bethany Griggs, Node.js Technical Steering Committee member, and Senior Software Engineer at Red Hat. “As always, current releases, like Node.js 18, are the perfect time to test in your own unique development environment. If you’re a Node.js user, please try out Node.js 18 and give us feedback. Your feedback directly contributes to our ability to move new features into stable releases more quickly.” 

For comprehensive information on specific Node.js features, see the Node.js team release announcement written by the Node.js project contributors: LINK

There are three key reasons to evaluate and upgrade to Node.js 18: Security, APIs, and Future Planning.

Security

This is the first release line that will later be promoted to LTS with OpenSSL 3.0. OpenSSL 3.0 is a major new stable version of the popular and widely used cryptography library. OpenSSL contains an open-source implementation of the SSL and TLS protocols, which provide the ability to secure communications across networks. Among other key features, OpenSSL 3.0 contains a FIPS module that has been submitted for validation. The Federal Information Processing Standards (FIPS) are a set of requirements enforced by the US government that govern cryptographic usage in the public sector. This is a key step forward for cryptographic support in Node.js.
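
You can check which OpenSSL version a given Node.js binary was built against directly from JavaScript (a minimal sketch; the exact version string depends on the build):

// Prints the OpenSSL version this Node.js binary was compiled against,
// e.g. a 3.0.x string on Node.js 18 builds.
console.log(process.versions.openssl);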

The Node.js project follows a well-planned security release process, with regular outbound communications. Over the last year, the project has also formalized its security release steward rotation: companies commit to take slots in the rotation so that the individuals who act as security stewards have the support and recognition from their employer needed to prioritize security releases.

APIs

Node.js 18 brings even tighter alignment between front-end and back-end APIs. One of the key premises of Node.js is that JavaScript skills can be applied to the back-end. With Node.js 18, Fetch is globally available by default. The Fetch API provides an interface for fetching resources, including across the network. It will seem familiar to anyone who has used XMLHttpRequest, but the new API provides a more powerful and flexible feature set.

“Node.js 18 will enable the Fetch API as a default. It’s been available since Node.js 17, but this moves forward Node.js application development, and it’s exciting to be a part of the process of improving Node.js in key fundamental areas,” said Michaël Zasso, Scientific research software engineer and co-founder at Zakodium, member of the Node.js Technical Steering Committee. “I would like to thank multiple team members and contributors, and in particular I would like to thank users who push us and support us. Thank you!”

XMLHttpRequest has long been used by web developers to enable Ajax and a whole new kind of interactivity. However, it has gradually been succeeded by the Fetch API, which is Promise-based and provides a cleaner, more concise syntax.
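
As a quick illustration (a minimal sketch, not from the release notes; run it as an ES module so top-level await is available, and note that the global fetch is still marked experimental in Node.js 18):

// Node.js 18: fetch() is available as a global, no require/import needed.
const res = await fetch('https://nodejs.org/dist/index.json');
if (!res.ok) throw new Error(`Request failed with status ${res.status}`);
const releases = await res.json();
console.log(releases[0].version); // e.g. the most recent published release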

Future Planning

The Next-10 effort has elevated technical priorities that have led to discussions around modernizing HTTP. The purpose of the Next-10 project is to work collaboratively on the strategic directions for the next 10 years of Node.js. The Fetch API is one direct result of this process. The full Next-10 repository is available here: https://github.com/nodejs/next-10

Node.js Training and Certification

The OpenJS Node.js Services Developer (JSNSD) and OpenJS Node.js Application Developer (JSNAD) certifications are available now. Node.js training courses are available to help you prepare for the exams: Node.js Application Development (LFW211) and Node.js Services Development (LFW212). Discounts are available to members!

OpenJS Resources

Visit the OpenJS Foundation website to learn more about how you could be a part of the OpenJS Foundation and to view additional resources.

About OpenJS Foundation

The OpenJS Foundation is committed to supporting the healthy growth of the JavaScript ecosystem and web technologies by providing a neutral organization to host and sustain projects and collaboratively fund activities for the benefit of the community at large. The OpenJS Foundation is currently home to 39 open source JavaScript projects, including Appium, Dojo, Electron, jQuery, Node.js, and webpack. It is supported by 30 corporate and end-user members, including GoDaddy, Google, IBM, Intel, Joyent, Microsoft, and Netflix. These members recognize the interconnected nature of the JavaScript ecosystem and the importance of providing a central home for projects which represent significant shared value. 

About the Linux Foundation
Founded in 2000, the Linux Foundation is supported by more than 1000 members and is the world’s leading home for collaboration on open source software, open standards, and open hardware. Linux Foundation projects like Linux, Kubernetes, Node.js, and more are considered critical to developing the world’s most important infrastructure. Its development methodology leverages established best practices and addresses the needs of contributors, users, and solution providers to create sustainable models for open collaboration. For more information, please visit their website.

Insights from the 2022 Fastify Survey

Survey finds web framework for Node.js is fast and performant

After the challenges that so many have faced the last few years, we really appreciate that so many Fastify users found the time to complete this survey. Fastify is a web framework for Node.js to help increase web performance, and is hosted by the OpenJS Foundation. Our community is everything, and getting feedback like this helps us to maintain and improve Fastify.

We had 108 respondents participate in this year's survey, and, surprising to no one, the majority work in software or development roles or are studying in those areas.

Most respondents found Fastify through GitHub, where Fastify’s repos reside. It is encouraging to see that Fastify is also being discovered through other mediums, which indicates a growing awareness of Fastify that will hopefully lead to more adoption and activity around the framework!

Where Fastify Excels

As the name of the framework implies, Fastify is fast and performant. Respondents overwhelmingly cited this as the reason why they enjoy using Fastify in their projects. Other positives included:

  • Active development and community
  • Documentation
  • Ease-of-use and simplicity
  • Growing plugin ecosystem
  • Support for modern JavaScript features
  • TypeScript support
  • Matteo Collina’s involvement as lead maintainer

These helpful features and benefits have been the result of years of work by maintainers and contributors alike, so we’d like to extend a big thank you to everyone who has been involved with Fastify! Also, thanks to Matteo Collina (Twitter: @matteocollina) for being so charming that he brings all the users to the yard.

Respondents also indicated a high level of satisfaction with Fastify across the board.

Where Fastify Can Improve

We know that “perfection isn’t attainable but if we pursue it we can attain excellence.”  So we asked what improvements users would like to see in future releases.

Requests for additional TypeScript support far outweighed any other, with requests for more examples and clarity in documentation ranking second in priority. 75% of respondents stated they use TypeScript, so this is not surprising!

V4 of Fastify resolves some of these grievances, with new automatic types for query strings and bodies, fixes for missing types, typed decorators, and more. One of the responses requested support for hot reloading, which is currently being explored through tooling like restartable!

The overall sentiment of this feedback has been positive. As Fastify progresses into V4 and beyond, we hope that we can remove the pain points users have highlighted in their responses, whilst continuing to provide the high performance users have come to expect!

Many thanks to Alix Robinson and Matteo Collina for preparing the survey. This article was brought to you by Frazer Smith.

OpenJS In Action: How Wix Applied Multi-threading to Node.js and Cut Thousands of SSR Pods and Money

Author: Guy Treger, Sr. Software Engineer, Viewer-Server team, Wix

Background:

At Wix, as part of our sites' Server-Side-Rendering architecture, we build and maintain the heavily used Server-Side-Rendering-Execution platform (aka SSRE).

SSRE is a Node.js-based, multipurpose code-execution platform that is used for executing React.js code written by front-end developers all across the company. Most often, these pieces of JS code perform CPU-intensive operations that simulate activities related to site rendering in the browser.

Pain: 

SSRE has reached a total traffic of about 1 million RPM, at times requiring far more production Kubernetes pods than we considered acceptable to serve it properly.

This made us face an inherent, painful mismatch:
On one side, the nature of Node.js, an environment best suited for running I/O-bound operations on its single-threaded event loop. On the other, the extremely high traffic of CPU-oriented tasks that we had to handle as part of rendering sites on the server.

The naive solution we started with clearly proved inefficient, causing ever-growing pains in Wix's server infrastructure, such as having to manage tens of thousands of production Kubernetes pods.

Solution: 

We had to change something. The way things were, all of our heavy CPU work was done by a single thread in Node.js. The straightforward thing that comes to mind is to offload the work to other compute units (processes/threads) that can run in parallel on hardware with multiple CPU cores.

Node.js already offers multi-processing capabilities, but for our needs this was overkill. We needed a lighter solution that would introduce less overhead, both in terms of resources required and in overall maintenance and orchestration.

More recently, Node.js introduced what it calls worker threads. This feature became stable in v14 (LTS), released in October 2020.

From the Node.js Worker-Threads documentation:

The worker_threads module enables the use of threads that execute JavaScript in parallel. To access it:

const worker = require('worker_threads');

Workers (threads) are useful for performing CPU-intensive JavaScript operations. They do not help much with I/O-intensive work. The Node.js built-in asynchronous I/O operations are more efficient than Workers can be.

Unlike child_process or cluster, worker_threads can share memory.
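
To make the shared-memory point concrete, here is a minimal, self-contained sketch (not from the Wix codebase; the file name is hypothetical) in which a worker increments a counter that the main thread reads from the same SharedArrayBuffer:

// shared-counter.js (hypothetical) - run with: node shared-counter.js
const { Worker, isMainThread, workerData } = require('worker_threads');

if (isMainThread) {
  const shared = new SharedArrayBuffer(4);           // 4 bytes = one Int32
  const counter = new Int32Array(shared);
  const worker = new Worker(__filename, { workerData: shared });
  worker.on('exit', () => {
    // The worker wrote to the very same memory the main thread sees.
    console.log(Atomics.load(counter, 0));           // 1
  });
} else {
  const counter = new Int32Array(workerData);
  Atomics.add(counter, 0, 1);                        // increment the shared counter
}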

So Node.js offers native support for threading that we could use, but since it is still quite new, it lacks a bit of maturity and is not entirely smooth to use in the production-grade code of a critical application.

What we were mainly missing was:

  1. Task-pool capabilities
    What does Node.js offer?
    One can manually spawn Worker threads and maintain their lifecycle oneself, e.g.:
const { Worker } = require("worker_threads");

// Create a new worker
const worker = new Worker("./worker.js", { workerData: { /* … */ } });

worker.on("exit", exitCode => {
  console.log(exitCode);
});


We were reluctant to spawn our Workers manually, make sure there were enough of them at any given time, re-create them when they died, implement various timeouts around their usage, and handle the other things that are generally expected of a multithreaded application.

  2. RPC-like inter-thread communication
    What does Node.js offer?
    Out of the box, threads can communicate with each other (e.g. the main thread and its spawned workers) using an async messaging technique:
// Send a message to the worker
aWorker.postMessage({ someData: data })

// Listen for a message from the worker
aWorker.once("message", response => {
  console.log(response);
});

Dealing with messaging can really make the code much harder to read and maintain. We were looking for something friendlier, where one thread could asynchronously “call a method” on another thread and just receive back the result.
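
In practice that means hand-rolling something like the following sketch (the helper name is ours, not from the Wix code); a real implementation would also need message ids to correlate concurrent calls and clean-up of leftover listeners:

// A bare-bones promisified round trip over postMessage.
function callWorker(worker, payload) {
  return new Promise((resolve, reject) => {
    worker.once('message', resolve);   // assumes the next message is our reply
    worker.once('error', reject);
    worker.postMessage(payload);
  });
}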

We went on to explore and test various open source packages around thread management and communication.


Along the way we found some packages that solve both the threadpool problem and the elegant RPC-like task execution on threads. A popular example was piscina. It looked all nice and dandy, but there was one showstopper. 


A critical requirement for us was to have a way for our worker threads to call some APIs exposed back on the main thread. One major use-case for that was reporting business metrics and logs from code running in the worker. Due to the way these things are widely done in Wix, we couldn’t directly do them from each of the workers, and had to go through the main thread.

So we dropped these good looking packages and looked for some different approaches. We realized that we couldn’t just take something off the shelf and plug it in.

Eventually, we reached a setup we were happy with.

We mixed and wired the native Workers API with two great open source packages:

  • generic-pool (npmjs) – a very solid and popular pooling API. This one helped us get our desired thread-pool feel.
  • comlink (npmjs) – a popular package mostly known for RPC-like communication in the browser (with the long-existing JS Web Workers). It recently added support for Node.js Workers. This package made inter-thread communication in our code look much more concise and elegant.

The way it looks now is along the lines of the following:

import { Worker } from 'worker_threads';
import * as genericPool from 'generic-pool';
import type { Pool } from 'generic-pool';
import * as Comlink from 'comlink';
import nodeEndpoint from 'comlink/dist/umd/node-adapter';

// OurWorkerThread, ModuleExecutionWorkerApi and diagnosticApis are our own
// application types/objects, defined elsewhere.
export const createThreadPool = ({
    workerPath,
    workerOptions,
    poolOptions,
}): Pool<OurWorkerThread> => {
    return genericPool.createPool(
        {
            create: () => {
                // Spawn a native worker thread and expose the main thread's
                // diagnostic APIs to it over a Comlink endpoint.
                const worker = new Worker(workerPath, workerOptions);
                Comlink.expose({
                    ...diagnosticApis,
                }, nodeEndpoint(worker));

                // Wrap the worker so its exposed API can be called like
                // regular async methods from the main thread.
                return {
                    worker,
                    workerApi: Comlink.wrap<ModuleExecutionWorkerApi>(nodeEndpoint(worker)),
                };
            },
            destroy: ({ worker }: OurWorkerThread) => worker.terminate(),
        },
        poolOptions,
    );
};


And the usage at the web-server level:

const workerResponse = await workerPool.use(({ workerApi }: OurWorkerThread) =>
    workerApi.executeModule({
        ...executionParamsFrom(incomingRequest)
    })
);

// … Do stuff with response
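
For completeness, the worker side is not shown above; a rough sketch of what it could look like, mirroring the import style of the snippet above and assuming the same executeModule API, is:

// worker.js (hypothetical) - the code that runs inside each worker thread.
import { parentPort } from 'worker_threads';
import * as Comlink from 'comlink';
import nodeEndpoint from 'comlink/dist/umd/node-adapter';

const workerApi = {
    async executeModule(executionParams) {
        // ... perform the CPU-heavy rendering work here and return plain,
        // clonable data (no functions) back to the main thread.
        return { ok: true, executionParams };
    },
};

// Expose the API so the main thread's Comlink.wrap() call can invoke it.
Comlink.expose(workerApi, nodeEndpoint(parentPort));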

One major takeaway from the development journey was that retrofitting worker threads onto existing code is by no means straightforward. Logic objects (i.e. JS functions) cannot be passed back and forth between threads, so considerable refactoring is sometimes needed. Clear, concrete, pure-data-based APIs for communication between the main thread and the workers must be defined, and the code should be adjusted accordingly.
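
For example (a hypothetical snippet, not from the Wix code), trying to post a function to a worker fails immediately, because structured cloning only handles plain data:

const { Worker } = require('worker_threads');

const worker = new Worker('./worker.js');                // hypothetical worker file
worker.postMessage({ id: 1, params: { page: 'home' } }); // fine: plain data is cloned
worker.postMessage({ render: () => '<div/>' });          // throws DataCloneError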

Results:

The results were amazing: we cut the number of pods by over 70% and made the entire system more stable and resilient. A direct consequence was cutting much of Wix’s infra costs accordingly.

Some numbers:

  • Our initial goal was achieved – total SSRE pod count dropped by ~70%.
    Correspondingly, RPM per pod improved by 153%.
  • Better SLA and a more stable application – 
    • Response time p50 : -11% 
    • Response time p95: -20%
    • Error rate decreased even further: 10x better
  • Big cut in total direct SSRE compute cost: -21%

Lessons learned and future planning:

We've learned that Node.js is indeed also suitable for CPU-bound, high-throughput services. We managed to reduce our infra management overhead, but this modest goal turned out to be overshadowed by much more substantial gains.

The introduction of multithreading into the SSRE platform has opened the way for follow-up research of optimizations and improvements:

  • Finding the optimal number of CPU cores per machine, possibly allowing for non-constant size thread-pools.
  • Refactoring the application to make workers do work that is as close to pure CPU as possible.
  • Researching memory sharing between threads to avoid a lot of large-object cloning.
  • Applying this solution to other major Node.js-based applications in Wix.