Beyond Browsers: The Long-Term Future of JavaScript Standards

How the scope of ECMAScript is changing as JavaScript matures, and where the opportunities for adding new features are coming from.
Jul 3rd, 2023 11:51am

It’s a testament to how important the web is that JavaScript, by many rankings the most popular and widely used language, has emerged from browsers and become useful far beyond its initial platform. But the ECMAScript standard for the language has continued to be driven primarily by the needs of browsers. Even in new environments, like serverside and embedded JavaScript runtimes, those runtimes are still using the JavaScript engines from browsers — though their requirements are often rather different.

The first feature that’s coming into the language from those serverside runtimes — async context — also turns out to be useful in browsers, and even for building some browser features. That might mean the focus of new JavaScript standards extends beyond the browsers that have tended to dominate the language’s evolution: at the very least, we can expect more collaboration and compatibility between serverside runtimes.

JavaScript Beyond the Browser

One of the more ambitious ideas for JavaScript is to standardize not the specification for JavaScript code but the full set of parameters required to create a JavaScript runtime. That used to be something only browser makers did, and they already have their own bytecode specifications. But as serverside JavaScript becomes more common — and the definition of “server” much more varied, from edge and IoT devices to WebAssembly environments — more people are interested in having a JavaScript runtime that suits their own specific needs, which might be about creating the smallest possible runtime.

“Normally we pick a runtime like NodeJS or Deno or the browser and that runtime comes with certain execution assumptions, certain global objects,” explained Fastly’s Guy Bedford, who works on projects like Componentize-js for compiling JavaScript to WebAssembly. Being able to choose specific runtime characteristics, like what global objects are available and how you link them together, would be very helpful for flexibility and efficiency. “If you’re writing a little JavaScript plug-in system for security, you can decide ‘do I want to support npm packages that can work in that environment?’”

That’s a very complex and difficult problem, Bedford warned, because JavaScript developers (and the JavaScript code they write) expect various things to be available, so there needs to be a baseline of what will be supported by a runtime. “The long-term development of this project is fully parameterizing those decisions so that it is like building your own JavaScript runtime out of these primitives.”

These questions about what belongs in JavaScript runtimes, both in and beyond the browser, are one of the reasons the WinterCG (Web-interoperable Runtimes Community Group) started at the W3C last year: to focus on cross-runtime interoperability between browsers and non-browser runtimes, advocating for features that are important to serverside JavaScript environments but may not be as critical for browsers — which are all that the WHATWG charter covers.

Serverside JavaScript runtimes and platforms like Node, Deno, Bun, Cloudflare Workers, Netlify and Vercel are obviously different from browsers, but the Web Platform APIs that browsers have been adding for the last few years are often very relevant there as well, maintains James Snell, an engineer working on Cloudflare Workers and member of the NodeJS technical steering committee. “Yes, Node is a different environment, but you still parse URLs, you still deal with streams, there are still common things.”

In 2016, he implemented the standard URL class for NodeJS and the various runtimes now support more of these standard APIs. But the browser focus at WHATWG and, to some extent, the W3C has meant that the slightly different needs of serverside runtimes aren’t always a priority in API design — especially as the different runtimes all have different goals and requirements (for example, some have full file system access, others have no local file system at all).
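
In rough terms, that portability looks like this: because every runtime implements the same WHATWG URL standard, a snippet like the following (the API endpoint is made up for illustration) behaves identically in browsers, Node, Deno, Bun and Workers, with no polyfill.

```javascript
// URL and URLSearchParams are globals in all the major runtimes, so this code
// ports as-is between browsers, Node.js, Deno, Bun and Cloudflare Workers.
const url = new URL("/v1/users?page=2", "https://api.example.com");

console.log(url.hostname);                  // "api.example.com"
console.log(url.searchParams.get("page"));  // "2"

url.searchParams.set("page", "3");
console.log(url.href);                      // "https://api.example.com/v1/users?page=3"
```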

That disconnect can lead to multiple non-browser runtimes creating different ad hoc solutions for the same functionality, which means developers need to use polyfills to make code run on multiple environments and may face some odd performance issues between different implementations of those standard APIs.

WinterCG isn’t a competing standards body: instead, it’s about coordination — identifying common needs and working through that in standards bodies or elsewhere. “It gives us a venue where it’s not these separate one-off conversations with WHATWG,” Snell explained. “We have a venue now to talk and discuss issues.”

“Getting these runtimes that are doing the same things to do them in the same way benefits everybody, whether it’s part of the language standard or not. There doesn’t need to be a different API for parsing a URL, there doesn’t need to be a different API for sending an HTTP request: it’s a single, consistent way of doing it that works whether you’re in Node or Deno, Workers, Bun or wherever. That’s really where we’re trying to get to.”

This kind of (even unofficial) standardization gives organizations more confidence that they can depend on serverside JavaScript runtimes, which have sometimes been prone to rather erratic governance in the past. “A number of the folks that have wanted to get involved [with WinterCG] are not developing runtimes; they’re the companies that are using the runtimes. They’re saying, we want to build our business on top of this, but we need to make sure that we have continuity, that we have multiple options,” Snell said.

Standards that Suit Serverside

Developers don’t want to write their code multiple times to run on different serverside runtimes, which they have to do today. It slows down development, increases the maintenance load and may put developers off supporting more platforms like Workers, Snell noted. Library and framework creators in particular are unhappy with that extra work. “They don’t want to go through all that trouble and deal with these different development life cycles on these different runtimes with different schedules and having to maintain these different modules.”

That’s a disadvantage for the runtimes and platforms as well as for individual developers wanting to use tools like database drivers that are specifically written for one runtime or another, he pointed out. “We talk to developers who are creating Postgres drivers or MongoDB drivers and they don’t want to rewrite their code to fit our specific API. It would be great if it was an API that worked on all the different platforms.”

WinterCG is trying to coordinate what Ehrenberg calls “a version of fetch that makes sense on servers” (leaving out requirements like CORS that don’t apply to servers), as well as supporting Web APIs like TextEncoder and the setTimeout APIs (which come from web specifications rather than the JavaScript language itself) in a compatible way across runtimes. This isn’t creating a new spec, Snell noted: “We’re starting with the actual fetch spec, and we’re going to line out the parts that don’t apply to servers.”

The idea of the server-specific subset of an API with consistent serverside implementations underpins a bigger WinterCG project. The Minimum Common Web Platform API list is starting fairly modestly by documenting which Web Platform APIs are already implemented in Node.js, Deno, and Cloudflare Workers. But it’s also a first step towards a comprehensive unified API surface that JavaScript developers should be able to rely on in all Web-interoperable runtimes.
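
As a rough sketch of what that common surface already covers, the function below uses only globals from the Minimum Common API list; the endpoint argument and the header name are invented for the example, but in principle the same code should run unchanged on recent versions of Node.js, Deno, Bun and Cloudflare Workers.

```javascript
// Everything used here (fetch, Headers, crypto.randomUUID, TextDecoder) is a
// global on the Minimum Common Web Platform API list, so no runtime-specific
// imports or polyfills are needed.
async function fetchReport(endpoint) {
  const requestId = crypto.randomUUID();            // Web Crypto
  const response = await fetch(endpoint, {          // Fetch API
    headers: new Headers({ "x-request-id": requestId }),
  });
  const bytes = await response.arrayBuffer();
  return new TextDecoder().decode(bytes);           // Encoding API
}
```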

Supporting a subset of what’s available in web browsers in a common way would make building new functionality more affordable and sustainable for both developers and runtime creators, Ehrenberg noted. “Everyone’s talking about React server components or SolidStart or SvelteKit; they all have this common base of some of the code runs in the server and you have an environment that is somewhat rich and has some web APIs.” The common API list would make an ideal base for building that system on, and alternatives would focus their energy on what they want to do differently.

Between this standard base and the API development, Palmer described WinterCG as “promoting universal JavaScript, increasing the set of JavaScript that can work both in the client environment and on the server side, across multiple servers.”

Keeping Track of Distributed Context

A universal JavaScript would include features that didn’t start in the browser. “Sometimes there are small Web APIs that are not in browsers but could be, and have a lot of value for server environments but also have value for browsers, that haven’t been prioritized by browsers,” added Ehrenberg.

The first of those serverside ideas to become a TC39 proposal is async context, a way to simplify what Ehrenberg calls “code which follows other code” to trace what’s happening, by having a variable in common.

“[A user clicking] a button may lead to a fetch of information from the server, followed by processing the information in a worker, and finally writing it onto the screen. To measure how long such a chain of events takes (which is important for optimizing performance), you first have to know which chain you’re in: JavaScript, whether on the server or the client, can juggle multiple things at a time. AsyncContext is the perfect tool for this.”

The async context variable could hold something like an OpenTelemetry span ID. The event listener that you use to track that button click fetches JSON and puts it in the DOM tree. “It would be really nice to know that that fetch took 10 seconds and so it’s very, very slow for the user. That’s all solved by OpenTelemetry,” explained Vercel’s TC39 delegate Justin Ridgewell, one of the champions of the proposal.
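
A minimal sketch of what that might look like with the proposal’s draft API (AsyncContext.Variable with its run() and get() methods); the API is still early and not shipped anywhere, and the button, the render() helper and the /api/data endpoint are placeholders.

```javascript
// Draft-API sketch: the value passed to run() stays readable via get() in every
// continuation triggered inside the callback, including after awaits.
// `button`, render() and "/api/data" are placeholders for the example above.
const span = new AsyncContext.Variable();

button.addEventListener("click", () => {
  span.run(crypto.randomUUID(), async () => {
    const started = performance.now();
    const res = await fetch("/api/data");   // the context flows across this await
    render(await res.json());
    // get() still returns the same span ID, so the timing can be attributed
    // to the click that started this chain of work.
    console.log(`span ${span.get()}: ${performance.now() - started} ms`);
  });
});
```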

This kind of distributed tracing (or adding annotations to logs) is vital for performance monitoring and debugging, Snell pointed out. “There’s so much that can go wrong. Applications today are so complicated and involve so many different moving pieces; unless we have that telemetry there’s no way to know what’s actually happening. When you have a server that is serving millions of requests from different users, all within a single application, the ability to trace a request context through an entire flow while all these different things are happening concurrently is very important.”

Asynchronous functionality in JavaScript, like promises, is key to writing web applications, but it complicates tracing. “The way that promises work in JavaScript hides so much detail. An asynchronous flow using promises fails and I know that a promise failed, but I don’t know where the promise was created. When the promise is rejected, it only tells you where it was rejected. It doesn’t tell you exactly which promise it was and where that came from. We have to go through all these different hacks to try to identify where that promise was created.”
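
A contrived example of the problem (the names are arbitrary, and exact stack contents vary by engine): the error’s stack is captured where it is thrown, inside the timer callback, so nothing in it points back to where the promise was created or awaited.

```javascript
// The rejection surfaces with a stack that starts inside the setTimeout
// callback; the createWork() call site and the await site are not in it.
function createWork() {
  return new Promise((resolve, reject) => {
    setTimeout(() => reject(new Error("boom")), 10);
  });
}

async function main() {
  const pending = createWork();   // creation site: not recorded anywhere
  await pending;                  // failure observed here, origin unknown
}

main().catch((err) => console.error(err.stack));
```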

Async context propagates the context of what’s happening through multiple levels of code — for example, if you have framework-level code that needs to call on user-level code which then calls back into the framework-level code.

“Without an observability mechanism like async context, where you can say ‘here’s an additional piece of data that travels with this thing’ so you have this additional information it’s impossible,” Snell noted. “You can’t have that awareness, that observability, unless you can actually propagate that information.” The current options for doing that are “hacks with lots of holes and lots of ways they don’t work: we need to build it into the runtime”.

Node already has a similar feature, async local storage, which Vercel also implemented in the Next.js Edge Runtime. And there’s a polyfill for Deno — although implementing something that wasn’t a web platform standard API, because popular libraries need it, was slightly controversial for some of the runtimes, highlighting the need for a standards-based approach. Datadog also uses async local storage in their runtime.
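
For reference, the existing Node API looks roughly like this; the request-ID scheme is just an example, but AsyncLocalStorage with run() and getStore() is the real node:async_hooks API.

```javascript
// Each request gets its own store, readable from anywhere in that request's
// async call chain via getStore(), without threading an argument through.
import { AsyncLocalStorage } from "node:async_hooks";
import { randomUUID } from "node:crypto";
import http from "node:http";

const requestContext = new AsyncLocalStorage();

function log(message) {
  const ctx = requestContext.getStore();          // undefined outside a request
  console.log(`[${ctx?.id ?? "no-request"}] ${message}`);
}

http.createServer((req, res) => {
  requestContext.run({ id: randomUUID() }, async () => {
    log(`handling ${req.url}`);                   // tagged with this request's id
    await new Promise((resolve) => setTimeout(resolve, 10));
    log("done");                                  // still the same id after the await
    res.end("ok");
  });
}).listen(3000);
```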

But there can be architectural and performance issues with the approach in Node: async local storage relies on async hooks to track asynchronous operations like HTTP requests through a side channel (because the JavaScript engine and language don’t support this directly), which can add a lot of overhead that bringing the approach into the language would avoid.

“Async context is a minimal subset of async local storage; it’s the core functionality implemented in a much less complicated way,” Snell said.

To help Cloudflare Workers implement async local storage for compatibility with Node (and so it could use it for internal tracing that makes it easier to support customers), the WinterCG group created a portable subset of the API. “This is the subset of async local storage that we can implement in the same way we would implement async context. It doesn’t have all the APIs Node has, the ones that we know will be compatible with async context.”

The async context proposal is designed to coexist with async local storage, Ridgewell told us. “There shouldn’t be any special ability that you get with one you can’t get with the other. This allows you to write the small wrapper class so that if you already have async local storage, you can write the interface for async context and then implement it without implementing the entire thing. You can reuse async local storage, or vice versa. If you already have async context, because it’s provided by the JavaScript API, then async local storage is a small wrapper implementation around async context.”
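
As a sketch of the wrapper Ridgewell describes, an AsyncLocalStorage-style class could be a thin layer over the proposed AsyncContext.Variable. This is hypothetical and covers only the run()/getStore() subset, not the rest of Node’s API (enterWith(), disable() and so on).

```javascript
// Hypothetical shim: Node-style AsyncLocalStorage semantics implemented on top
// of the draft AsyncContext.Variable API.
class AsyncLocalStorageShim {
  #variable = new AsyncContext.Variable();

  run(store, callback, ...args) {
    // Run callback (and its async continuations) with `store` as the value.
    return this.#variable.run(store, callback, ...args);
  }

  getStore() {
    // The value set by the nearest enclosing run(), or undefined.
    return this.#variable.get();
  }
}
```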

Ridgewell expects a massive performance increase from moving to async context from async local storage, because of how the implementation will store the data for each of the concurrent requests your server is handling. “That all gets much, much, much, much faster.”

Clientside Needs Context too

It turns out that serverside runtimes aren’t the only place developers need this functionality. As you might expect for distributed tracing, it’s useful on clients too, for collecting performance information or annotating logs.

Angular’s Zone.js is essentially the same thing as async local storage with a couple of extra APIs, only for clients, Ridgewell pointed out. “It’s the exact same need that is solved by async local storage and async context. React is going to need async context in the future for async functions and async components. Next.js needs async context for its request handling. An OpenTelemetry library requires this, and everything currently has to use Zone.js because it’s the only thing that will work on clients.”

But like the hacks for serverside runtimes, the polyfills that implement this have limitations that developers need to know about and work around. If you use native async/await — native async functions that allow you to wait on a promise — you can’t instrument it. “It skips all possible monkey patching that you could do on a promise, so it’s impossible to use OpenTelemetry or Zone.js client-side implementations,” he explained. There are various ways to transpile native async/await into promises and then monkey patch the Promise.prototype.then() method, but that’s more work for developers. “There are cases where the current client-side can’t work. You have to install a polyfill and that polyfill is flawed. If we can have a real API that covers all of this, then these client use cases are finally possible without these onerous difficulties we have to place on the user code to be aware of the limitations of our implementation.”
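
A rough sketch of the technique and its blind spot; the patching below is the general Zone.js-style approach rather than either library’s actual code, and the /api/data endpoint is a placeholder.

```javascript
// Zone.js-style instrumentation: wrap Promise.prototype.then to observe when
// continuations are scheduled.
const originalThen = Promise.prototype.then;
Promise.prototype.then = function (onFulfilled, onRejected) {
  console.log("continuation scheduled");               // instrumentation hook
  return originalThen.call(this, onFulfilled, onRejected);
};

fetch("/api/data").then((res) => res.json());          // observed by the patch

// A native async function awaits native promises through internal engine
// machinery, so the patched then() is never invoked and this flow goes
// unobserved; that's why these libraries transpile async/await away.
async function load() {
  const res = await fetch("/api/data");                // not observed
  return res.json();
}
```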

Async context will even be useful for features inside browsers: for example, optimizing how resources on a web page are loaded by setting priorities and tracking which tasks depend on those resources with the new Prioritized Task Scheduling API, or keeping track of long-running tasks that might block user input with the new Long Tasks API and letting developers push those tasks into the background so they won’t run until the computer is idle. Browser makers are also very interested in tracking performance for Single Page Applications, which is tricky because tasks create new URLs for the app so the browser back button still works: async context could help with that.

“They need the exact same core primitive capability that async context provides,” Ridgewell noted: “these APIs could be implemented in terms of async context.”

Being useful to the browsers might speed up implementation of async context, which has been in discussion for several years and recently made very swift progress to Stage 1 and then Stage 2. “Future advancement will be a bit slower,” warned Ehrenberg. “We are working on a draft implementation in V8+Chromium and specification updates on both the JS and HTML side, before proposing Stage 3.”

The committee agreement to go ahead with the proposal and the concrete spec explaining how it can be implemented, both required for Stage 3, might happen this year. But getting the implementation and signoff required for Stage 4 will take longer, because to be useful, async context needs to integrate with everything that creates scheduling events — and it needs to be on by default. TC39 can specify how async context will work with promises, because they’re part of the JavaScript specification, but it also needs to work with the setTimeout and setInterval APIs that are part of the HTML and DOM specs, as well as event listeners, the fetch API and many more. “You have to be able to propagate that async context through a setTimeout; when that timer fires later on you have to be able to restore the proper context,” Snell noted, suggesting that the proposal won’t reach Stage 4 until at least next year and maybe later.
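
The setTimeout case would look something like this under the draft proposal (nothing here ships yet; the behavior in the comments is what host integration is meant to guarantee, and the "req-42" value is arbitrary):

```javascript
const requestId = new AsyncContext.Variable();

requestId.run("req-42", () => {
  // The context active when setTimeout is called should be captured...
  setTimeout(() => {
    // ...and restored when the timer fires, even though this callback runs
    // long after run() has returned.
    console.log(requestId.get());   // "req-42"
  }, 1000);
});

console.log(requestId.get());       // undefined: outside any run()
```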

That means WinterCG, TC39 and WHATWG are all key to moving this forward. “We have this entire ecosystem that we need to make aware of this new proposal and get them to adopt it so that we can go from Stage 3 to Stage 4 with a useful set of features,” Ridgewell noted.

But there’s also a lot of motivation for everyone in the ecosystem to pitch in. As soon as Cloudflare Workers enabled their implementation of async context for Vercel, “they were able to replace their hacky workaround that was actually causing us problems internally,” Snell told us. “Enabling the API made things much more reliable.”

Another Approach to Runtime Standards

At this point, it’s not clear whether async context will be a one-off or something that ushers in a new path for features to arrive in JavaScript, because it’s unusual in several ways, Snell noted. “It can be in the web platform or it can be language level: that’s unique about it. It’s one of the first that has a significant ecosystem implementation already in the form of async local storage. It does make for a really interesting example case: if it works it could open the door for maybe standardizing things that are coming more from the ecosystem rather than from the language or the runtime.”

But async context may prove to be a rare example of an API coming from the serverside JavaScript community, Ridgewell suggested, both because the major implementation work done by the browser makers will always have a lot of influence on how the language advances, and because a universal JavaScript would also cover even more environments than browsers and servers.

“I don’t think we’ll see more server-only use cases added to the language as a whole. There are a lot of competing ideas about how we want to evolve the language, and one of the things that we’re concerned with is that the TC39 specification should be useful for all platforms that implement JavaScript. There’s server, obviously, there are clients, obviously — but there are also micro embedded chips that run JavaScript, there are edge functions that are running JavaScript that are not the classic server environment.”

“If the use case is only applicable to servers, I think we’re going to have a really difficult time getting committee consensus on adding a feature, but if we can show that it applies to servers and to another environment, it becomes a little bit easier, because now we have to coordinate how this thing works across the environments.”

For features that don’t get picked up by WHATWG, W3C or TC39, WinterCG can be “the venue where we could still collaborate and push forward on common APIs,” Snell suggested. “Bun hasn’t been too involved in the conversations yet, but we want them to be more involved.”

His hope is that multiple runtimes will implement the fetch subset so that the serverside fetch implementations all work in a consistent way, and that WinterCG can also help standardize a Connect API for outbound TCP sockets. “Node has its net and tls APIs and they’re very specific and they’re very complicated. Deno has its own API, Deno.connect. Bun is doing some stuff where they’re implementing those APIs, but they also have their own Connect.” Workers has its own connect() API that attempts to simplify this, and Cloudflare will contribute it to WinterCG.
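
Today that divergence looks roughly like the sketch below: each snippet only runs in its own runtime, the host and port are placeholders, and the exact option names should be checked against each runtime’s documentation.

```javascript
// Node.js: the net module's connect()
import net from "node:net";
const nodeSocket = net.connect({ host: "db.example.com", port: 5432 });

// Deno: the built-in Deno.connect() API
const denoConn = await Deno.connect({ hostname: "db.example.com", port: 5432 });

// Cloudflare Workers: connect() from the cloudflare:sockets module
import { connect } from "cloudflare:sockets";
const workerSocket = connect({ hostname: "db.example.com", port: 5432 });
```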

“What I’m hoping is we can keep this momentum going. Let’s use WinterCG. We’re going to bring our Connect API to WinterCG and say let’s standardize a TCP API, so we can do this in a consistent way.”

“We can’t underestimate how important it is to the stability of the ecosystem moving forward; it’s not maintainable that we ask everybody to adapt to each individual runtime, especially as the number of runtimes continues to grow. Whether it’s part of the language or not, having these standards is important. Having support for these things, implementing them in a consistent way, it’s something we as an ecosystem have to embrace.”

TNS owner Insight Partners is an investor in: Deno.