proposal-built-in-modules
Merging Web APIs into JavaScript
When we think about "Web APIs", most people think about the DOM. But many Web APIs cover ground that is typically found in other languages' standard libraries. For example, it would make sense for there to be a single core "crypto" library rather than two (WebCrypto and Node.js's built-in "crypto" package).
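A rough sketch of what that duplication forces library authors to write today (the feature detection and hex conversion here are just illustrative):

```js
// Hashing the same bytes with SHA-256 currently requires two different APIs.
const data = new TextEncoder().encode('hello');

async function sha256Hex(bytes) {
  if (globalThis.crypto && globalThis.crypto.subtle) {
    // Browsers (and recent Node): WebCrypto, Promise-based, returns an ArrayBuffer.
    const digest = await crypto.subtle.digest('SHA-256', bytes);
    return [...new Uint8Array(digest)].map(b => b.toString(16).padStart(2, '0')).join('');
  }
  // Older Node.js: the built-in "crypto" module, synchronous and Buffer-based.
  const { createHash } = await import('node:crypto');
  return createHash('sha256').update(bytes).digest('hex');
}
```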
Crypto might not be the perfect example of a Web API that we could consider merging into JavaScript's stdlib, but it does often surprise people just how much of "JavaScript" actually comes from the Web. For example, setTimeout and setInterval are not part of JavaScript itself, and they behave differently inside of Node than in browsers.
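A minimal sketch of that difference (the handle type is host-defined, not part of ECMAScript):

```js
const t = setTimeout(() => {}, 1000);

// Browser: t is a number (a timer id).
// Node.js: t is a Timeout object with extra methods such as t.unref(),
// which lets the process exit without waiting for the timer.
clearTimeout(t); // accepted by both hosts, but the handle types differ
```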
It's clear that more and more people expect to be able to share JavaScript across many different environments. JavaScript developers expect to write code that works both in Node.js and the browser, not to mention all the other environments JS runs in, such as engines built specifically for IoT devices. Library authors have a lot of trouble making their code run anywhere, and will often include handwritten browser ports of Node libraries, which add many KBs of code that needs to be downloaded, parsed, and run.
I think it would make sense for TC39 to work with Web API standards bodies in order to merge some of them into JavaScript itself.
Most Web APIs have been specified with only the needs of browser vendors in mind. The semantics of many of them cannot be matched 1:1 in non-browser environments, and as such they are extremely hard to port. From past experience, it is hard to push Web APIs into Node.js; moving them into the language itself would only cause more friction and problems for the community.
I agree that we need better interoperability, but the chasm is too big to cross. I think what we really need is a clear way to differentiate between language and web libraries, and communicate that to our users. There are going to be differences, because the interaction model is very different, so we should be transparent. Then, provide polyfills to cover those gaps.
@annevk and I have talked about this, and he convinced me that we shouldn't start out this way.
There are a couple of practical issues:
- It is really hard to get a bunch of experts and stakeholders together to collaborate on a standard. Each standards committee has its own unique culture, which seems to work for the people involved to some extent (if it's shipping across browsers, they must have done something right!). Asking efforts to move to a different venue is a rupture in that productive culture, and could inhibit further progress. We may lose some valuable participants, or lose effective decision-making processes that work for that group.
- Historically, when TC39 takes on a shipping web specification, it asks for changes. Partly this happens due to different notational conventions used by different specifications, and partly it's due to a different weighing of priorities and requirements. It is always risky, with respect to web compatibility and convergence across browsers, to ask for changes to existing APIs; we saw TypedArrays take several years before reaching spec compliance, and it's not clear that this effort led to benefits for programmers. If TC39 takes on existing specifications, I'd ask that we start with them as they are, and consider changes after they are adopted as part of the ECMAScript standard.
I think it would be best to start with a collaboration among existing standards bodies, leaving specifications where they are, but documenting and ultimately defining what works across different places JavaScript is used. This is what I am trying to get started in https://github.com/littledan/js-shared-interfaces .
I think OP's point is valid though and it is something we're trying to be better at. Namely, ensuring input from Node.js folks on APIs that make sense to expose outside the context of browsers is taken into account. And there's somewhat active interest in working together on IDL with the long term goal of harmonizing how we define APIs across standards bodies.
So yeah, perhaps not so much venue-swap, but definitely collaboration and harmonization.
we saw TypedArrays take several years before reaching spec compliance, and it's not clear that this effort led to benefits for programmers
I can tell you that this effort did lead to my benefit and I'm a programmer. I wrote an Electron app that works with binary data and TypedArray, ArrayBuffer, etc were all vital to converting this data between frontend and backend.
TypedArray is a perfect example of a great API that works well in the browser and in Node.js (and even in Electron) 😄
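As a sketch of the kind of conversion described above (names are illustrative; Node's Buffer is a Uint8Array subclass, so these views share the same memory):

```js
// Backend (Node/Electron main process): binary data as a Buffer.
const buf = Buffer.from([0xde, 0xad, 0xbe, 0xef]);

// A standard Uint8Array view over the same bytes, usable in any environment.
const view = new Uint8Array(buf.buffer, buf.byteOffset, buf.byteLength);

// And back again for data coming from the frontend/renderer.
const roundTrip = Buffer.from(view.buffer, view.byteOffset, view.byteLength);
```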
Making TypedArrays available across environments was definitely useful. Whether making them throw rather than return undefined when accessed after being detached was useful, I'm not so sure.
Why does the process have to be pulling web specs into TC39's process in order for them to be considered part of the JavaScript "standard library"? Why can't TC39 participate in other standards bodies and help design APIs to work for JavaScript as a whole? JavaScript Web APIs aren't all standardized by one body or one process anyway. There's already a lot of cross-pollination between them. I would like to see us double down on working with Web standards and adopting them across communities.
@jamiebuilds I am in complete agreement. What do you think of https://github.com/littledan/js-shared-interfaces as a way to work towards that?
Given that this is a TC39 repository, how would you feel about following up about standardizing some kind of "JavaScript standard base" in this issue: https://github.com/littledan/js-shared-interfaces/issues/11
As a user, I like the idea of bringing some of the bigger Web APIs to the language itself, like setTimeout/setImmediate/etc. and binary streams, preferably in a form simpler than the WHATWG variant. However, I feel crypto should remain predominantly a host thing. I'd be okay with exposing the basic shell interface, but it should be entirely up to the host what primitives to expose and use, if any.
I also feel like the OP. Putting this into the language makes the JavaScript environment less portable and makes interpreters harder to implement and run in different scenarios (for example, a microcontroller that suddenly needs to support this spec because every second package needs some std: stuff, and that also has to provide storage for the stdlib and whatever that stdlib needs, e.g. kv-storage capability).
It's a browser problem and a browser feature, and therefore it should be a browser API. The fact that a library is already in the browser and doesn't need a download is an odd argument to me, because with caches and public CDNs it's already possible to avoid that loading if necessary.
It also makes my application bundling more complicated again, because to support older browsers I would need to handle two entry points for my application (module, and non-module with polyfills). IMO this could result in a worse experience in the end, because through my app's dependencies I may eventually end up with an app that requires not only kv-storage but also foo-storage, bar-storage, nextbigthing-storage (and maybe in 5 versions each). That might be no problem for the newest shiny browser, but the bundle size for old browsers explodes.
So I don't really get why this can't be fixed in the browser APIs themselves, or why the browser can't provide a reliable and fast storage solution without a std lib.
Last but not least, who defines what goes into that stdlib? Browser vendors? Big players?
for example, a microcontroller that suddenly needs to support this spec because every second package needs some std: stuff, and that also has to provide storage for the stdlib and whatever that stdlib needs, e.g. kv-storage capability
kv-storage may not be needed on microcontrollers, but I think most JSON/string-related APIs would be helpful ([decode/encode]URIComponent, Text[Decoder/Encoder], possibly the URL class, etc.).
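To illustrate the split (a sketch only): some of these already live in ECMAScript, while others are WHATWG specs that an embedded engine would have to ship separately.

```js
const q = encodeURIComponent('a b&c');   // ECMAScript: always available
const obj = JSON.parse('{"x": 1}');      // ECMAScript: always available

const bytes = new TextEncoder().encode(q);           // WHATWG Encoding, not ECMAScript
const url = new URL('https://example.com/?q=' + q);  // WHATWG URL, not ECMAScript
```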
The use case for wanting JS on embedded devices is not general-purpose programming. It's typically for either:
- manipulating JSON data or
- interfacing with the web (requiring url-parsing capabilities).
both of which ultimately are meant to achieve some bigger UX-workflow objective.
One of the things that makes JavaScript a ubiquitous language is that there is a big ecosystem of packages and a very small language core. "Easy" to implement.
If there is now an implicit dependency on the described module mechanism in the respective JS runtime, and the ecosystem starts to adopt modules that are then assumed to always be available (doesn't have to be kv-storage, can be others as you already mentioned), it becomes harder to provide the JS runtime. Because in that case it must be ensured that the module mechanism, and maybe a "baseline" of modules, are available in the target environment. Otherwise a developer would need to watch out that none of her dependencies, or their transitive dependencies, depend on these, because her environment may not support them.
And by the way, I also use JavaScript on microcontrollers to control sensors and actuators. So please let's not pin down concrete use cases here, because we won't catch them all anyway.
@tobmaster I would expect that JS runtimes targeting those would at least let you compile out what you don't use. At the very least, Moddable XS lets you do that for both globals and built-in modules it provides.
Also, it's worth noting that kv-storage is being handled by WHATWG with the intent it could become part of their HTML family of standards, mostly independent of TC39. I'm not affiliated with either group, but I'm pretty sure I've seen affirmative confirmation by both groups stating it's not going to become part of ECMAScript itself. (It's a similar concept to Node's native modules like "util" and "events".)
One of the things that makes JavaScript a ubiquitous language is that there is a big ecosystem of packages and a very small language core.
@tobmaster I'd say JavaScript is smaller than a language should be; even console.log() is not part of JavaScript but of the Web API. I guess most developers would like to use console.log() regardless of what the platform is.
@trotyl Most who aren't developing for embedded, that is. On embedded, you might not even have a console to print to, so it doesn't always make sense to have a console.log.
@isiahmeadows OK, how do I compile kv-storage out of an npm module that depends on it? I honestly cannot see how that can work.
Also, we have two topics here that are already mixed up completely: kv-storage and the "standard library" loading mechanism. And while I have nothing against kv-storage itself, I don't understand why it cannot become a browser API like IndexedDB, localStorage and co. I am also not affiliated with those groups. I just care because JavaScript is more than just a browser language. And indeed, module loading is part of ECMAScript itself, so if someone interferes with that it has influence on all code.
@trotyl A small language core is still preferable in my eyes. console.log is also a browser API and can easily be supported or omitted. It doesn't belong to the standard, you're right. But that's why you can see it's already a problem: everyone uses it for everything. And sure, that's a thing that would otherwise be missing. But it's just output. Other modules are more demanding.
On embedded I can provide console.log and route it somewhere (a TTY or whatever), or I can just accept it and discard it. I've already described why I don't think console is a problem.
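For example, a minimal shim like this, where `uart.write` is a hypothetical host binding:

```js
// The host decides what console.log means: route it to a serial port, or drop it.
globalThis.console = {
  log: (...args) => uart.write(args.join(' ') + '\n'), // route to a serial port
  // log: () => {},                                    // or: accept and discard
};
```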
@tobmaster
OK, how do I compile kv-storage out of an npm module that depends on it? I honestly cannot see how that can work.
Using tools like Rollup, maybe? If it's using std:kv-storage, it's already declaring a DOM dependency, so you probably shouldn't be using it in other places.
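A minimal sketch of what that could look like with Rollup, assuming the library imports the proposed std:kv-storage specifier; marking it external keeps the bundler from trying to resolve it, so hosts that ship it natively use it directly and others can map it to a polyfill:

```js
// rollup.config.js
export default {
  input: 'src/index.js',
  external: ['std:kv-storage'], // leave the built-in module import untouched
  output: { file: 'dist/bundle.js', format: 'esm' },
};
```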
Also, we have two topics here that are already mixed up completely: kv-storage and the "standard library" loading mechanism.
That's not the intent. Look at the second paragraph of this comment - the intent is that it should be a Web API, just exposed in the form of an ES module rather than a global.
@tobmaster @trotyl Just to reiterate, the things that were seriously being suggested for moving into core in the original comment are things like AbortController/AbortSignal (which was really first discussed in TC39 circles) and crypto/Window.crypto, and we do already have a big thing in the standard that came out of Web APIs. But not everyone agrees the effort is a good idea - the next two comments show a heavy level of skepticism around doing it broadly. The goal here is just to see which Web APIs are broadly useful enough to merit being extracted and formed into standard library modules.
Most who aren't developing for embedded, that is. On embedded, you might not even have a console to print to, so it doesn't always make sense to have a console.log.
@isiahmeadows Devices that don't have a console can never have a debugger either. But debugger is still a built-in statement, which becomes a no-op if not applicable. I think (for this specific case) it's more a design issue than a capability issue.
The scope of built-ins could be answered by this simple question: "why, as an embedded dev, do I want to embed JavaScript?"
I suspect the valid reasons are (please correct me):
- because I want the device to interface with the net/web/intra-web
- because I want the device to manipulate JSON data
If neither of the above applies, then you made a mistake choosing JavaScript.
For use case 1, you need a JS engine with the same built-in capabilities as Node.js (and likely sqlite3 as well). This trivially includes console.log.
For use case 2, you don't need a "standard" JS engine; a non-conformant, stripped-down ES5 engine like Duktape is perfectly fine for embedded JSON manipulation.
Another valid reason to choose JavaScript is “for no other reason than I wanted to”, and that’s something we support too. (Not to mention the many, many use cases that have zero to do with the web or with JSON, like robots or blockchain or wearables, etc.)
Not to mention the many, many use cases that have zero to do with the web or with JSON, like robots or blockchain or wearables, etc.
@ljharb, can you elaborate on the robot, blockchain, and wearable use cases that don't deal with net/web or JSON?
Robots: http://johnny-five.io
@nicolo-ribaudo, johnny-five's stated purpose is "Provides a powerful foundation for IoT projects", which is a net/web [UX-workflow] use case. It doesn't make sense to use a "pure" JS engine that lacks Web API capabilities in johnny-five.
It doesn’t make sense to you. It makes perfect sense to others. Please accept that and stop trying to claim that “web” or “json” or “UI” are the exclusive use cases for JavaScript - they are not.
Agree to disagree. UX workflow is the dominant problem faced by the IT industry today. JavaScript, robots, blockchains, and wearables are simply tools/means for achieving UX-workflow solutions, and JavaScript is popular because it's the most efficient language for solving UX-workflow problems.
TC39's effort to decouple JavaScript from the Web APIs essential for solving UX-workflow problems is misguided.
If that’s true, then literally any feature added to JavaScript is eventually intended to solve a UX workflow problem, so either way your argument doesn’t decrease the value or motivation of any feature.
@kaizhu256
johnny-five's stated purpose is "Provides a powerful foundation for IoT projects"
That's marketing copy from a Sparkfun product that happens to be based on Johnny-Five, not Johnny-Five's "stated purpose".
@kaizhu256 Just for reference: if I have a device like https://tessel.io/ or https://www.neonious.com, JS is not only web, JSON, and Web APIs. That's the point I am arguing from. I hope it's clearer to you now.
@tobmaster, those robot kits have standard wifi/ethernet interfaces. Nobody would be interested in JavaScript for robots without web-interface capability.