WRT PR #3 - ES module detection #13

Closed
bmeck opened this Issue Mar 9, 2016 · 291 comments

Collaborator

bmeck commented Mar 9, 2016

This is the place to discuss specifics of how node will determine the mode of a given source (file / source string). Source string examples include stdin or -e; they do not include JS Function or eval calls. The proposal itself can be seen in PR #3.

Discussions here should regard:

  • Ways to detect the module target for a specific source
  • Impact of detecting the module on other environments
    • How other environments will provide the detection in the case of interoperability
  • Potential incompatibilities introduced for existing workflows by module detection possibilities.
  • Transition goals for a time when ES modules are the normal target.

Note: CJS will be the default target of node for the foreseeable future, "normal" here means what developers write daily. This is a constraint that can be easily worked around for a single source via a cli flag to note that the source is an ES module. This flag should not be seen as relevant to the discussion in this thread.

Note: We are only discussing choices that do not require parsing of the source file.

This thread should not discuss import/export conversion, evaluation ordering, or module path resolution.

Is it appropriate to discuss the ".jsm" extension proposal here? Or does it belong elsewhere? Or has that proposal been definitively rejected?

Collaborator

bmeck commented Mar 10, 2016

@jokeyrhyme yes, you can discuss various approaches such as .jsm, package.json, etc. here. .jsm is not rejected, and many still favor it.

I've tried to do a recap of the options for this at https://github.com/nodejs/node/wiki/ES6-Module-Detection-in-Node

Personally, I'm proceeding with the userland solution of editions - with the current implementation of editions, there are import and require syntaxes - https://github.com/bevry/editions/wiki/Syntaxes - that could be used for such detection; a directory property could also be added to each edition if we go that route.


@bmeck thanks for splitting this up

Collaborator

bmeck commented Mar 10, 2016

@balupton lots of those are in the proposal / please avoid putting things that have not been agreed upon in the node wiki as it may confuse people. Feel free to paste all that in here and we will discuss it. I disagree with a lot of pro/con points and discussion should be had on them.

@bmeck the point of the wiki page was to summarise all points, not just those agreed upon, so as to avoid having to read the entire previous thread for such depth; it should be expanded rather than truncated. Discussion should happen here, yes, but I see no wrongdoing in maintaining a comprehensive, up-to-date, concise summary of all points.

jmm commented Mar 10, 2016

I understand that there are a lot of objections to the idea of trying to inspect the source to detect the format, to the point that in the original thread people keep saying it's a dead issue, off the table. Yet it's still mentioned here, so as long as it's still in the mix I think the export {} idea I mentioned should be in the picture.

Something like a "use module" pragma would impose an authoring tax on every file, even though most ES6 modules will already organically include at least one import|export that unambiguously establishes its type. That'd leave probably just a tiny fraction of modules, like those that are only for side-effects, that require authors to opt-in with some explicit indicator of type. Instead of something invented like "use module" that could just be module syntax: export {}. @balupton Per your wiki, with this suggestion in mind I think the implementation side is the huge obstacle, not the authoring side.

Collaborator

bmeck commented Mar 10, 2016

@jmm @balupton updated the wiki, it was missing a fair amount of information. Noted things that caused rejections like the pragma "use module". Also linked to proposal to answer "still unclear" points. Please read the file in the proposal before making new statements.

Collaborator

bmeck commented Mar 10, 2016

warning, in 2 weeks the module proposal for node will adopt the file extension approach unless package.json complexity and workflow concerns are addressed. If those concerns are addressed we will happily delay the decision. I will be confirming with the @nodejs/ctc during this time that this is not a problem as well.

Which one? Or is that still to be determined?


Collaborator

bmeck commented Mar 10, 2016

@isiahmeadows https://github.com/nodejs/node/wiki/ES6-Module-Detection-in-Node#option-4-meta-in-packagejson

Notably:

  • Node allows requiring files that are not part of packages
  • Will persist in perpetuity and have hints of previous Script goal
    • mitigated but present
  • Causes tooling to need to perform JSON parsing per module resolution
    • speed mitigated by cache, parser requirement not
  • File semantics are no longer self contained to the file

As for the options, they differ in the various ways to put things in package.json.

Collaborator

bmeck commented Mar 10, 2016

It should be noted that the file extension has a sizable impact as well. It is not with a happy heart I choose between either.

Also, the package.json thing would be confined to node, and we have several people wanting a simple way to determine goal for non-node projects (many of which do not have package.json).

@bmeck "Is there any plans to support files that use both module systems (aka CJS+ES Modules)?" is specifically about a file that supports both, rather than a package that supports both, here is a contrived example:

import a from 'a'
const b = require('b')

export { b }
module.exports = a

Another phrasing would be: does each file have to be exclusively ES modules, or exclusively node (CJS) modules, and never mix and match in a single file?

This arose once or twice in the other thread, but I couldn't find anything clear about it, hence why it is there, doesn't seem raised in the proposal explicitly - perhaps is implied somewhere and I didn't notice.

Currently with the babel compiling, mixing and matching module systems in individual files is already being done.


great work with the wiki updates! 💃

Collaborator

bmeck commented Mar 10, 2016

@balupton that is not supported, @kittens tried that rabbit hole in babel, it is pure zalgo. If you are in the Module goal there is an exports object as specified in the proposal, but it is a ModuleNamespace. require continues to exist in the Module goal as well. Is there a reason you want a mixed goal in particular?

@bmeck

Is there a reason you want a mixed goal in particular?

Just raised myself because I know of babel compiled projects that do such mix and matching.

I've found myself having such quarrels too, especially with optionally required dependencies (a dep may only need to be required under special circumstances, e.g. for a specific function call which in the majority of runs will never be called), or with dynamic dependencies (e.g. require(somevar)) - both currently work beautifully with require.

I subjectively fear losing the js extension forever. It's iconic. There’s a joke that if you pick a noun there’s going to be a library with a JS appended. If we adopt the extension proposal do we also have to extend the joke to JSM? Some of the technical implications of the extension proposal really scare me, but what about the cultural implications? What about the new developer implications? File extensions are an aesthetic and losing the js aesthetic worries me.

I think @jmm’s suggestion of export {} is a really good idea and I agree with him that parsing should be discussed more. Granted, there are negative impacts, but the negative impacts of the extension and package.json approaches are much worse.

A parsing solution does not impact in-place systems. In-place systems can happily assume CommonJS. An extension approach, however, widely impacts multiple software ecosystems, whereas a parsing solution only affects new code.

How about file.jsm.js or file.esmodule.js - they would offer the same benefits as the extension option without impacting ecosystems as abruptly.

Collaborator

bmeck commented Mar 10, 2016

@balupton as discussed in the original PR thread, that is not how many toolchains work, they only read the final extension (this includes node itself).

@calebmer can you discuss the technical implications. There are many problems with parsing, and polyfills are a very common instance of side-effect only modules.

bmeck referenced this issue in nodejs/node Mar 10, 2016

Closed

Vote on ES module detection #5648

ljharb commented Mar 10, 2016

@calebmer in-place systems being unable to use ES6 modules is a massive impact.

@bmeck ok, and I'll address the side-effect only concern and the other CONs listed in the wiki. But first I'd like to offer three observations:

First, we have to remember that no solution can be ideal. The ideal scenario would be that the ES module specification was around at node's conception so that modules could be supported from the start. That didn't happen, so we now have the tricky task of picking a way to add module support to node whilst not breaking the ecosystem. The parsing solution mirrors most closely what node would look like if there was module support from the start.

Second, theoretically the parsing could be as simple as a regular expression matching export statements. Yes it loses the nuance of a full parser, but a regular expression would be an optimization which could solve for costs on large files and costs for performance critical requires. This also solves for a few other CONs listed on the wiki such as implementation complexity and toolchain detection.

Third, the extension, package.json, or any other approach may be implemented in parallel as an optimization to alleviate some of the costs. If the extension approach were also implemented I'd recommend .jsmodule and .jsscript to toggle the parsing mode.

Now to counter arguments.

Side-effect only modules.

Most of the time a side-effect only module could easily be a script (optionally with a use strict pragma). If for some reason it has to be a module then, as @jmm suggests, the module could add an export {} to the file. Then node could identify the file as a module and treat it accordingly.

The intersection of side-effect only files and files which must be a module is very tiny, and the tax of a single line is insignificant and comparable to use strict pragmas of the past.
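
For illustration, a side-effect-only polyfill opting in this way might look like the following (a hypothetical file; the polyfill body is deliberately simplified):

export {} // exports nothing, but unambiguously marks this file as an ES module

// side-effect only: patch the environment, no bindings exported
if (!Array.prototype.includes) {
  Array.prototype.includes = function (search, fromIndex) {
    // simplified; a real polyfill also handles NaN and negative indexes
    return this.indexOf(search, fromIndex || 0) !== -1
  }
}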

Toolchains require a parser.

Many tools which need to know the difference between a script and a module already have a parser, furthermore many more tools don't even need to tell the difference. For the select few tools which need to know the difference and can't use a small parsing solution from NPM, these tools can use out of band configuration. Take for instance the package.json proposals, these proposals fit well when it comes to specialized tooling.

Furthermore, tools that can't have a simple parser for NPM often aren't in the node ecosystem (I'm thinking of Ruby, Python, or other ecosystem tools). These tools are dealing with client code, not node code.

Large files.

Considering Babel and other parsers currently can work with large files I'm not entirely sure why this is a large concern. If we use a regular expression parser and a parallel file extension like I recommend these pains can also be alleviated.

Solutions may also include:

  • Limiting file size to an outrageous number.
  • Defaulting to CJS if no export statement was found in X characters (this allows the placing of an export {} at the top of a file to coerce the parser).

Performance.

In a normal project, all dependencies will be imported/required at startup time, which is less critical for performance. However, in some scenarios this is not the case. In areas where dynamic requiring must be performant we can look into various algorithm coercing approaches. For current packages we can also look at the package.json: if it specifies a node version from before ES modules, we can default to the script goal.

@ljharb sorry for any miscommunication. I meant in-place systems won't be negatively impacted by a parsing solution, a parsing solution would be neutral. I was referring to the .htaccess and similar configurations argument against the extension approach.

ljharb commented Mar 10, 2016

@calebmer one common use case where a parser won't work is Airbnb's rails app - which uses Sprockets, which sends JS files to an entirely different box for compilation. It will have to know what is an ES6 module or not so it can send the right metadata to the compilation service - but it has no JS engine to do so. Also, perhaps you haven't tried reading JSON in POSIX, but "not needing a parser" is pretty critical in some places :-)

@ljharb A couple questions as I don't entirely understand the Airbnb-rails-sprockets-node relationship. Why can't Sprockets choose to implement its own format (maybe extension)? Why can't the compilation service make the detection? If the detection of modules were as simple as a regular expression would this alleviate concerns?

It seems to me that services, like Sprockets, have a little more freedom than node to make large breaking changes. If Sprockets wanted to assume everything was a module or everything was a script it could.

ljharb commented Mar 10, 2016

@calebmer the problem is that it doesn't want to assume that - we need to be able to gradually migrate our codebase from script to module. We also don't want Sprockets to implement something that wouldn't work with npm test, or all our node-based tools, for example.

Do we need to know which mode we are in prior to parsing?
Which mode-sensitive use cases do not ultimately involve parsing?
Couldn't the airbnb-rails-sprockets-node use case be solved by simply not updating the version of Node.js used until the rest of the required changes (if any) have been made?

jmm commented Mar 10, 2016

I don't have time to respond to everything here right now, but regular expressions are not adequate for detecting this, though like you said a regex or substring search could be useful as a quicker preliminary check to see if import|export is in the source at all before doing an expensive parse, like detective does.

Defaulting to CJS if no export statement was found in X characters (this allows the placing of an export {} at the top of a file to coerce the parser).

I think it's possible that placement in the source could be used to optimize this detection, as I mentioned in my original post:

especially if people are encouraged to place the code early in the source

But making it dependent on that would probably be too fragile, though I did try something like that with browserify for recognizing its own bundles. (That's a much more limited scenario though since it's designed to check its own output.)
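
For reference, the detective-style two-stage check being discussed might look roughly like this - purely an illustrative sketch, using acorn as a stand-in for whatever parser node would embed:

const acorn = require('acorn')

function containsModuleSyntax(source) {
  // cheap preliminary check: if neither keyword appears anywhere in the
  // text, skip the expensive parse entirely (what detective does)
  if (!/\b(import|export)\b/.test(source)) return false
  // full parse to rule out matches inside strings or comments; note that
  // parsing with the Module goal is strict, which is exactly where the
  // script/module ambiguity objections come from
  const ast = acorn.parse(source, { ecmaVersion: 6, sourceType: 'module' })
  return ast.body.some(node => /^(Import|Export)/.test(node.type))
}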

jmm commented Mar 10, 2016

@ljharb

We also don't want Sprockets to implement something that wouldn't work with npm test

What would be an example of that? (Probably makes no difference to the issue at hand, I'm just curious what you mean.)

@jokeyrhyme Yes, because they're different goal symbols and modules are implicitly strict. People might have expanded more in the original thread.

ljharb commented Mar 10, 2016

@jmm just responding to the question "Why can't Sprockets choose to implement its own format" - the answer being, because one-off snowflakes are not ideal for being cohesive with the rest of the ecosystem/toolchain.

@balupton as discussed in the original PR thread, that is not how many toolchains work, they only read the final extension (this includes node itself).

Isn't that exactly why file.jsm.js is awesome? It is exactly because everything (with few exceptions) just cares about the final extension, making it that for everything it is just business as usual - unless of course they are aware of the .jsm.js convention, in which case they can specifically opt-in to the special handling of it, which is optional and up to them, without forcing anything - which seems to be exactly the point.

Consider the impact of file.jsm:

  1. Developer opts in to using it by specifying it directly
  2. Developer must update his development toolchain configuration to understand that .jsm is a JavaScript file, e.g. Atom.io syntax highlighting, configuring build toolchain to now use .jsm files, likely with several pull requests to tools in order to do so
  3. Consumers may also be impacted due to require('package/something.js') calls.
  4. Other possibly unknown "js" extension coupling side effects are also present, as what @calebmer describes here #13 (comment)

Consider the impact of file.jsm.js

  1. Developer opts in to using it by specifying it directly
  2. All of the developer's existing tooling works exactly the same, no troubles at all
  3. Consumers may also be impacted due to require('package/something.js') calls
  4. Other possibly unknown "js" extension coupling side effects are not present, as "js" extension is still used

The only toolchain that needs to be aware of the file using ES Modules, in this use case and scope, is node... Using file.jsm makes everything abruptly aware of that, even when they have no need to, even with unintended consequences - breaking syntax highlighting, requiring .htaccess files to change, etc. Using file.jsm.js means business as usual for everything, except node. That to me is very powerful.

It seems forgotten that outside of node people already use ES Modules successfully with the .js extension; our solution should not impose on their existing success and conventions. The .jsm.js approach minimises the impact on everything that already works well, with as little cost as needed, while achieving all the benefits of node knowing the format.

Happy to be linked to the places this has already been discussed in case I have missed something. But it seems it solves all of @calebmer's objections to the jsm extension in #13 (comment) and works well with his points here:

Many tools which need to know the difference between a script and a module already have a parser, furthermore many more tools don't even need to tell the difference. For the select few tools which need to know the difference and can't use a small parsing solution from NPM, these tools can use out of band configuration. Take for instance the package.json proposals, these proposals fit well when it comes to specialized tooling.

Furthermore, tools that can't have a simple parser for NPM often aren't in the node ecosystem (I'm thinking of Ruby, Python, or other ecosystem tools). These tools are dealing with client code, not node code.

Given the sprockets argument... they can just adopt the .jsm.js detection and be done with it - same cost as the .jsm extension, less cost than implementing a parser or package.json sniffer, it seems.

Again, the beauty of .jsm.js is everything would just be business as usual, everything would continue to work as is, without breaking anything, and things that do not yet have a detection method and actually do care about it, can just opt in to the .jsm.js detection, or a parser algorithm, or whatever they decide is actually best - without node forcing anything on them as the .jsm extension does.

@ljharb why can't babel be used until a reasonable amount of the codebase has been converted? Also a comment like // @module could solve this problem. It can signal to Sprockets a different build mode and doesn't break npm test.

I'm not convinced that there is a big enough need in tooling to know the difference between a script/module and further I'm not sure the "one-off" snowflake detection is a bad thing considering most build tools already have a snowflake configuration format.

@balupton

Isn't that exactly why file.jsm.js is awesome? It is exactly because everything (with few exceptions) just cares about the final extension, making it that for everything it is just business as usual

That's actually a really cool point, but it still has negative precedent setting impacts. What happens when developers start writing file.jsm.jsx, or other people start adding their own sub-extensions file.jsm.a.b.c.js? Does node care about the ordering? Would file.a.b.jsm.c.jsx still parse as a module? What about a jsm.js file?

What happens when developers start writing file.jsm.jsx, or other people start adding their own sub-extensions file.jsm.a.b.c.js? Does node care about the ordering? Would file.a.b.jsm.c.jsx still parse as a module? What about a jsm.js file?

Good points. Possible ways they could be addressed.

  1. Detection could either:
    1. Care for the .jsm extension anywhere, e.g. filename.split('.').indexOf('jsm') > 0
      1. This supports things like file.jsm.jsx and file.jsm
      2. Let's call this the .jsm decorator proposal
    2. Care for only the .jsm.js extension, e.g. filename.substr(-7) === '.jsm.js'
      1. This rules out things like file.jsm.jsx and file.jsm
      2. Let's call this the .jsm.js extension proposal
  2. Loading require('./index') could either:
    1. Do for ( const extension in require.extensions ) { /* check if "${path}.jsm${extension}" exists, otherwise check if "${path}${extension}" exists */ }
      1. However most extensions will not benefit from this, e.g. .jsm.json and .jsm.coffee, so seems absurd for the cost it introduces
    2. Add .jsm and .jsm.js to highest preference in require.extensions
      1. If .jsm.jsx also wishes to be a default, it could also add .jsm.jsx to require.extensions as easily as require.extensions['.jsm.jsx'] = require.extensions['.jsx'] - however, the need for custom extensions and jsm coupling here seems quite the exception

Point 1.i is nice as it can do a "business as normal" approach to custom extensions like jsx too, something impossible with the .jsm proposal. Point 2.ii is nice and simple compared to 2.i, while still allowing custom extensions the ability to opt-in to default loading, something impossible with the .jsm proposal.
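
To make the two detection variants concrete, they amount to roughly the following checks (illustrative only):

// 1.i - the .jsm decorator proposal: .jsm anywhere after the basename;
// matches file.jsm, file.jsm.js, file.jsm.jsx (but not a file literally named jsm.js)
function hasJsmDecorator(filename) {
  return filename.split('.').indexOf('jsm') > 0
}

// 1.ii - the .jsm.js extension proposal: only the exact trailing pair;
// matches file.jsm.js but rules out file.jsm and file.jsm.jsx
function hasJsmJsExtension(filename) {
  return filename.slice(-7) === '.jsm.js'
}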

@calebmer

//@module seems a lot like the use module; pragma discussed elsewhere. Either option may still be necessary to clarify weird edge cases, such as files with unclear modes. (ruled out below)

@balupton

My own experience with the .jsx extension is that libraries and tools I used were updated to support it very quickly. Some of the slower moving libraries and tools were simply abandoned. Surely we could expect this with a .jsm final extension, too, without needing .jsm.js.

Regarding sprockets, if there is demand for .jsm then I'm sure it'll be updated quickly.

Regarding .htaccess and other HTTP server MIME type settings, etc, I imagine we'll still be deploying ES5 code to production for a long time yet, even with HTTP/2. So surely we have enough lead time for the necessary server software to be updated.

Assuming multiple proposals for mode-flagging are not incompatible, why don't we just do everything and see which ones the community embraces? Surely we could enhance Node.js to do the following:

  1. if .jsm extension, treat as a module
  2. if //@module or 'use module'; pragma, treat as a module (ruled out below)
  3. if some other mode-flag experiment is detected, treat as a module
  4. out-of-band configuration (package.json, node CLI flags, etc)
  5. else, treat as a script

If it's hard to get consensus around a single approach, is it possible to implement the top 2 or top 3?

caridy commented Mar 11, 2016

guys, please, stop thinking about the pragma detection. we have exhausted that conversation via different channels, the number one goal of ES Modules is portability, and not all environments have the ability to analyze the code before it is parsed and evaluated like node can, and implementers will NOT support a parser that mixes modules and sloppy mode scripts. please, stop it, and focus on realistic solutions.

Collaborator

bmeck commented Mar 11, 2016

Generally file extensions are contained to 3 letters ( https://en.wikipedia.org/wiki/8.3_filename ) and do not include a . inside of them. I did some scraping and could only find 1 other IANA file extension that includes an internal . : .1905.1 . So apparently it is possible to register one, but it would be a very odd duck and lead people to think .jsm.js is using some kind of transform just like .tar.gz is a tar file that has been gzipped. Will need to think on this, it doesn't look pretty, but may be viable.

@ljharb you might have opinions on this?

jmm commented Mar 11, 2016

@caridy

guys, please, stop thinking about the pragma detection. we have exhausted that conversation via different channels [...] please, stop it, and focus on realistic solutions.

That's what I thought from the original thread, but it's not remotely clear that this issue which was opened 2 days ago is saying that. This issue appears to invite that discussion by specifically referencing determining the mode given a source string and alluding to the same drawbacks to that concept that you're referencing, that were brought up in the original thread. If that's just your personal opinion, ok then. Otherwise please clarify the messaging.

Collaborator

bmeck commented Mar 11, 2016

@jmm it is dead. no parsing. it was dead in the original PR comments as well. The edge cases mentioned and added burden for toolchains is too much.

jmm commented Mar 11, 2016

@bmeck Please clarify that in the OP here. This appears to invite that discussion.

caridy commented Mar 11, 2016

Narrowing down the use case

The whole discussion here is due to the fact that we might have components that are mixing CJS and ES Modules, and our estimate (from a few of us) is that this is actually a very minor use-case. To help with that, let's try to describe when you need to mix.

The NO cases

When we talk about mixing CJS and ES modules, we are explicitly talking about code in the context of the node runtime, and that does NOT include:

i. files used by a runner can have an out-of-band configuration when invoking node (e.g. node --module ./path/to/file.js)
ii. files that are consumed by a tool that enforces a certain format (e.g. tests that are evaluated in the context of the test runner, etc.).
iii. supportive files that are not intended to be required by other pkgs; in fact, those files are probably not in the npm pkg itself.
iv. an ES Module that contains require() or module.exports for some reason or another.
v. fat packages to support other runtimes (a package that contains a transpiled version of the original source for old versions of node, for the browser, for nashorn, for bower, etc.); the reason this is not important is that transpilation is a mechanical process that can produce out-of-band configuration for each generated file.
vi. files without package.json. this is a very edge case, and if your code is not supposed to be shared, you probably have full control over how those are going to be digested by the node runtime.

We might have other cases that fit into this bucket, but I think you get the idea. You, as the author of the pkg, have full control over how the file is used, and when it is used, it is easy to solve with an out-of-band configuration, and we should not care much about that for now.

The cases

1. Gradual transition to ES Modules

In the middle of a refactor, developers might end up in a situation where they have part of the module using ES Module format, but still using CJS for some pieces of the package.

In this case, there is another important question: will those modules be in the same folder structure?

note: I haven't seen evidence of mixing files in the same folder when transpiling with Babel and co.

2. Missing capabilities in ES Modules

ES Modules are supposed to be a superset of CJS, but at this early stage we might have missing capabilities, things that can only be achieved when using CJS; this will force early adopters to keep some CJS modules in order to achieve certain tasks that would otherwise be impossible using the ES Module format only.

The problem

The problem we are trying to solve here is how to signal the format of those files that should be parsed and evaluated by the node runtime, while trying to avoid a huge tax on authors. And as of today, we have two buckets on the table:

i. detection by path (a decorator on the filename, a decorator on a folder name or a custom extension)
ii. out-of-band configuration in package.json (e.g.: "module": "path/to/module.js")

Next steps

  • To determine how common the case is where you will have mixed formats in the same folder.
  • To expand on the out-of-band configuration to describe the workflow (WIP).
  • To assess the cost of using an out-of-band configuration for very complex packages mixing formats.
Collaborator

bmeck commented Mar 11, 2016

vi. files without package.json. this is a very edge case, and if your code is not supposed to be shared, you probably have full control over how those are going to be digested by the node runtime.

About half of the times I run node I don't have a package.json; that is not an edge case. It also doesn't cover places that use files for config outside of your dir tree like ~/.app/config.js. Please stop calling it a very edge case. I will disagree heavily on this, and with increasing vigor.

Collaborator

bmeck commented Mar 11, 2016

note: I haven't seen evidence of mixing files in the same folder when transpiling with Babel and co.

Most likely because this is green code; the real problem is large existing code combined with the "Missing capabilities in ES Modules" - in particular with externally mutable exports, circular dependencies, and top level await.

@caridy

guys, please, stop thinking about the pragma detection. we have exhausted that conversation via different channels, the number one goal of ES Modules is portability, and not all environments have the ability to analyze the code before it is parsed and evaluated like node can, and implementers will NOT support a parser that mixes modules and sloppy mode scripts. please, stop it, and focus on realistic solutions.

  • Portability: ES modules are portable now for people using Babel. With a parsing solution the only change which needs to be made is to stop Babel from transpiling import/exports (along with any other changes in the require algorithm). And I'm not convinced an extension or package.json solution maintains portability.
  • Analysis: How many environments really need to know the difference in the exact same way node does? As I've mentioned before, I think the intersection of environments without a parser and tools which need to know the difference in the same way as node is small. For the small intersection what @ljharb describes as "snowflake" solutions should be sufficient considering these tools already have their own unique configuration formats.
  • Mixing: I don't entirely understand this argument, it may be a blocker but I can't be sure at the moment.

The argument for a parsing solution is that it is the best out of many bad solutions. Edge-cases for a parser can be solved, performance hits can be optimized, and there are adequate alternatives for the tooling which needs it. Compare this to the far-reaching negative technological and cultural impacts of jsm and the complete isolation of certain use cases with the package.json approach.

I want to reiterate that the parsing solution mirrors most closely what modules would look like in node if implemented from the beginning. This is important for developer experience and for our eventual migration to an ES module only ecosystem with the lack of any vestigial structures.

Collaborator

bmeck commented Mar 11, 2016

@calebmer the 2 parsers have ambiguities if done on the same source text, that is the edge case. It cannot be solved. Tribal knowledge to unroll ambiguity is not acceptable. End of discussion on parsing from a technical perspective. As @ljharb mentions parsing is prohibitive to toolchains. End of discussion for ecosystem. Do not continue discussion on parsing, if you wish to please comment on the original PR but it cannot produce ambiguity and it must be non-prohibitive to toolchains at the very minimum.

caridy commented Mar 11, 2016

@calebmer

And I'm not convinced an extension or package.json solution maintains portability.

elaborate

How many environments really need to know the difference in the exact same way node does?

all

aside from that, I recommend you look into all TC39 notes related to ParseModule and ParseScript, then talk to implementers and get feedback from them about a unified parsing process that mixes sloppy mode, strict mode and module semantics all together; after all that, if you think it is still doable, let us know! :)

@caridy

detection by path (a decorator on the filename, a decorator on a folder name or a custom extension)

What is a decorator here?

@zenparsing

What is a decorator here?

I believe this refers to something like .jsm being somewhere in the file path. "Extension" is a very specific place in the file path (the end).

@caridy

I apologise for my part in continuing the pragma discussion.

As for file path decorators and out-of-band settings (package.json, node CLI flag, etc):

Why don't we have both?

We could wait until Node 8 to deprecate the less popular one, and Node 10 to drop it, if we had evidence that a winner had finally emerged.

RE: out-of-band: this does seem to be what the browser folks will be offering, with <script type="module">. A node CLI flag would cover cases where a glob in a package.json wasn't appropriate.

caridy commented Mar 11, 2016

@zenparsing decorators:

foo.m.js or foo.jsm or path/m/foo.js or anything else that, by looking at the path, can hint what parser should be used.

@jokeyrhyme asking for both is fair enough; in fact, we have discussed that briefly in the past. I asked for a mechanism that allows pkg authors to hook into the loading mechanism to specify what parser to use per file. This mechanism does not exist today; the only mechanism that exists today is a loader extension, which affects the entire process. We will need an artifact, per pkg and/or folder, that can be used by node: whenever a file inside that folder structure needs to be inspected, a function call of some sort will be executed, passing the path, and as a result it returns the type. Something along those lines, which means people will likely create abstractions for the cases where they mix CJS and ES. It might be worth exploring.
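
For the sake of discussion, such a per-pkg hook might be shaped roughly like this - every name here (the "moduleType" field, the function signature) is hypothetical:

// package.json of a pkg mixing formats (hypothetical field):
//   { "name": "mixed-pkg", "moduleType": "./module-type.js" }

// module-type.js - the loader would call this with a path relative to the
// pkg root and use the returned value to pick the parser for that file
module.exports = function moduleType(relativePath) {
  if (relativePath.indexOf('lib/esm/') === 0) return 'module'
  return 'script'
}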

Collaborator

bmeck commented Mar 11, 2016

@jokeyrhyme all solutions will require the cli flag, which is why there is a note in the issue head about it not being terribly relevant; it is only there to patch a specific case, it does not work as an interop at scale.

@caridy Gotcha.

It feels like both solution candidates have unresolvable issues:

  • package.json requires the file system to be "marked up" with flags, and for the platform to traverse the file system in order to gather those flags.
  • .jsm requires cultural acceptance of (what boils down to) deprecating .js for new code.

Seems like a stalemate. Maybe it's time to take another look at the default.js solution? ; )

ljharb commented Mar 12, 2016

@zenparsing not so much a stalemate if concerns about the package.json approach aren't addressed :-/ it will be much better for everyone if this has consensus, but consensus isn't necessarily required to force the issue.

@ljharb Understood, but forcing the issue would prolly be not so good. : )

Anyway, at the risk of being annoying here's an example of a "fat package" when using the default.js approach: https://github.com/zenparsing/zen-observable

Regarding my earlier proposal: I've added a note in it to verbosely explain that 1.i would still support file.jsm, as well as file.jsm.js, and file.jsm.jsx, considering recent terminology — it can be considered a .jsm filename decorator, rather than a .jsm filename extension

@jokeyrhyme

@balupton

My own experience with the .jsx extension is that libraries and tools I used were updated to support it very quickly. Some of the slower moving libraries and tools were simply abandoned. Surely we could expect this with a .jsm final extension, too, without needing .jsm.js.

Regarding sprockets, if there is demand for .jsm then I'm sure it'll be updated quickly.

Regarding .htaccess and other HTTP server MIME type settings, etc, I imagine we'll still be deploying ES5 code to production for a long time yet, even with HTTP/2. So surely we have enough lead time for the necessary server software to be updated.

I fail to see how this argument is suitable. My point was that the .jsm decorator proposal places the minimum burden on the ecosystem, while hitting all the goals. Your point seems to be that while the .jsm extension doesn't place the minimum burden on the ecosystem, it may not be that bad from your experience. I fail to see how arguing against a point of "minimum burden" with "more burden" makes sense.

@zenparsing

.jsm requires cultural acceptance of (what boils down to) deprecating .js for new code.

Not correct, my earlier suggestion means .jsm can be used without deprecating the .js extension, and without the need for a package.json modification, and without the need to force immediate tooling changes.

my earlier suggestion means .jsm can be used without deprecating the .js extension

Who wants to type in .jsm.js or even .m.js for all of their files? I certainly don't, and it amounts to roughly as much of a developer burden as a per-file pragma.

Who wants to type in .jsm.js or even .m.js for all of their files? I certainly don't, and it amounts to roughly as much of a developer burden as a per-file pragma.

Then just use file.jsm instead of file.jsm.js, that too is supported by the .jsm decorator proposal.

However, if the developer's environment, tooling, server, whatever doesn't yet support file.jsm directly, then typing file.jsm.js would be the less costly solution for them, the one they want to do.

It would be the one I want to do, as it would work with all my existing tooling without any changes to them.

If you support both .jsm.js and .js, then in addition to the (rather large) problems that each faces individually, you also add the burden of useless choice. The "either-or" in this case is therefore actually worse.

It would be the one I want to do, as it would work with all my existing tooling without any changes to them.

I'll make the stronger claim that only a minuscule number of developers will want to use such a long extension.

If you support both .jsm.js and .js

Do you mean "If you support both .jsm.js and .jsm" ?

I'll make the stronger claim that only a minuscule number of developers will want to use such a long extension.

Aesthetic choices are only possible if they are available. If their situation doesn't allow file.jsm they have no choice but to use file.jsm.js.

ljharb commented Mar 12, 2016

@balupton using .jsm.js would not work as cleanly with require.extensions, nor with Rails' Sprockets, both of which only look at the last extension.

Member

domenic commented Mar 17, 2016

.m.js seems better than .jsm.js.

@domenic I was thinking the same, but I was a little hesitant to mention anything with multiple extensions because there seemed to be some practical objections with that in the first place. (Not that tools couldn't be fixed, but that's another deal.)

@balupton using .jsm.js would not work as cleanly with require.extensions, nor with Rails' Sprockets, both of which only look at the last extension.

These sound like much easier solvable problems than the alternatives.

Collaborator

bmeck commented Mar 21, 2016

@balupton @isiahmeadows @domenic my fear is that this means any tool that only parses the last .[^.]+ characters to grab the file extension will need patching. I am unsure how many servers are currently doing this and whether this would affect any of the major ones. I am ok with patching node to match this, but this is not a common case for how file extensions are declared and it may also have heavy ecosystem impact.

@bmeck I get that. To be perfectly honest, I'm more of a bystander in this discussion, anyways. As long as I'm not having to add a "use this new, horribly-complicated-to-implement module syntax"; at the top of every single file using ES6 modules, I'm good.

(By the way, I do like ES6 modules. Don't get me wrong. 😉)

Owner

ChALkeR commented Apr 4, 2016

@bmeck
So what about the exact extension? What alternatives do we have atm? I remember someone also mentioned .es, for example.

I'd like to offer a "why not both?" solution (see jokeyrhyme's last comment).

Analogously to the existing package.json main key (which defaults to index.js when a folder is require'd but has no package.json, or the existing package has no defined main key), I propose that whatever glob key would be defined would have a default value that is essentially **/*.[some module extension].

This means that a package author may define a glob, if needed or desired, but may also use modules without defining a custom glob by following the default glob, which is essentially some chosen file extension.

Having read through this thread, I hope I didn't miss any major technical shortcomings or disqualifications. Here goes, as JS pseudocode, for clarity and specificity:

import path from 'path'

// assuming these exist, are performant, cached as appropriate, etc.
import { findPackageForPath, loadPackage } from 'core'
import { satisfiesGlob } from 'glob-utils'

// not critical what these are ultimately defined to be
import { MODULE_GLOB_KEY, DEFAULT_MODULE_EXT } from 'bikeshed'

const DEFAULT_GLOB = `**/*.${DEFAULT_MODULE_EXT}`

/**
 * Determine if a file at a given path is an ES module, and should
 * be parsed as such, without inspecting the file's contents.
 * @param  {string}  absolutePath - full resolved path to file (i.e. /my/app/path/cool-module/index.js)
 * @return {Boolean} - true iff file at path should be parsed + treated as module
 */
export function isPathModule(absolutePath) {
  let packagePath = findPackageForPath(absolutePath)
    , packageObject

  if (packagePath != null) {
    packageObject = loadPackage(packagePath)
  } else {
    // implicit package at folder (a la index.js)
    packagePath = path.dirname(absolutePath)
  }

  let glob

  if (packageObject != null && MODULE_GLOB_KEY in packageObject) {
    // use package-defined glob, when available
    glob = packageObject[MODULE_GLOB_KEY]
  } else {
    // use default
    glob = DEFAULT_GLOB
  }

  return satisfiesGlob(absolutePath, glob, {
    relativeTo: packagePath,
  })
}

In cases where infrastructure cannot parse JSON, application authors simply need to respect the chosen default convention. In cases where a new file extension would cause immediate burden, a custom glob may be defined.

I think this covers the concerns raised in this thread without being too complicated or potentially expensive at runtime, assuming the package locations/objects and glob definitions can be cached per-directory during process startup and module load.

Again, apologies if I missed anything while reading through the thread that would disqualify any or all of this. 😅

Owner

Fishrock123 commented Apr 4, 2016

In case it didn't propagate to here, at nodejs/node#5866 most of the CTC preferred a module extension (discussed was .jsm).

Owner

ChALkeR commented Apr 4, 2016

@Fishrock123 I don't think that we settled on the specific extension, though.

Collaborator

bmeck commented Apr 4, 2016

correct, tc39 met last week and I was holding off on any discussion until I get the meeting notes.

Owner

Fishrock123 commented Apr 4, 2016

Correct, we didn't settle on an extension, but we did discuss the .m.js thing two or so weeks ago and preferred .jsm to that iirc.

Owner

ChALkeR commented Apr 4, 2016

*.jsm has the downside that it's already used for JS modules that are not ES modules: https://developer.mozilla.org/en-US/docs/Mozilla/JavaScript_code_modules.

Those are also present in npm packages.

On the other hand, *.es is audio/echospeech, but it's very old, unregistered, and is probably not present anywhere (or is it?).

Collaborator

bmeck commented Apr 4, 2016

@ChALkeR looks like most of the .jsm are coming from source-map or specific dist for firefox addons.

$ grep source-map < jsm.txt | xargs basename | sort | uniq
SourceMap.jsm
Utils.jsm
prefix-source-map.jsm
prefix-utils.jsm
suffix-source-map.jsm
suffix-utils.jsm
$ grep -v source-map < jsm.txt | wc -l
      83

So looks like ~89 are in npm, and those source-map files are not in the current source-map repo

Owner

ChALkeR commented Apr 5, 2016

@isiahmeadows Wow. Thanks, I missed that!

Alternatively, .esm for ECMAScript Module, if there are backwards-compatibility concerns with other suggestions. /shrug
I doubt we'd have much confusion or compatibility issues within our own community regarding Elder Scrolls files that also use the .esm extension. :P

ljharb commented Apr 5, 2016

.m.js is a nonstarter imo for reasons already discussed, but .es, .esm, .jsm, are all fine.

@ChALkeR Welcome 😄

I think we should go for the .jsm extension. It's very simple and clear, and I don't see the problem with existing tooling that detects .js extension files. If they won't support the .jsm extension they probably won't parse the file correctly either way.

Collaborator

bmeck commented Apr 15, 2016

.jsm apparently can have access to sensitive information within Firefox if loaded via a file:// url, taking it out of the running. .mjs can be proposed instead and keeps js.

Due to the prolific use of JS in project names keeping js leans me away from .es culturally: DailyJS, Node.js, Cylon.js, Three.js, DynJS, VorlonJS, etc. That said a new trend of referring to ES, ESNextNews, ES6, ES2015 makes this less of a concern for me.

RegExp and Glob lean slightly towards .es: \.[ej]s works both with basic globbing and regular expressions as a single pattern; .mjs cannot be easily represented in globbing alone since there is no optional operator for the m.

I do have concerns as well if .es is chosen as people will snatch up .es TLDs. Which may have some impact, but I don't know if that should even be considered when talking about this.

Updating EPS to be .mjs and re-submitting that it become a draft.

jmm commented Apr 15, 2016

For globbing *.{m,}js seems to work for me in Bash. Not sure how universal it is though.

caridy commented Apr 15, 2016

wow, we are getting down the rabbit hole with the extension proposal.

btw, our timeline to share the counter proposal should be ready early next week.

Collaborator

bmeck commented Apr 15, 2016

@caridy rabbit hole?

Collaborator

bmeck commented Apr 15, 2016

In the event of a counter proposal, we will still need to verify any assumptions / impact it may have even outside of node itself. Such scrutiny will be applied to any direction taken. Support for other tooling / environmental implications / developer experience / potential exploitation are not to be considered a rabbit hole.

Owner

Fishrock123 commented Apr 15, 2016

I don't really see why something locally in a web browser should affect us; where will that confirmably overlap?

Once <script type="module"> takes off, people may start using Node idioms on both ends to differentiate traditional scripts from newer ES6 modules, which could become relevant for servers.


Collaborator

bmeck commented Apr 15, 2016

@Fishrock123 what @isiahmeadows said and also the fact that some test runners do in-browser testing generally over the file:// protocol: https://mochajs.org/#running-mocha-in-the-browser

+1 to renaming the language MichaelJacksonScript.

@bmeck I'm not sure how relevant file:// vs http(s):// is. IIRC the protocol used wouldn't make any difference in how it's interpreted. Could you elaborate?

Collaborator

bmeck commented Apr 20, 2016

@isiahmeadows please see the note in the EP ( https://github.com/bmeck/node-eps/blob/es6-module/002-es6-modules.md#reason-for-decision ), Firefox loads .jsm files differently when you double click them, open them via cli, etc.

So for applications & packages written in ES+Babel now with .js extension, what is their expected upgrade path (dev & CI environments) to use native ES6 modules in node?

Also, for applications & packages using CJS modules in node 4+ with ES classes and where majority of the code is already using existing ES features in node 4+, what does their upgrade path look like?

jokeyrhyme commented Apr 21, 2016 edited

@Jeff-Lewis

So for applications & packages written in ES+Babel now with .js extension, what is their expected upgrade path (dev & CI environments) to use native ES6 modules in node?

  1. rename all module files from ".js" to ".mjs"
  2. change all relative import paths so that they specify extension e.g. import './blah.mjs';
  3. change all package import paths so that they are package name only
    • import 'package'; is allowed
    • import 'package/relative/path.mjs'; is not allowed
    • this may mean requesting upstream package maintainers to properly export inner values of use, or to publish separate packages
  4. remove Babel and cross fingers that everything still works
  5. you may leave any existing require(); statements, as they'll continue to work as before

Also, for applications & packages using CJS modules in node 4+ with ES classes and where majority of the code is already using existing ES features in node 4+, what does their upgrade path look like?

CJS modules that work in Node 4 and use require() will keep on executing just fine, just like they have all through Node 5. There is no forced migration for existing CJS code. An optional migration path might be similar to the above steps with the addition of:

  1. replace module.exports = { /* ... */ }; with export default { /* ... */ };
  2. replace require('blah'); with import 'blah';
  3. replace const blah = require('./blah'); with import blah from './blah.mjs'
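
Spelled out on a small file, that optional migration might look roughly like this (an illustrative sketch; the filenames are made up):

// before: lib/greet.js (CJS)
const upper = require('./upper')
module.exports = function greet(name) {
  return 'Hello, ' + upper(name)
}

// after: lib/greet.mjs (ES module)
import upper from './upper.mjs'
export default function greet(name) {
  return 'Hello, ' + upper(name)
}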

@jokeyrhyme

  1. rename all module files from ".js" to ".mjs"
  2. change all relative import paths so that they specify extension e.g. import './blah.mjs';

Only if an explicit .js extension was already provided.

  1. change all package import paths so that they are package name only
  • import 'package'; is allowed
  • import 'package/relative/path.mjs'; is not allowed

That is allowed. package/relative/path will resolve regardless to package/relative/path.js, etc. now, and changing that would break a lot more than even the left-pad debacle. I take advantage of that myself.

... There is no forced migration for existing CJS code. ...

It'll happen, though. ES6 modules are far superior, IMHO.

...
3) replace const blah = require('./blah'); with import blah from './blah.mjs'

You can still use import blah from './blah'. Although, you might have to use import * as blah from './blah' instead, if the exports are named, and not default.

jokeyrhyme commented Apr 21, 2016 edited

@isiahmeadows @Jeff-Lewis Oops? I was under the assumption that the module resolution folks were voting in favour of making Node.js strictly match browser behaviour, with no resolution magic

Probably best following up those comments in the other thread, as it might be beyond the scope of "module detection" to discuss that any further here.

@jokeyrhyme I'll take that back, but only partially. That's still being discussed, and one of the biggest concerns about requiring explicit extensions and paths is that people are already depending on Node's algorithm with Babel when targeting either platform.

Owner

mikeal commented Apr 21, 2016

@bmeck heya, you should submit a talk about all this for Interactive :) https://www.conferenceabstracts.com/cfp2/login.asp?EventKey=VSIHQOEU

rektide commented Apr 27, 2016 edited

This thread seems entirely devoted to how Node wants to carve a special case people need to opt in to in order to use ES6 Modules. Using an extension other than .js is hideous and not JavaScripty, and Node should bend over backwards to make itself maximally JavaScripty and not a special case demanding X, Y or Z. As a developer in 2018, I will be writing JavaScript with JavaScript modules and those modules - used by front and back end - WILL be .js files. If Node wants to make it hard to run actual JavaScript on Node, it should proceed with the current idea of requiring a special extension (.jsm): Node can go off and do that and then anyone who wants to use normal, regular code will have to "transpile" (rename the file) my actual JavaScript to Node JavaScript. But I suggest you avoid this obvious, terrible, horrific-looking jank of carving out your own extension.

My suggested heuristic, in loose form:

  1. assume es6 m and parse/load/w/e. provide a global require() but no module. if there are no exports or execution errors, proceed to 2.
  2. assume commonjs and parse/load/w/e.

This favors a happy path, and falls back to the current set of assumptions. Node should try to just do right. There are very simple heuristics to determine when the attempt to use actual modules (es6m) has failed, and then fall back and retry.
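
A rough sketch of that fallback heuristic, simplified to a syntax-error check and using hypothetical loader helpers (this is only an illustration of the suggestion; the ambiguity objections raised elsewhere in the thread still apply):

function loadAnyFormat(filename, source) {
  try {
    // 1. attempt the Module goal first: strict parse, import/export allowed,
    //    require() provided as a global but no module object
    return loadAsESModule(filename, source) // hypothetical loader
  } catch (err) {
    // 2. on failure, fall back to today's behaviour: Script goal + CJS wrapper
    if (!(err instanceof SyntaxError)) throw err // genuine runtime errors still surface
    return loadAsCommonJS(filename, source) // hypothetical loader
  }
}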

Using an extension other than .js seems like it will break, look jank, and be hideous, in the dual-sided world we're in. I haven't seen any solid arguments as to why explicit coaching is required from node. We should milk a source-detect solution as best as we can, and only fall back to explicit coaching of node. And if node is going to be so crippled as to need explicit help, it should at least kindly read it out of the meta-data payload node already relies on, package.json, rather than stomp the normal signalling people use to indicate they're writing js.

as a developer, i would way rather write code that looks good and normal than mutilate normal looking filenames for node's sake. i'd rather write packages broken for node that look normative and let consumers babel-transpile my .js to your funky weird .jsm than distribute bizarre files. if node wants to not be JavaScript, it invites in the need to build-chain JavaScript to run and consume npm packages.

I've mined the thread for what I can find about source-detect, and this is what I've rounded up:

I understand that there are a lot of objections to the idea of trying to inspect the source to detect the format, to the point that in the original thread people keep saying it's a dead issue, off the table.
@jmm http://github.com/nodejs/node-eps/issues/13#issuecomment-194984818

Do we need to know which mode we are in prior to parsing?
Which mode-sensitive use cases do not ultimately involve parsing?
@jokeyrhyme #13 (comment)

@jokeyrhyme Problem is, many CommonJS modules require that their source is parsed as sloppy, or they just don't use the pragma.

@isiahmeadows

Problem is, many CommonJS modules require that their source is parsed as sloppy, or they just don't use the pragma.

My "what can we ship now?" suggestion/question side steps that by forcing module/strict parsing on anything that is imported via the built-in Node.js implementation.

So options that people are using now that would continue to work would be to use require() or something that compiles imports into require()s (e.g. babel).

ljharb commented Jun 2, 2016

@jokeyrhyme anything partial we ship now will actually end up making it harder to ship something else later, because packages will be published that rely on the "shipped now" behavior.

@ljharb Go 1.5 shipped with an experimental vendoring solution (now stable in 1.6): https://medium.com/@freeformz/go-1-5-s-vendor-experiment-fd3e830f52c3

Also, we previously had a --harmony flag for Node.js.

Any chance we could get something basic and experimental in, hidden behind a flag? Or have a hook enabled via a flag, so that the different proposals can be poly-filled or something?

ljharb commented Jun 2, 2016

Both proposals can be polyfilled right now, in every version of node, via monkeypatching require.extensions. It wouldn't be performant, but it would simulate the ergonomics. I don't think anyone has built a polyfill yet for either approach.
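
As a rough sketch of what that monkeypatching could look like for the extension proposal - assuming babel-core and babel-preset-es2015 are installed to do the on-the-fly transform:

const fs = require('fs')
const babel = require('babel-core')

// simulate the extension proposal: treat .mjs files as ES modules by
// transpiling import/export to CJS at require() time (slow, but it
// approximates the ergonomics)
require.extensions['.mjs'] = function (module, filename) {
  const source = fs.readFileSync(filename, 'utf8')
  const compiled = babel.transform(source, {
    filename: filename,
    presets: ['es2015']
  }).code
  module._compile(compiled, filename)
}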

rubennorte commented Jun 2, 2016 edited

@isiahmeadows

@rubennorte I explained in this comment on the original PR for the proposal why it's not helpful for transition.

TL;DR: think: named child modules like lodash/match - there's no easy migration between the two.

That proposal already covers that. Also, if it weren't covered you could keep using require('lodash/match') and migrate to import { match } from 'lodash'; when the author migrates the package to ES modules.

@ljharb but import / export will throw syntax errors, right? Or do you mean polyfill require.extensions so that the entry point is in script mode, but files require()d from that entry point support import / export syntax and/or are treated as module/script as needed?

Perhaps this could help: I took the time and tried to summarize the information I could comprehend here: https://github.com/martinheidegger/es6modules-nodejs

I would love to brush this up and ask the broader Node.js community for input/help.

ljharb commented Jun 2, 2016

@jokeyrhyme you'd need to use something like babel on-the-fly when requiring files that were modules, since import/export syntax isn't supported - which is how babel-node works.

shelby3 commented Jun 2, 2016 edited

It occurred to me that with a file extension proposal, it is plausible that the module field for import could be provided without an extension, thus mitigating my criticism about the loss of single-point-of-truth when needing to annotate every import with the distinction of CJS or ES6. Single-point-of-truth (i.e. "self contained") was one of the fundamental design principles directing me to favor a use module; pragma.

However, all other factors weighed, I prefer not to make a proposal dependent on eliding the file extension in order to achieve single-point-of-truth, because the file extension eliding can be an orthogonal concern. Another fundamental design principle I adhere to is trying to maximize degrees-of-freedom when there is no overpowering negative cost to doing so.

However, that still doesn't ameliorate the other disadvantages of a file extension proposal. Also, I added other disadvantages upthread, including the requirement for the module to be maintained with two separate file extensions, since not all use cases will employ the .mjs strategy. And we can't expect that every repository platform will provide some special coding to accommodate dual file extensions for the same version control system. The salient problem with the file extension proposal is that it infects/breaks too many orthogonal systems. Whereas the use module; pragma doesn't infect/break anything at all! Zilch. Zero negative impact footprint! For example, I don't have to fight with my editor, which doesn't understand that .mjs is supposed to be syntax-linted as JavaScript. Yet the pragma proposal also doesn't prevent my editor from adding functionality in the future to recognize the pragma and ensure it is compiling the module as use strict;. I can even include a redundant use strict; to support tools that don't yet recognize the preceding use module;.

That is why, from my current knowledge of the issue, it appears to be a "no brainer" decision. And that is why I became rather vocal and flabbergasted about there even being any argument.

In general, files should identify their format via content and not context, so that the attributes are carried with the file even when the context is not, i.e. a single-point-of-truth.

I understand that parsing use directives would be a burden which is the only valid reason given for not adopting use module; as the solution. So that is why I counter proposed that instead the convention can be that use module; must be the first syntactical token in the file, i.e. it can only be preceded by optional whitespace and/or comments.
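To make the proposed convention concrete, a minimal sketch (the use module pragma is only this thread's proposal; no engine or tool implements it):

/* License headers or other comments may precede the pragma under this proposal. */
"use module";
"use strict"; // redundant, but harmless for tools that don't recognize the pragma above
export default function example() {}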

Of course I don't relish being perceived as boorish, overbearing, condescending, or rude. I admit I was put off by the comment that alleged I had no valid argument (which I astutely perceived was sort of groupthink political attack against me because I was criticizing the logic of @bmeck), which then caused me to feel more competitive and I crushed the logic of the person who wrote that.

Bottom line is I think crowd sourced design can be good at collecting all the community information around an issue. And crowd source design can catch errors in thought processes that one person or a smaller groupthink might miss. But the downside of crowdsourcing, is that an astute person who attempts to cut directly to the clear conclusion, may be perceived as against the groupthink theme of crowd sourced wisdom.

Not. There are tradeoffs. I just want to see us not slide into lack of focus and most relevant priorities. This sometimes becomes difficult to do without a leader. And not that I want to be that leader on this issue. I was just sucked into this by a relevant need I have. Apologies to be so verbose, but just this one time so my demeanor is clearly explained. I don't intend to cultivate a political reputation, because I think politics detracts from efficiency and correctness of outcomes. Ultimately an alpha-male benevolent dictator rises and success results, e.g. Linus Torvalds. It isn't going to be myself on Node.js. Not my priority work set. If @bmeck wants to be the that dude, he needs to step up his astuteness to a higher level. When you see males do this sort of thing, it is because we want leaders who are as good at what they do as Linus. And I think we all know how abrasive he can be when he thinks someone has poor logic and reasoning skills. I am more patient than he is, until I see that groupthink political crap starting, then I go off the rails.

Owner

MylesBorins commented Jun 2, 2016

@shelby3 I have worked quite hard to not interact with you in this issue, primarily because you do not come off as acting in good faith. I am going to take a few snippets from your last comment to highlight why people may feel this way

"it appears to be a "no brainer" decision"
"which I astutely perceived was sort of groupthink political attack against me"
"I crushed the logic of the person who wrote that"

Now, interestingly enough, you state that you don't relish being perceived as boorish, overbearing, condescending, or rude. The way you have behaved in this thread, not political groupthink, has made people not want to collaborate with you. Simply put @shelby3 this is not a perception... this is a reality.

If you want to contribute to this project and have a vested interest in the future of the platform, I suggest you take some time to examine the way you have behaved in this thread and come up with ways to improve. Trying to explain your demeanor in no way makes it appropriate.

Second question: what about languages that compile to ES2015 modules?

In particular, this will immediately include Babelified, TypeScript, and some CoffeeScript projects (some do ES6 imports in backticks), but it will also include any other compile-to-JS language that at least plans to support ES2015 modules. This is a more orthogonal concern, but it can indirectly affect both the design and implementation of this feature.

ljharb commented Jun 3, 2016

@isiahmeadows either approach (extension or manifest) will work just fine with any transpiler/compiler process imo, just like both would work with a polyfill - one might be slightly more complex to handle than the other, but it's pretty negligible, and I don't think it impacts the approaches either way.

mscdex referenced this issue in nodejs/node Jun 11, 2016

Closed

the Roadmap for v7.0 ? #7273

My concern with using a new extension (.jsm||.mjs) is: what about lazy loading individual files later on, after compilation through Browserify or webpack, say, in the browser? We will have to have both a .js and a .jsm||.mjs. That, or you will have to specifically change all importers to pass more parameters to say, "hey, this file needs to be requested via .js but this one needs to be requested with .jsm". That doesn't sound ideal at all. I really think we should look at other options instead of a new extension.

ljharb commented Jun 26, 2016

@tsteuwer you'll already need that, because in one browser modules will be supported, and in another, they won't - whether you're keeping .mjs + .js, or module.js + script.js, you will still always need two files, and a way to differentiate them.

shelby3 commented Jul 31, 2016 edited

@thealphanerd how can expressing my analysis of the facts be "not acting in good faith" in any case. Your reply to me is just playing politics in a crapbucket mentality, so I can only wish you the utmost enjoyment of your political mess. Obviously you didn't like my attempt to say that I am going to speak the facts even if they offend others, and that it is not my preference to offend but when it comes to a choice of BS or truth, I'll choose the latter when it is important to do so.

Any time you want to actually debate the facts instead of "worked quite hard to not interact with you in this issue", you might be useful to the process of excellence.

Owner

jasnell commented Jul 31, 2016

@shelby3: such comments are better left to private email or Twitter and do not serve to move the conversation forward here.

barretlee referenced this issue in barretlee/pocket Aug 23, 2016

Open

nodejs/node #19

RobertWHurst commented Dec 16, 2016 edited

A new extension seems like a poor choice for the future of the platform. This kind of decision is huge and defines the future of the platform. I suggest careful thought here (clearly there's been a lot already 😃). One of the expectations of adding the ES6 module system is to bridge the gap between the browser environment and node. I'd argue adding a new extension undermines this aspect of the ES6 module effort. Adding a new extension will make it difficult to share code across the browser and node, as the browser still expects .js as the extension. I think we can do better here. This is hard, I know, but isn't a better solution worth it? Shouldn't this change be architected for a long-term future, one where browser js and node js are most compatible? Let's not fracture this community; let's make sure we can continue to share code across our environments. Looking forward to seeing where we go from here. Thanks!

ljharb commented Dec 16, 2016

@RobertWHurst extensions have no relationship to the browser, as browsers have always ignored them - and the "whole point" of ES modules wasn't "to sync with the browser environment", since browsers had no concept of modules prior to this. Any mechanism (of which "extension" is one) that helps serverside tooling distinguish Modules from Scripts actually makes it easier to share code with the browser, long term.

RobertWHurst commented Dec 16, 2016 edited

@ljharb Ok, thanks for straightening me out on that. To clarify, I meant the effort here of implementing to ES6 modules within node, not ES6 modules in general. I was a bit unclear there, sorry about that.

Ah ok. I didn't realize the extension was ignored in the browser environment. So if one were to require a module with an import statement in the browser, would the browser be able to resolve that module if it has the new extension?

import foo from './my-npm-module' // trying to import my-npm-module.jsm

Any mechanism (of which "extension" is one) that helps serverside tooling distinguish Modules from Scripts actually makes it easier to share code with the browser, long term.

I'm not sure what you mean by this. How might this help share code with the browser? Don't we want to see the use of modules rather than scripts across the board eventually? Hopefully my question makes sense. Thanks for your time 😄

ljharb commented Dec 16, 2016

@RobertWHurst there needs to be a way for the server to know when to set a <script> tag to type="module" or not (or, in node's case, to know which parsing goal to apply to a file). I'd love to see us in a world of only Modules - but to get there, we'll have to have a very long time with a smooth transition path. "extension" is one proposed way for a server to make that decision (there are others, like using a manifest file (like package.json); or like changing the ES spec so that files unambiguously parse as a Module or a Script, and you'd parse the file to determine what kind of thing it was) - but no matter what, there must be some way of determining that information.

RobertWHurst commented Dec 16, 2016 edited

@ljharb I gotcha, that makes sense.

But if we use the extension to determine the module system, don't we lose the ability to load our modules directly in the browser via import statements? Perhaps I misunderstand. Here's an example of what I mean. Let's suppose async is updated to use the jsm extension. A developer exposes async's source, then tries to load it in the browser as follows. Wouldn't this break?

import parallel from './async' // trying to import async.jsm

ljharb commented Dec 16, 2016

@RobertWHurst in the browser, you have to import from URLs - so either you'd have to use a server rewrite rule to transparently turn an /async URL to /async.mjs or whichever, or, you'd use a tool like babel to rewrite ./async on the server to ./async.mjs or whichever on the client - or, you'd just type the extension into your code directly and have no rewrite step anywhere (but I don't expect anyone to actually do the latter in practice).

RobertWHurst commented Dec 16, 2016 edited

@ljharb Right, ok. I see. Thanks for explaining that 👍

This is exactly what I'm afraid of. I do really think this is a problem for isomorphic js. I suppose people could be encouraged to add the extension, but as you say, I'd imagine they likely won't. Perhaps my alarm here is unique, but isn't this a bit troubling? One could argue that build tools would be involved. They are required today for shared code. I guess I was just holding out for the day we could retire some of the complexity around front end tooling in this regard.
Ah this module detection is a 'fun' challenge hey? 😉

As an aside, I suppose it's too late to get the browser vendors to adopt the new extension for modules as well. 😜

Collaborator

bmeck commented Jan 13, 2017 edited

@cebor it fully supports the parser and runtime, it does not declare any interop strategy or detection of when to use the Script vs Module parser when you get a source text (that is up to Chromium if you look at that issue).

chyzwar commented Jan 26, 2017

But it should be possible to ship ES modules in node without interop?
Interop can be added later, as long as node has it defined?

Can we have some kind of wiki for this issue?

Collaborator

bmeck commented Jan 27, 2017

@chyzwar not really, you couldn't do anything like import 'fs';. There have been several attempts to maintain a wiki in the past. https://github.com/nodejs/node/wiki/ES6-Module-Detection-in-Node was one from a while ago.

chyzwar commented Jan 27, 2017 edited

I could, by creating adaptor modules like the one below. Adaptor modules can be created in user-space as a temporary fix before interop arrives.

fs.js

export default require('fs')
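A consumer of such an adaptor would then look something like this (a sketch; it assumes the adaptor file itself is loaded as an ES module):

// app.mjs (hypothetical): import the adaptor instead of 'fs' directly.
import fs from './fs.js';

const pkg = fs.readFileSync('./package.json', 'utf8');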

In most cases I do not need node.js native modules. In typical express.js application most of this is already abstracted. It is very rare that you need to call something outside path/fs modules.

The issue I am facing now is that you need to use webpack/babel/node-esml to write ES6 code, where you lose error stacks/sensible debugging...

Thanks for wiki :)

If we do go with .mjs, will that be preferred over .js for ES imports? Also, will .js be preferred for CommonJS requires? If both of those end up the case, you could create proxy entries internally for the built-in modules, like this:

// child_process.mjs
import child_process from "./child_process.js"; // force local resolution somehow
export default child_process;
export const ChildProcess = child_process.ChildProcess;
export const fork = child_process.fork;
export const exec = child_process.exec;
export const execFile = child_process.execFile;
export const spawn = child_process.spawn;
export const spawnSync = child_process.spawnSync;
export const execFileSync = child_process.execFileSync;
export const execSync = child_process.execSync;

@isiahmeadows when you say "preferred", do you mean by an algorithm? Or by the silly fleshy creatures that type in the imports and the require()s?

Personally I like the .esm extension over .mjs (it feels both easier to read and more intuitively means "ECMAScript Module").

Has there been any developer polling on which extension would be preferred, and have multiple extensions been considered?

Collaborator

bmeck commented Feb 15, 2017

chyzwar commented Feb 22, 2017 edited

@bmeck Would it be possible to change the proposal to allow gradual adoption of ES Modules?

  1. Stage 1: Implement ES modules without interop with CJS.
    a) if you use import, you expect an ES module; if not found, throw an error
    b) if you require, you expect a CJS module
    c) both systems live in separate namespaces
    d) it is still possible to use require inside an ES6 module

  2. Stage 2: Introduce a node-native ES6 module adapter layer.
    a) when using import, have a lookup dictionary for node native modules;
    if, for example, path is imported, the adapter dictionary will point to an ES6 module adapter

  3. Stage 3: Introduce interop between ES Modules and CJS.
    a) when importing, resolve can load CJS/ES transparently

  4. Stage 4: Remove the adapter layer once interop is stable.
    a) at this stage interop should be stable and the adapter layer can be removed
    b) start the process of deprecating CJS; at some point we want only one module system

There is no need for a new extension. Also, in my opinion this is the perfect opportunity to improve main in npm to support a root directory; it would be nice if we could write "mylib/function" instead of "mylib/lib/function". That would be a separate issue, though.

ljharb commented Feb 22, 2017

@chyzwar considering that there are npm modules that will never be updated, but are still in heavy usage, what makes you think that a staged approach will change the interop requirements?

There should never be a time when node supports ESM but you can not seamlessly import CJS or require ESM.

chyzwar commented Feb 22, 2017 edited

I am not saying that the interop requirements would change. At stage 3 node would provide full interop.

I am just suggesting that node can introduce ES6 modules now. There is no need for a year of discussion and another year of implementation work. Interop is an enhancement, not an essential feature of ES6 modules. To be honest, CJS should be retired at some point in the future (node 20?); the aim should be unification of the JS world, where node.js is the central piece.

There are plenty of use cases where ES6 modules make sense even without interop: CLI applications, webpack config files, isomorphic applications, new projects, etc. The JS world is used to churn; adoption of ES6 modules could be faster than most people think.

jokeyrhyme commented Feb 22, 2017 edited

@chyzwar it was stated earlier that it would be undesirable for a consumer to have to know whether the package they are consuming was written in module or script mode. Ideally, they should be able to require() or import anything, and not worry about which mode it is in. Note that using import automatically puts the consumer's own code in module mode.

Of all the different facets (performance, compatibility, developer experience, availability etc) that need to be taken into consideration, I'm personally torn over which we can afford to prioritise and which we can postpone / ignore.

chyzwar commented Feb 22, 2017 edited

@jokeyrhyme your customers are developers and are not brainless. Every time you consume a library you will at least read the readme's installation and sample usage. And even if you make a guess, you will still get a descriptive error message. You can even add even more descriptive messages in debug mode...

You can still use require inside an ES6 module, since require returns "something" and has no side effects in the current module. I do not see that as an issue. It is trivial to provide a fat package with an entry point src for ES6 and lib for compiled CJS. The most popular libraries are already mostly written in ES6; the only blocker at this point is node.

This issue has been open for almost a YEAR with no significant progress. I waited for many months before commenting in this thread again, hoping that some work would get done.

ljharb commented Feb 22, 2017

Shipping nothing is better than shipping a partial feature when it has the potential to massively pollute the ecosystem.

Collaborator

bmeck commented Feb 23, 2017

@chyzwar Implementation has already started nodejs/node#11233 ; contributors will be getting access to a prototype in the next few weeks (which will most likely be buggy), and after cleanup a prototype build will probably show up. To note, the v8 C++ implementation of modules is not entirely stable/complete, and import(), which is important, is not expected to land for a couple more months (so we could not ship today even if we wanted to). We can work around those somewhat in prototypes, but we don't want people using partial features like they did with babel and getting bitten harshly. That said, a prototype implementation will ship, will need to be thoroughly tested, and will take a while before it could land in an LTS branch (the plan is probably an experimental flag in the 2nd half of 2017 and unflagged in the 1st half of 2018).

bmeck referenced this issue in nodejs/node Feb 28, 2017

Open

tracking issue: async loader for module interop #11233

9 of 20 tasks complete
Collaborator

bmeck commented Feb 28, 2017

As per nodejs/node#11233 (comment) , a protocol was briefly mentioned at one point as node_modules: but this was never discussed in great depth.

I just came across a nodesource blog ES Modules and Node.js: Hard Choices by Rod Vagg (translation of an article by Yosuke Furukawa) – not sure how I missed it before – which seems to me the best summary I've seen.

If I understand that article well, the interoperability decisions mean that import './foo.js'; will fail if foo.js is an ES module rather than a CJS module, and vice versa for import './foo.mjs';.

I like the perspective that In Defense of .js prioritizes a switch from CommonJS to ES Modules for the longer-term future, and the Node.js EP prioritizes compatibility and interoperability.

ljharb commented Feb 28, 2017

@chrisveness which is why the best practice continues to be (and has always been) to do import './foo' and always omit the file extension in your require.

jpolo commented Mar 4, 2017

What about the .jsx extension? This conversation seems very long and raises so many potential problems.
Thinking out of the box, maybe implicit import without an extension should be dropped, and only explicit extensions used. IMHO explicit syntax is the only fair solution when facing the convention conflicts inherent to a fragmented language such as ECMAScript.

ljharb commented Mar 5, 2017

The jsx extension is why it's a best practice to omit extensions. If I rename a foo.js file to foo.jsx, or the reverse, consumers shouldn't have to be coupled to that choice. The same will apply when renaming between .js, .jsx, .mjs, .coffee, or whatever file extension you're playing with that day.

Really looking forward to native import export support! I keep running across use cases where using import is more eloquent than the commonjs require style.

cref commented Mar 11, 2017 edited

I'm sorry if I'm missing something here but why complicate matters by trying to make import and require load both module formats? Why not let the call site be explicit about what kind of module is being requested:

CommonJS module example

const cjs = require("cjs-module");
const es6 = module.import("es6-module"); // or maybe use require.import instead of module.import?
const es6r = module.import("./relative-es6-module");

ES6 module example

import es6 from "es6-module";
import require from "require"; // should become part of the Node.js API
const cjs = require("cjs-module");
const cjsr = require("./relative-cjs-module"); // might need some 'magic'

Yes, as a developer, you will need to know whether you must require or import a module, but I don't think that's a problem. When a module gets required/imported, chances are the module is going to export something. So if you'd require an es6 module you'd get a syntax error, and if you'd import a cjs module you'd get a reference error. There can be an edge case where a module is only loaded for its side effects and is not exporting anything, which is IMHO bad design. In that case, there's a very small chance that incorrectly using require or import doesn't trigger an error. And even if it doesn't, there's an even smaller chance that the module's code has unintended behavior due to module format differences. In short, I don't think it'll be a blocking issue.

It's also trivial to write a wrapper module for either format, so you could wrap all the cjs modules you need inside es6 modules and just use import everywhere. Library authors could provide wrapper modules (either cjs->es6 or es6->cjs) in the same package, or provide a separate package or new major version for the wrapped modules.
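Using the hypothetical APIs proposed above (module.import and import require from "require", neither of which exists in Node.js), such wrappers might look roughly like this:

// cjs-wrapper.js: a CommonJS wrapper around an ES6 module, via the
// hypothetical module.import from this proposal.
module.exports = module.import("es6-module");

// es6-wrapper.mjs: an ES6 wrapper around a CommonJS module, via the
// hypothetical `import require from "require"` from this proposal.
import require from "require";
export default require("cjs-module");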

There will be a transitional period in which the use of module.import (or require.import) and import "require" will be common but after that, most cjs modules will have been upgraded to es6 and you will rarely be forced to require a cjs module from an es6 module anymore. (except when dealing with abandoned packages)

Modules that belong to the Node.js API could be made available in both formats in some way or another so require("path") and import "path" would both work. But maybe this would be a good time to introduce a namespace for those modules instead?

CommonJS Node.js API example

const path = require("path"); // cjs implementation or wrapper module for es6 implementation

ES6 Node.js API example

import path from "node/path"; // es6 implementation or wrapper module for cjs implementation
import require from "node/require"; // implication of newly introduced node namespace

Calling an es6 script as the main script would require adding a flag for es6 mode or wrapping the es6 module. Possibly, the default behavior could be to autodetect the module format.

node -es6 my-es6-script # force es6
node -cjs my-cjs-script # force cjs
node my-cjs-wrapped-es6-script # run es6 script without requiring a flag or auto-detection
node my-autodetected-script # autodetect module format, default to cjs if inconclusive (and show warning?)

Compared to other proposals, this proposal doesn't introduce any added runtime overhead. (if you don't count the optional suggestions for using wrapper modules and main script format autodetection)

ljharb commented Mar 12, 2017

@cref because it must be (possible to make it) invisible to consumers whether you're using CJS or ESM, which means you need to be able to refactor from one to the other (in either direction) without forcing consumers to update their code.

cref commented Mar 12, 2017 edited

Which can easily be realized by wrapping modules. Either offered by the library author or by creating or generating them yourself. Also, updating a library to a new major version will always force you to update your code, it's what versioning is for.

@cref I think what we're trying to avoid is another Python 2 -> 3 situation, where compatibility is broken just enough to require work on everyone's part, the community is split for years, sharing of code is harder, etc.

Eliminating / minimising the impact on the community is extremely important

ljharb commented Mar 12, 2017

Right - changing the module format should be a patch or a minor, not a major - that's the point.

cref commented Mar 13, 2017 edited

Maintaining compatibility is the responsibility of the library authors, just as it is with regular changes to a library's API. What I'm trying to say is that it's peanuts to keep offering both flavors of a library to make both cjs and es6 developers happy. Say you have a library at version 3.x. You can then start a 4.x es6 version while still supporting 3.x without much effort. (either 3.x wraps 4.x or the other way around)
I know it's easier for Node.js users and library authors if it would 'just work' but what I'm saying is that this is apparently not an easy thing to accomplish. In the meantime, there will be no es6 module support. IMHO it's not worth pursuing this ideal at the cost of a suboptimal runtime or potentially no es6 module support at all.

@cref I'm sure everyone here completely sympathises with your viewpoint, as we all desperately want import in Node.js as quickly as possible

One other thing to note is that Node.js is not the only place where work must be done to support import and export. As far as I can tell, V8 itself does not currently have complete support for ES modules:

https://bugs.chromium.org/p/v8/issues/detail?id=1569

Node.js cannot support modules until a stable release of V8 includes the required foundations for it

@jokeyrhyme that's not entirely true we could rely on https://github.com/nodejs/node-chakracore

Oh the irony :)

Collaborator

bmeck commented Mar 13, 2017

ljharb commented Mar 13, 2017

@cref i think you're underestimating how important it is for it to be easy to support both. Citing Python 2/3 is appropriate here - if ES modules are not painless to migrate to, both for authors and consumers, people simply won't migrate en masse, and the ecosystem will suffer.

chyzwar commented Mar 13, 2017 edited

@ljharb @jokeyrhyme people keep bringing up the Python case, which is not relevant here. Python changed the semantics of fundamental things in the language: how strings work, built-in functions like print, integer division, etc. Node is trying to introduce a new module system according to an existing spec. There is no change to the language or to user-space code. It is a pity that you cannot downvote everyone who mentions Python 2->3.

In fact, by forcefully trying to bring interop between CJS and ES Modules, node will introduce a break in the language. It is ironic that node is trying to prevent a Python-style fragmentation by introducing a break that will inevitably bring fragmentation.

Explicit is better than implicit: when consuming a dependency, the developer should know which format to use, require or import. Making this hidden will introduce a compatibility layer with subtle bugs and an extension that nobody outside node.js will use for their JavaScript. Since the extension will not be present in the import, I will never know when reading code what exactly is loaded... Also, by providing this compatibility layer there will be absolutely no incentive to migrate to ES6.

@chyzwar

In fact, by forcefully trying to bring interop between CJS and ES Modules, node will introduce a break in the language.

Which of these options do you think is the most breaking for users?

  1. CommonJS and ES modules may import/require one another, without knowledge of the module type
  2. ES modules may only be imported, CommonJS modules may only be required

Here's what would happen:

  1. Changing module syntax could happen incrementally in large code bases. Libraries switching module syntax would be semver-minor, as it would only require existing native ES6 module consumers (not existing Babel-transpiled code) to change to a namespace import; see the sketch after this list. Existing dynamic requires still work.
  2. Changing module syntax would be semver-major, because you have to completely change how you load the modules. For one, you would no longer be able to dynamically import it until after the import() proposal gets implemented and shipped.
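To make the namespace-import point concrete, here is a sketch of what a consumer of a hypothetical pkg dependency would change when it moves from CJS to native ESM:

// Before: pkg is published as CommonJS, so its module.exports appears as the default export.
import pkgAsCjs from "pkg";

// After: pkg republishes as native ESM with named exports; the consumer
// switches to a namespace import, and property access on the binding keeps working.
import * as pkgAsEsm from "pkg";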

It is ironic that node is trying to prevent a Python-style fragmentation by introducing a break that will inevitably bring fragmentation.

I can tell you separating the two has already caused some fragmentation in the community. Consider lodash vs lodash-es, and other similar cases enabling Rollup and Webpack 2 to do its thing where you can't really do similar with transpiled CommonJS modules


Explicit is better than implicit: when consuming a dependency, the developer should know which format to use, require or import. Making this hidden will introduce a compatibility layer with subtle bugs and an extension that nobody outside node.js will use for their JavaScript.

Webpack 2 tells a different story here. It natively transpiles both ES modules and CommonJS modules, with full interop. Additionally, it mostly targets browsers.

Since the extension will not be present in the import, I will never know when reading code what exactly is loaded...

Yet again, Webpack has allowed some seriously flexible dependency chains, including with CSS modules (using modules: true option with relevant proposal). Very highly implicit, but actually very nicely scalable.

Oh, and for other similar reasons, it's considered best practice to omit the extension. In particular, non-JS languages using other extensions (notably JSX, TypeScript, etc.) will try to load the wrong extension if you don't.

Also, by providing this compatibility layer there will be absolutely no incentive to migrate to ES6.

False. By most accounts, and in my experience, ES modules are way easier to work with for initial startup once you have the toolchain set up, especially in terms of exporting and scaling. You also get better static analysis with eslint-plugin-import, and it's harder to screw up in larger projects. Consider the difference:

// CommonJS
const fsp = require("fs-promise")
const express = require("express")
const {map, flatten, matches: match} = require("lodash")

module.exports = function foo() {}
exports.bar = bar
function bar() {}
class Baz {
  // ...
}
// Exported for testing
exports.Baz = Baz

// ES6
import * as fsp from "fs-promise"
import express from "express"
import {map, flatten, matches as match} from "lodash"

export default function foo() {}
export function bar() {}
// Exported for testing
export class Baz {
  // ...
}

chyzwar commented Mar 13, 2017 edited

CommonJS and ES modules may import/require one another, without knowledge of the module type

The ES Module spec does not include CJS import semantics. By adding interop you change the ES Module spec and have no control over how this will play out in the future. Maybe in ES2020 the module system in JS will change/extend and the whole thing will blow up? Are you 100% sure that the interop layer you want to introduce does not have corner cases, bugs, or security issues? Can you leverage the perf optimisations available to ES Modules when running in interop mode?

My biggest issue is still the new extension: not only does it create fragmentation from day 1, it would force every possible tool and beginners' tutorial to add this detail.
You will have sentences like: node.js feels special and needs another extension for your JavaScript files.

Changing module syntax could happen incrementally in large code bases. Libraries switching module syntax would be semver-minor, as it would only require changing to a namespace import for existing native ES6 module consumers only (not existing Babel-transpiled code). Existing dynamic requires still work.

That is not true and we know it. If there is no incentive, people will never migrate to ES Modules. You still have modules written in ES5 even though since node 4 you could use a lot of new language features. Heck, people are still using var just because they can.

ES modules may only be imported, CommonJS modules may only be required
This still creates a parallel universe, but it is the most sane option, and in the long run it would bring hope that CJS can be removed/deprecated.

I can tell you separating the two has already caused some fragmentation in the community. Consider lodash vs lodash-es, and other similar cases enabling Rollup and Webpack 2 to do its thing where you can't really do similar with transpiled CommonJS modules

lodash-es exists only because ES Modules are not widely supported. One day lodash will decide that ES modules are supported in node/browsers and only publish the ES Module version. But thanks to node we will still have lodash-mjs.

Changing module syntax would be semver-major, because you have to completely change how you load the modules. For one, you would no longer be able to dynamically import it until after the import() proposal gets implemented and shipped.

You have react.js version 15 and node 7; I do not see people screaming. Node can introduce two major versions a year and libraries cannot do one? I do not need dynamic import; I can temporarily create an adaptor to the ES Module in a CJS module and use require for dynamic code loading.

Webpack 2 tells a different story here. It natively transpiles both ES modules and CommonJS modules, with full interop. Additionally, it mostly targets browsers.

Transpilation !== Interop. Webpack bundles JS for the browser, and in most cases the module system is only used for compile-time optimisation, not module loading itself. There is no CJS in browsers, and the transpilation is only temporary because node does not support ES6 modules.

False. By most accounts, and in my experience, ES modules are way easier to work with for initial startup once you have the toolchain set up, especially in terms of exporting and scaling. You also get better static analysis with eslint-plugin-import, and it's harder to screw up in larger projects. Consider the difference:

I know that ES Modules are the better technical solution; that is why I am pushing so hard to get this right. Not everyone is this passionate about arguing with random people on the internet about module systems :p People like you and others in this thread will be early adopters; most node.js users do not care.
It will be extremely difficult to make the business argument for ES Module migration if full interop is provided.

In my opinion node.js as a platform should make right things easy and wrong things difficult. It should be easy to use ES Modules, and CJS should be discouraged.

Collaborator

bmeck commented Mar 13, 2017

My biggest issue is still the new extension: not only does it create fragmentation from day 1, it would force every possible tool and beginners' tutorial to add this detail.
You will have sentences like: node.js feels special and needs another extension for your JavaScript files.

Just use .mjs everywhere. See examples in browser spec also having .mjs

lodash-es exists only because ES Modules are not widely supported. One day lodash will decide that ES modules are supported in node/browsers and only publish the ES Module version. But thanks to node we will still have lodash-mjs.

See above.

You have react.js version 15 and node 7; I do not see people screaming. Node can introduce two major versions a year and libraries cannot do one? I do not need dynamic import; I can temporarily create an adaptor to the ES Module in a CJS module and use require for dynamic code loading.

Unclear on this. I think the suggestion is when bumping you drop support for older versions which don't natively use ESM. This is not necessarily true. People do feature detection for things at runtime when available, by providing a safe upgrade path we can continue this pattern when using modules.

Transpilation !== Interop.

Agree, it transforms to a different system.

Webpack bundles JS for the browser, and in most cases the module system is only used for compile-time optimisation, not module loading itself.

Disagree. People use npm modules all over the place in front end code.

There is no CJS in browsers, and the transpilation is only temporary because node does not support ES6 modules.

Correct, Webpack et. al translate things into a CJS implementation and ship that into browsers even though browsers do not provide CJS implementations natively.

I know that ES Modules are the better technical solution; that is why I am pushing so hard to get this right. Not everyone is this passionate about arguing with random people on the internet about module systems :p
It will be extremely difficult to make the business argument for ES Module migration if full interop is provided.

I think all the people in these talks are very invested in this. It has been talked about for more than a year just on how to implement them. The idea of JS modules (called packages in ES4) has been talked about since the early 2000s.

In my opinion node.js as a platform should make right things easy and wrong things difficult. It should be easy to use ES Modules, and CJS should be discouraged.

I agree, but don't see how .mjs would discourage use of ESM.

The idea of JS modules (called packages in ES4) has been talked about since the early 2000s.

For the ones wondering: ES4 Packages

If we followed the In Defense of .js proposal, we wouldn't need file extensions in require statements. Alternatively, what is happening with the ES spec changes that Allen Wirfs-Brock suggested may be considered by TC39? Would that not also remove the requirement for an alternative file extension?

I fail to see what is wrong with semver-major for CJS -> ESM. Any consumer of a library needs to reference the documentation for the library, and will see an announcement that a new semver-major needs to be imported rather than required. Changing module syntax can still happen incrementally in large code-bases.

The wider JavaScript community will use .js for ES-modules (because ESM is after all standard JavaScript); if Node uses .mjs, then .mjs files will be seen as ‘Node.js’ modules, separate from normal ‘JavaScript’ modules.

And we will have a permanent schism in the JavaScript community.

ljharb commented Mar 15, 2017

The wider JS community uses node-driven build tools, as does the entire npm ecosystem. What node does, everyone else will follow.

chyzwar commented Mar 15, 2017 edited

The wider JS community uses node-driven build tools, as does the entire npm ecosystem. What node does, everyone else will follow.

The Internet runs on PHP and jQuery. I can guarantee you that nobody will reconfigure tools like code editors, CI, asset pipelines, and build systems. All the web servers, HTTP caches, firewalls, proxies, corporate intranets, CDNs, and who knows what else will stay away from .mjs.

Just use .mjs everywhere. See examples in browser spec also having .mjs

Front-end people have absolutely no reason to use .mjs. For them there is no benefit. .mjs is only useful if you decide to write back-end in node.

I agree, but don't see how .mjs would discourage use of ESM.

With .mjs and full interop there will be no business justification to upgrade a code base. I work in a large corporate environment, and you will never persuade the business to spend a few days just because of syntax. If there is a major release of many libraries, you will have an argument for migration.

My other concerns about interop are still there:

  1. Maybe in ES2020 the module system in JS will change/extend and the whole thing will blow up?
  2. Are you 100% sure that the interop layer you want to introduce does not have corner cases, bugs, or security issues?
  3. Can you leverage the perf optimisations available to ES Modules when running in interop mode?

@chrisveness,

I fail to see what is wrong with semver-major for CJS -> ESM. Any consumer of a library needs to reference the documentation for the library, and will see an announcement that a new semver-major needs to be imported rather than required.

This would prevent libraries from moving to ESM until the Node.js versions they're targeting all support it. Distributing ESM will require dropping support for current LTS and maintenance mode versions, or forking your distribution, e.g. library and library-cjs/library-es. At least with the .mjs approach support can be retained and distributions need not be forked.

cref commented Mar 16, 2017 edited

Here's an idea:
Starting with, say, Node v10, completely remove support for CommonJS and only support ES6 modules.
Tie it to a fresh NPM registry to which library authors should only publish ES6 packages.
NPM should check for compatibility between the Node.js version and the NPM registry being used.

You need cjs and only cjs? use Node 9-
You need es6 and only es6? use Node 10+

You want some potentially buggy, performance degrading interop layer?
Rely on some guy's library for that or build it yourself but don't count on Node.js offering this leaky abstraction out-of-the-box.

mgol commented Mar 16, 2017

@cref That's a perfect way to make no one in the world using Node 10 and, in the end, someone forking Node.js again with support for CommonJS back in.

This would be way worse than the Python 3 situation.

cref commented Mar 16, 2017

That's fine by me if they can accomplish a workable solution.
Did I say Node v10? I meant io.js v10 :P

mgol commented Mar 16, 2017

You can hardly call a "workable solution" something that would make the number of Node users dwindle away to something close to 0.

cref commented Mar 16, 2017 edited

What's in a name. We'll just call io.js node.js again once the number of node.js users has reached zero if that makes you feel better.

jkrems commented Mar 16, 2017

I don't think the io.js reference is relevant to this discussion. It's not like io.js made any big breaking changes. Most node 0.10 code worked fine on io.js (and still does on node master). What you're suggesting is stopping support for every single existing node program & library. This was never even considered for io.js (afaict at least). It would just create yet-another-unrelated-runtime that happens to reuse some code from node.

mgol commented Mar 16, 2017

@cref You misunderstood. If Node ever dropped support for the whole current npm package ecosystem, someone would fork it to a version that kept CommonJS support. People would never upgrade to a Node version that dropped CommonJS, they'd migrate to the fork that keeps the support. The newest Node version would get abandoned and - maybe - in the future the fork would be renamed back to Node.js.

So we'd end up exactly where we are now, the only difference would be a lot of problems with yet another fork in the meantime. What's the point?

cref commented Mar 16, 2017 edited

exactly, the name is not relevant, thanks for the confirmation. now where did I suggest stopping support? starting a new version !== ending support. ask microsoft. and why would you fork node for something you could just as easily solve with a library? is it because... you can't, because actually, require isn't conceptually the same as import and therefore might lead to a suboptimal compatibility layer when done as a library as opposed to a fork? better make the fork an official one then. hence, new version.

anyway, I'm done, got some forking to do! :-P

Public Service Announcement

TL;DR: Please remain respectful, and don't blindly reject others' viewpoints without taking the time to explain why they're wrong, not why you're right.


I feel this discussion has somehow degenerated to the point that it's borderline off-topic and not exactly productive, to put it lightly. Not pointing fingers, but could we at least listen to the collaborators here, most of whom have been involved with Node.js for most of its history?

Oh, and if you feel someone's logic is off, please actually try to debunk it rather than just restating your point. Doing the former is proving them wrong, and it's usually something that you can do without even stating your side at all. Doing the latter will convince no one who doesn't already agree with you, and it'll make you look much less nice or respectful.

To summarize, please remain respectful, don't be dismissive, and do consider the other side. You might be surprised what you would learn with an open mind! 😉

@isiahmeadows I agree entirely; but I would like to point out (hopefully respectfully) that I think there is a sense that when it comes to .mjs, the CTC is rejecting others’ viewpoints without explaining why e.g. In Defense... or TC39 spec changes wouldn’t work; this sense is only reinforced by an attitude of “what node does, everyone else will follow”.

cref commented Mar 17, 2017

Good point @isiahmeadows. All I'm doing here is offering some outside-of-the-box alternatives. IMHO, the most important aspect of es6 modules is the fact that it's a format optimized for both filesystem-based and web-based usage. Adding support to Node.js by introducing yet another file extension that's implicitly searched for when fetching a module undermines the modular aspect for web-based platforms and would force web developers to keep using additional build tools.

Let's just wait until the Loader spec is finalized and then see how best to support loading Node.js-flavoured CommonJS through import.

ljharb commented Mar 17, 2017

Web developers already have to use additional build tools to get anything useful done - that's the new reality.

chyzwar commented Mar 18, 2017 edited

Web developers already have to use additional build tools to get anything useful done - that's the new reality.

Please do not conflate back-end and front-end development. There are a number of languages that do not require any additional tooling: Go, Java, PHP, Python, etc.

Currently in node.js we barely have any editor support: auto completion, debugging, profiling. Node is now planning to introduce a forced marriage between ES6 Modules and CJS and make the whole thing even worse.

Can someone designing the interop confirm the following:

import {join} from 'path'

Will this load only join, or the whole path module?

import {sharedApi} from 'https://cdn.syndication.twimg.com/api.js'
  1. Can I load a module from the web in node.js?
const sharedApi = require('https://cdn.syndication.twimg.com/api.js');
  1. If full interop is provided, then this should work as well?
const {something} = require('esModule')
const something =  require('esModule').something
  1. Will named exports be treated as object properties?
import myModule from './myModule.js'
  1. If I provide a relative path, do I still need to use the .mjs extension?
  2. If my whole package only uses ES Modules, can I use only .js?
export default {}
exports.default =  {}
  1. What if a module provides both types of exports?
  2. Which export would take priority?
Collaborator

bmeck commented Mar 18, 2017

@chyzwar please read #39 for answers regarding most of these.

import {join} from 'path'

  1. No, it parses the full source of "path" according to the specification. Tree shaking / dead code elimination is done by tooling outside of ECMA262 (the JS spec). The JS spec does not include such a feature; it is purely tooling.
  2. Named imports from CJS are not possible for a variety of reasons.

import {sharedApi} from 'https://cdn.syndication.twimg.com/api.js'

Won't work. Only file:// is supported until security is fully worked out for networking. Other protocols that work in the browser like data:// are not supported either for initial implementation.

const sharedApi = require('https://cdn.syndication.twimg.com/api.js');
const {something} = require('esModule')
const something =  require('esModule').something
  1. Can't load off network still.
  2. require is tentatively unable to load ESM since you can use import() everywhere (see the sketch below).
  3. Yes, the spec has a mandated type for what ESM look like when turned into objects. This exposes getters for each variable.
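To illustrate points 2 and 3 (the file name and export here are hypothetical):

// From a CommonJS file, an ES module is loaded with dynamic import().
import('./esm-example.mjs').then((ns) => {
  // ns is a module namespace object: each export is exposed via a getter.
  console.log(ns.default, ns.someNamedExport);
});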

If I provide a relative path, do I still need to use the .mjs extension?

Open issue: #51. The current judgement is that you do not, but some people on the browser side are concerned about that. Note, web browsers support .mjs, so it is not a problem if it is included optionally.

Searching so called "bare" URLs (things in node_modules/) are expected to always have search behavior.

If my whole package only uses ES Modules, can I use only .js?

Not planned. It adds complexity and does not focus on a future where there is a uniform way to produce code for ESM. This affects both browser vendors and node; both are well aware of the situation.

What if a module provides both types of exports?

This is not possible according to the JS specification due to:

  1. immutability constraints
  2. removal of magic CJS variables

Which export would take priority?

Modules only have one type, either CJS or ESM. Therefore there is no priority, since there is no competition.

I would note that interoperability does not mean 100% compatible in both directions for our goals. Node is not going to limit its features to web browser APIs, and web browsers will not allow the same features as Node. import falls on the line of this as it determines how features get loaded, I doubt there will ever be a case where import "fs" works in browsers truly, and I doubt that node would load untrusted scripts from 3rd parties as easily as browsers since it needs more direct access to the system.

Collaborator

bmeck commented Mar 18, 2017

The implementation of ESM has started, I am going to close this. If you have specific concerns about pathing or a problem with the file extension, please open a new issue. General concerns have been covered here in depth.

bmeck closed this Mar 18, 2017

isiahmeadows commented Mar 19, 2017 edited

@chrisveness

the CTC is rejecting others’ viewpoints without explaining why e.g. In Defense... or TC39 spec changes wouldn’t work

I am also aware of that, but they have already addressed nearly every viewpoint presented here in the last week, earlier in this issue, in #3, #39, and other related issues. It would definitely have been more helpful and productive if more of them had linked to previous discussion to justify their dismissal, but I do agree with their unwillingness to accept these arguments, which have been covered multiple times already. (I've been passively involved and watching this closely since about a month after #3 was filed early last year, and I haven't seen much actually new come up in the last few months.)

But on that note, this issue is closed, so I'm going to drop the issue now.

Hi! Where can we follow the development of ESM?

shelby3 commented May 19, 2017 edited

Lol, Node.js forking itself away from the ECMAScript ecosystem with a proprietary file extension total order on the universe and conflated interop of module specifications. Purrrfect.

ljharb commented May 19, 2017

@shelby3 that comment isn't productive or accurate. Browsers ignore file extensions entirely; ECMAScript is a spec, not an ecosystem; people use .jsx all over the place just fine.

Again, the reality is this: a file extension is supposed to mean only one thing: "here's how to parse this file". The creation of a second parsing goal in the language - Module - inevitably ensured the creation of a new file extension to indicate "parse this file as a Module". .js is already irrevocably used as "parse this file as a Script" everywhere but in browsers (which use <script type="module"> to determine that it's a Module) and browsers are informed by the webserver who chose that tag, which is in turn typically informed by the file extension.

Feel free to suggest a better alternative, but the only possibility for avoiding a second file extension vanished when the language spec shipped a second parsing goal. That may be unpleasant for people - but that doesn't change the reality.

SEAPUNK commented May 19, 2017 edited

Just curious: Does the browser require the returned MIME type (via Content-Type header) to be something specific for it to execute the file as Javascript? I'd imagine that CDNs and other web servers, until they add a mapping for the .jsm (or .mjs, or whatever it'll be) extension, would return the .jsm JS module files as application/octet-stream.

I like that Node.js has become "everywhere but in browsers".

ljharb commented May 20, 2017

@chrisveness do you know of non-node non-browser tools that take no out-of-band instructions (including "the only paths it accepts are JS files") and know something is a JS file by something besides the .js extension?

shelby3 commented May 20, 2017 edited by MylesBorins

@ljharb wrote:

The wider JS community uses node-driven build tools, as does the entire npm ecosystem. What node does, everyone else will follow.

Pride cometh before thy falleth.

Other things (e.g. TypeScript) are gaining momentum, and eventually they might decide it is a lower priority to keep pace with more Node.js aberrations, especially after this epic fiasco causes more people to fork away from Node.js.

but the only possibility for avoiding a second file extension vanished when the language spec shipped a second parsing goal

TypeScript determines whether a file is a module by whether it contains any top-level import or export declarations.

that comment isn't productive or accurate.

As @chyzwar, @cref (and to some extent @chrisveness) have explained (which I consider carrying forward the original points I made in this thread last year), trying to backport require to support both ESM and CJS is a “can of worms”. A clean separation (eventually deprecating require if ever CJS use diminishes sufficiently) provides more degrees-of-freedom and does not muck with the wider ecosystem of tools by introducing a proprietary file extension that further fractures the JavaScript ecosystem (see the point about complexity budget near end of this post). I had even proposed as a compromise a hokey heuristic use module as way to embed the ESM attribute (as opposed to a file extension) but afair, before that I was preferring instead to not overload require for the different module formats (which was thus congruent with and prior art for what these 2 or 3 guys seem to be proposing). Periodically we have to reset away from maintaining legacy complexity, else the rigor mortis morass devolves.

Also, unnecessarily providing two ways to do the same thing (i.e. import or require ESM modules), both of which would have to be maintained into the indefinite future, violates a fundamental principle of good design and is incongruent with maximizing the readability of source code, which is a very high priority in the open source era.
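For illustration, the duplication being objected to would look something like the sketch below; whether require() could even load an ES module is exactly the interop question this thread is debating, and the package name is arbitrary:

```ts
// Hypothetical interop in which the same ES module is reachable two ways,
// both of which would then have to be supported indefinitely.
import { parse } from 'some-esm-package';                        // ESM syntax

declare const require: (id: string) => any;                      // CJS ambient, for the sketch only
const { parse: parseViaRequire } = require('some-esm-package');  // CJS syntax
```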

You gatekeepers, who are in control of the echo chamber and groupthink here, want to make it very easy for old code to be switched over to ESM by changing the module file names instead of changing all the require calls to import in the source code. Yet @chyzwar has explained that this presumed convenience will not work out that cleanly, and the potential for complexity to snowball in the future is unpredictable. Even though he was talking about type inference, I think the general gist of the thesis from @skaller, a wiser elder from the C++ design process (he is in his 60s and I am in my early 50s), applies to making many small special-case implicit conveniences, as Node is doing in this case, and ending up eventually with a jumbled, rigor-mortis mess of complexity:

@skaller commented on Oct 10, 2016:

Implicit quantifiers do NOT make code easier to read. Its like most “simplifications”,
they seem like a good idea at the time, they work well for simple cases,
but its a disaster precisely in the cases where you think it would serve best,
the complex cases.

Its a known fact that human information transmission requires redundancy.
Perhaps less for smarter people, or people familiar with the domain,
more for others. But for language, psychologists have actually MEASURED
the amount of redundancy that gives the best comprehension, its somewhere
between 15 and 30% I believe (in linear text).

Computer programming languages may require different redundancy.
but infering types everywhere clearly defeats the whole point of having
type annotations: to provide redundant information for validation and
confirmation.

Regarding exceeding the complexity budget, please see the comment quoted near the end of this post (not my comment, but rather remarks from some very smart programmers who visit 160-IQ genius Eric Raymond’s blog) about how horrendous JavaScript’s chaos has become.

Edit: I had mentioned this complexity overload in my first post last year, and even pleaded for the project to be respectful of users like myself. The complexity is so bad that no one without days to kill can even wrap their mind around the epic volumes of arcane discussion that a reader is slammed with. K.I.S.S. is important. Transitioning towards one module standard that works everywhere is the sane direction that reduces complexity over time.

Edit (by @MylesBorins): replaced an unfortunate choice of words with "echo chamber". I believe that keeps the original intent intact.

ljharb commented May 20, 2017 edited by MylesBorins

@shelby3 please don't use inappropriate terms like "circle-jerk" - https://github.com/nodejs/TSC/blob/master/CODE_OF_CONDUCT.md may be helpful.

(I'll be happy to delete this comment after/if you edit your post)

shelby3 commented May 20, 2017 edited

@ljharb, the CoC says you should not accuse my joke of being unproductive (a passive-aggressive political control tactic), which you (and your fellow gatekeeper) also did to me last year, when it was a concise way of registering my technical objections without getting into another fight with you gatekeepers. Since you responded that way, I responded in kind. I tried to keep it light and humorous, but you want to fight because you are sure you are correct and that everyone who disagrees with you is wrong.

Edit: last year I even tried to convince the politik to leave the ad hominem aside, but was unsuccessful.

Sent in a private msg just now:

Reviewed that ES6 Modules and Node.js thread, and found some links to where those two gatekeepers took the purely technical discussion off into ad hominem.

Without the perspective of the history of that thread, one might draw incorrect conclusions about my latest comments. Rereading that thread convinced me that I was not acting inappropriately. I often cringe when I re-read some of the stuff I wrote in the past, so if I did not cringe when re-reading that thread, then I take that as a reliable indication that I did not act inappropriately.

OK, but does anyone know the status of the work on ES modules?

Member

benjamingr commented May 20, 2017

@shelby3 it sounds like you have some interesting arguments to make, and the discussion was constructive until 2-3 messages ago.

I realize that strong opinions sometimes lead to strong language or terminology, but Node really cares about being a welcoming environment, and it would be a shame if we missed your insights into the problem (to be clear, I'm completely serious here) because of the way the discussion is conducted.

I realize it's a little unfair to ask you to have the discussion in the particular way we're asking, and that you're entirely within your rights to refuse - but I would much prefer that you stick around :)
