
Dependency Extraction Webpack Plugin impaired DevX #35630

Open
tomalec opened this issue Oct 14, 2021 · 32 comments
Labels
Developer Experience · [Tool] Dependency Extraction Webpack Plugin · [Type] Enhancement

Comments

@tomalec
Contributor

tomalec commented Oct 14, 2021

Forgive me for the generic title, but I'd like to tackle many interconnected issues related to Dependency Extraction Webpack Plugin (DEWP) and its DevX, as I believe we need a holistic view to solve those.

TL;DR
Plugin developers have no control over, or even introspection into, the extracted dependencies: not only the versions, but also which packages are extracted. This significantly hurts the quality of our products and increases the maintenance cost.

I'm trying to gather the problems and ideas here, so we can refer to this issue from GH issues and PRs with potential solutions (like #35106 (comment)) without losing the overview of the problem.

Context

I'm a WooCommerce (WC) plugin developer, so the perspective below may be WC-oriented. However, I believe the problems stated here are not unique to WooCommerce or my plugin; they apply to any other WordPress plugin. WooCommerce just adds another layer of dependencies.

Dependency Extraction Webpack Plugin (DEWP)

AFAIK, the main reason we use DEWP is to avoid delivering duplicate JS packages from plugins to a single WordPress (WP) site. It reduces the network and CPU cost for users, and saves us from a number of problems related to packages with side effects, which may collide when imported/used twice.
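For readers unfamiliar with it, wiring DEWP into a build is typically a one-line plugin entry. The sketch below uses illustrative entry and output paths:

```js
// webpack.config.js (sketch) — DEWP replaces imports of known packages
// (e.g. @wordpress/*) with references to the globals WordPress registers,
// and emits an index.asset.php file listing the extracted script handles
// so PHP can enqueue them as dependencies.
const DependencyExtractionWebpackPlugin = require( '@wordpress/dependency-extraction-webpack-plugin' );

module.exports = {
	entry: './src/index.js',
	output: {
		path: __dirname + '/build',
		filename: 'index.js',
	},
	plugins: [ new DependencyExtractionWebpackPlugin() ],
};
```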

I see dependency extraction as a nice way to address that, and I believe that customers' costs and experience should be the top priority when looking for a solution.

But the way our tool works introduces a number of struggles for developers. IMHO, the impact is severe. I noticed a number of people working on different plugins and products discussing their struggles with DEWP. To somewhat evaluate the impact in the plugin I work on, I marked all the issues affected by this problem (though that doesn't capture the everyday frustration).

Where are we at?

Let’s draw a picture of what dependencies and problems we face while developing a plugin.

Naturally, we have (let’s call them “main”) dependencies: WordPress and WooCommerce. We have a strict requirement – driven by actual merchant/customer expectations – to support at least a few versions behind, not only the latest, as updating the stack is a cost and burden for our merchants. However, given the finite resources we have, we’d rather balance it so as not to support too many combinations with legacy code. We have a defined process to track updates of those, and a tool (wc-compat-checker) to support the PHP side of it. But it's still manual labor, and we do not have much tooling to support the person doing a compatibility check on the JS side.

We have “granular” dependencies – individual npm packages stated in package.json. There are also composer dependencies, but I’m not sure they are as problematic. I guess there is no customer-driven requirement for those; I doubt we get customer requests like “Hey, I’d like AutomateWoo to work with my store based on specific versions of @woocommerce/ packages”. However, we do have a constraint that comes from using many dependencies – they need to be cross-compatible with each other and with the WordPress & WooCommerce versions in use.

Then we have dev dependencies, which the merchant should not care about at all. Unfortunately, they are also tied to the WP/WC versions in use.

Problems

The above creates three fundamental problems:

  1. If every plugin bundled all of its dependencies, there would be a lot of duplicates, pushing the unnecessary cost of transfer, parsing, and execution onto the merchant and their customers.
  2. The number of dependencies multiplied by the range of supported WP & WC versions is a maintenance cost for developers, who have to track, update, and resolve compatibility conflicts.
  3. Sometimes the versions used or supported are not easy to find and match, like which @woocommerce/components version is used by a given WC version.

To solve/mitigate the first problem, there are @wordpress/ and @woocommerce/dependency-extraction-webpack-plugin. However, the way they work makes development quite nondeterministic: which packages are extracted, and which main or granular dependency is actually used at runtime? Packages are extracted and used blindly, regardless of version, without even reporting what was extracted or what is guaranteed to be a specific version. That creates a bunch of other problems:

  1. As a plugin developer, I can’t assert/specify the version of a granular dependency, like @woocommerce/components, unless I carefully, manually, and deeply curate the entire granular dependency tree across all (minor and patch) versions of the main dependencies. Even then, all I get is still a range of versions.
    This is very time-consuming labor that needs to be done very often: when adding a dependency, when updating anything, when debugging an issue, when checking compatibility. Currently, there is no tool to support it, nor even a definitive list of resources to track.
  2. As a plugin developer, I don't even know which dependencies are extracted, as the list is maintained in the package repo and is not reported while bundling.
    That may result in unexpected behavior and errors. Checking it requires even more manual effort: digging into the source code of DEWP at a given version and manually comparing the package list.
  3. As a merchant, support engineer, or dev investigating an issue, I cannot easily and precisely tell which versions of packages are used in a given setup/instance.
    That makes reproducing reported problems harder and more time-consuming, eventually affecting customer experience.
  4. Running automated unit tests can give a false impression that the specced behavior is covered, even though it may not be true: the unit tests run against the granular dependency versions specified in the local package.json, while the versions run by the customer are totally different.
  5. In our CI, we would have to run our unit and E2E tests for all the combinations. That would cost us CI time, and the set of combinations is hard to specify and pretty volatile over time. Moreover, we cannot even state which exact combination is run, to be able to assert at least one.

In a Slack discussion, @nerrad suggested implementing an L-2 policy: supporting only up to two versions behind the latest. This naturally limits the range and number of combinations, but does not tackle the problems themselves. Plus, it may fail if any main dependency downgrades one of its own dependencies, or introduces a backward-incompatible change.

Cost of status quo

In my opinion, the above impairs not only DevX but also innovation and the quality of our products.

Innovation

  • The delay of many versions between a feature's release and our ability to use it slows down adoption.
  • A dependent project cannot early-adopt new features, nor instantly upstream local experiments to the components it uses.

Quality

  • If dependents cannot switch to the latest version, they don't use it, so they don't test it in time. Two versions later, feedback on some fundamental issues may come too late.
  • Bugs become stale and turn into the “features” we need to support for backward compatibility.
  • If a dependency's version becomes too hard to manage, and a dev gets too frustrated working around it, they may bundle it in the local package, forcing the customer to pay for a DevX problem we created ourselves.
  • If a dependent project cannot use the latest, fixed version, it needs to incorporate workarounds, which may have side effects, bugs of their own, and maintenance costs, reducing the resources available for other areas of the project.
  • …and workarounds usually take space and CPU time, which again pushes the cost onto the customer.

Ways to go

I think when looking for a solution to the problems stated above, we could take a few (non-exclusive) strategies:

  1. Replace DEWP with some other solution
  2. Change the way DEWP works
  3. Improve developer experience and quality assurance with the DEWP as it is, with tooling around and minor tweaks.

Personally, I'd start with the last one, as it's the cheapest to begin with (in time, effort, and the chaos it would generate).

//cc all folks who I noticed discussing related ideas: @ecgan, @roo2, @scinos, @jsnajdr, @fullofcaffeine, @noahtallen, @sirreal, @gziolo, @mcsf, @nerrad


Solutions

I don’t have a precise, well-established solution for the above; I’d rather start a discussion. But here are a few ideas I managed to gather, or that came to my mind. I put them in separate comments for easier reference.

@tomalec
Contributor Author

tomalec commented Oct 14, 2021

L-2 policy

An L-2/L-n policy means that the plugin should support only versions up to n behind the current one.
For WordPress & WooCommerce, this would certainly help limit the number of combinations we test and check. I'd add one additional rule, so as not to limit innovation: use root fixes instead of working around them over and over again. I’d phrase it as: “We support versions of a main dependency no older than n, but occasionally we may still use newer versions of granular dependencies.”

It's to be implemented on the plugin level only.
It helps with Problems 2-6, but puts pressure on merchants to update frequently, so we need to balance the n. Plus, it does not solve the problem itself; it only limits its range.

I think it's a nice policy to start with, but we should not stop here, as we still face all the DevX frustration even with n=2.

@tomalec
Contributor Author

tomalec commented Oct 14, 2021

Build/automate WP, WC, plugin version maps

I think all the data is there (somewhere); we just need a nice and automated way to gather it all in a single place. As mentioned in this comment by @jsnajdr:

We could be maintaining machine-readable information about which WordPress version ships which versions of the Gutenberg plugin and the NPM libraries

I’d only add: make it both machine- and human-readable, and cover WooCommerce as well.

Conceptually, it’s gathering all the package.json files for the respective WP, WC, DEWP, and plugin versions and putting them in a nice table. I’d also add the WP versions supported by each specific WC version.

Thanks to that, developers, support engineers, merchants, and further tools could inspect which versions are used where, and decide based on that.
It would help in

  • debugging issues, to know in which version of a package we should look for a bug.
  • support engineers communicating with customers, as one could easily check that a given plugin uses a granular dependency far behind the WP in use, which may be the cause of problems.
  • developing a plugin, as a developer could check:
    • which packages were actually extracted, and where additional caution is needed
    • which granular dependency version ranges are supported across the range of WP & WC versions they support, to know which features can be used and which must be worked around
    • the versions in the upcoming WP/WC release, to anticipate and fix compatibility problems before they reach support engineers
  • testing a plugin, as we could set up automated tests for the combinations that would actually be in use.
  • implementing further solutions for the dependency problems, as build- and run-time code could consume that information to decide which code to import/execute.
  • gathering data about the scale of the version problems, as we would finally see if and how far the ranges diverge. We could even track/report that data on production builds.
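Concretely, such a machine-readable map might look like the sketch below. The shape is made up for illustration, and the version placeholders stand in for real data that the automation would fill in:

```json
{
	"wordpress": {
		"5.8.1": {
			"packages": {
				"@wordpress/components": "x.y.z",
				"@wordpress/data": "x.y.z"
			}
		}
	},
	"woocommerce": {
		"5.9.0": {
			"supports-wordpress": ">=5.6",
			"packages": {
				"@woocommerce/components": "x.y.z"
			}
		}
	}
}
```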

It solves 3, 5, 6; helps with 2, 4.
Requires action/changes in WordPress, WooCommerce, or DEWP repos.

I think it's something we should start with, as it would give us an insight and data to reason about the problem.

@tomalec
Contributor Author

tomalec commented Oct 14, 2021

Make Dependency Extraction Plugin handle dependency versions

I believe we have all the data in place to solve the problem, assuming we already have a way to get granular dependencies’ versions for a given WordPress version. The local package.json gives us the versions of our dependencies; the supported WP version (range) is also given in the repo.

I’m not very experienced with webpack. But to me, the feature that DEWP brings is similar to what native import maps do: “check what dependencies are already available in WP/WC and map those imports to external modules” (instead of adding them to, and looking them up in, the local bundle).

What’s cool about native import maps (besides the fact that they’re native, already available for free in Chromium, and don’t require us to complicate our tools and stack) is that they seem to solve the multiple-versions problem – with scopes. So if we someday switch to native ESM, we could use a single map for all plugins, without each plugin needing to add DEWP to its tool stack.

I’ll use import maps syntax as I believe it’s clear and declarative enough to express the behavior we’d like to have:

Consider that the given WooCommerce version uses

"@woocommerce/components": "5.1.2",
"@woocommerce/currency": "3.1.0",
"@woocommerce/number": "1.2.3",

Then the import map for a plugin that would like to use shared dependencies would look like

"imports": {
    "@woocommerce/components": "/wp-content/…/woocommerce-admin/…/components.js",
    "@woocommerce/currency": "/wp-content/…/woocommerce-admin/…/currency.js",
    "@woocommerce/number": "/wp-content/…/woocommerce-admin/…/number.js",
},

That’s what we have today. And AFAIK, that’s where DEWP functionality ends in terms of versions. But hopefully, we can add a bit of checking logic there.

Overlap

If our plugin uses

"@woocommerce/components": "^5.1.1",
"@woocommerce/number": ">=1.1.1 <1.2.0",
"fast-json-patch": "^3.0.0",

Then all the shared dependencies are there, so the bundle will include only fast-json-patch from gla/node_modules/, and the @woo… dependencies will be mapped as above.

Newer version

If our plugin uses

"@woocommerce/components": "^5.1.1",
"@woocommerce/number": "^2.0.0",
"fast-json-patch": "^3.0.0",

The @woo…/components version matches, so it will be removed from the bundle. But …/number does not, so it will be added to the bundle together with fast-json-patch, and the import map will be modified to look like this:

"imports": {
    "@woocommerce/components": "/wp-content/…/woocommerce-admin/…/components.js",
    "@woocommerce/currency": "/wp-content/…/woocommerce-admin/…/currency.js",
    "@woocommerce/number": "/wp-content/…/woocommerce-admin/…/number.js",
},
"scopes": {
    "/wp-content/plugins/gla/": {
        "@woocommerce/number": "/wp-content/…/gla/…/number.js",
    },
},

(So imports originating from …/gla/ in the GLA bundle will point to the GLA-specific version.)

Native import maps are resolved at run time, so they could use the WP/WC dependency maps from the currently running setup.
With webpack, we need to do it at build time, so we would have to consider the ranges of granular dependency versions across the range of main dependencies. But the logic stays the same.
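The build-time decision could be sketched roughly as follows. The simplified caret matcher is an assumption for illustration; real tooling would use a full semver implementation:

```javascript
// Sketch: for each granular dependency, externalize it when the
// platform-provided version satisfies the plugin's declared range,
// otherwise keep it in the plugin bundle.

// Parse "MAJOR.MINOR.PATCH" into numbers.
function parse( version ) {
	const [ major, minor, patch ] = version.split( '.' ).map( Number );
	return { major, minor, patch };
}

// Check a concrete version against a caret range like "^5.1.1":
// same major, and not older than the stated minor.patch.
function satisfiesCaret( version, range ) {
	const want = parse( range.replace( /^\^/, '' ) );
	const have = parse( version );
	if ( have.major !== want.major ) return false;
	if ( have.minor !== want.minor ) return have.minor > want.minor;
	return have.patch >= want.patch;
}

// Decide, per dependency, whether to extract or bundle.
function planExtraction( platformVersions, pluginRanges ) {
	const plan = { externalized: [], bundled: [] };
	for ( const [ pkg, range ] of Object.entries( pluginRanges ) ) {
		const provided = platformVersions[ pkg ];
		if ( provided && satisfiesCaret( provided, range ) ) {
			plan.externalized.push( pkg );
		} else {
			plan.bundled.push( pkg );
		}
	}
	return plan;
}

// The "Newer version" example from above:
const plan = planExtraction(
	{
		'@woocommerce/components': '5.1.2',
		'@woocommerce/currency': '3.1.0',
		'@woocommerce/number': '1.2.3',
	},
	{
		'@woocommerce/components': '^5.1.1',
		'@woocommerce/number': '^2.0.0',
		'fast-json-patch': '^3.0.0',
	}
);
console.log( plan );
// → components is externalized; number and fast-json-patch stay in the bundle.
```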

It solves 1, 2, 4, 5, 7, 8.
Requires action/changes in WordPress and DEWP repos.

@tomalec
Contributor Author

tomalec commented Oct 14, 2021

Add runtime import map shim/resolver

This one could be the trickiest to implement, but if a given WordPress version knows its own granular dependency versions and receives the plugin's map, then in theory it knows whether the versions match. If so, it would do what it does today and serve the import from the WP bundle; if not, it could point back to the plugin's bundle. So the "extended" bundle would be requested only when the currently running WP environment does not have a matching dependency.
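A hypothetical sketch of that resolver: the platform exposes a registry of its own package versions and URLs, and the plugin supplies its declared ranges plus the base URL of its fallback bundle. All names, the registry shape, the version check, and the URL layout below are illustrative assumptions, not an existing API:

```javascript
// Extremely loose version check (same major only); a real resolver
// would use proper semver range matching.
function majorMatches( version, caretRange ) {
	return version.split( '.' )[ 0 ] === caretRange.replace( /^\^/, '' ).split( '.' )[ 0 ];
}

// Build an import map: matching entries reuse the platform copy,
// mismatches are scoped back to the plugin's own bundle.
function buildImportMap( platformRegistry, pluginRanges, pluginBase ) {
	const imports = {};
	const scope = {};
	for ( const [ pkg, range ] of Object.entries( pluginRanges ) ) {
		const provided = platformRegistry[ pkg ];
		if ( provided && majorMatches( provided.version, range ) ) {
			imports[ pkg ] = provided.url; // reuse the shared copy
		} else {
			// Fall back to the copy shipped inside the plugin bundle.
			scope[ pkg ] = pluginBase + pkg.split( '/' ).pop() + '.js';
		}
	}
	return { imports, scopes: { [ pluginBase ]: scope } };
}

const map = buildImportMap(
	{
		'@woocommerce/components': { version: '5.1.2', url: '/wp-content/plugins/woocommerce/components.js' },
		'@woocommerce/number': { version: '1.2.3', url: '/wp-content/plugins/woocommerce/number.js' },
	},
	{ '@woocommerce/components': '^5.1.1', '@woocommerce/number': '^2.0.0' },
	'/wp-content/plugins/gla/'
);
console.log( map );
// components resolves to the shared copy; number is scoped to the plugin's own bundle.
```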

However, I don't know enough about how WP handles all the scripts to propose something more precise, or to judge how doable it is.

It may solve all the problems, but would require a lot of changes across the stack.

@gziolo added the [Tool] Dependency Extraction Webpack Plugin, [Tool] WP Scripts, and Developer Experience labels on Oct 14, 2021
@scinos
Contributor

scinos commented Oct 14, 2021

Just spitballing some ideas:


Sounds like Webpack Module Federation could help here. The official documentation says

Many applications share a common components library which could be built as a container with each component exposed. Each application consumes components from the components library container. Changes to the components library can be separately deployed without the need to re-deploy all applications. The application automatically uses the up-to-date version of the components library.

If we replace “application” with “plugin” and “components” with “main dependencies”, it really sounds similar to the problem described above.


Maybe we can replace DEWP with Webpack DLLs, and publish npm packages with the DLLs for specific WP versions (e.g. @wordpress/wp-dll). Such a package would contain a pre-built DLL that plugin authors must include in their webpack compilation; it would take care of extracting the main dependencies.

This package could also provide a set of peerDependencies with the exact versions of the expected libraries (i.e. like your “gathering all package.jsons for respective WP, WC, DEWP, and plugin versions and putting them in a nice table”, but in JSON format).

So MyPlugin depends on a specific version of @wordpress/wp-dll. Via peerDependencies, that forces my plugin to also depend on specific versions of the other dependencies (e.g. a specific @woocommerce/components version). This allows me to run tests with the exact versions that will be used in prod. It also includes a DLL config that I need to plug into my webpack config, so it knows which packages to extract.
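Such a hypothetical @wordpress/wp-dll manifest might pin the platform's package versions via peerDependencies. In this sketch, every name and version placeholder is made up:

```json
{
	"name": "@wordpress/wp-dll",
	"version": "5.8.0",
	"description": "Hypothetical pre-built DLL + version manifest for a specific WP release",
	"peerDependencies": {
		"@wordpress/components": "x.y.z",
		"@woocommerce/components": "x.y.z"
	}
}
```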

Although that would mean I need to maintain several "lines" of MyPlugin: one per supported @woocommerce/components version... So maybe this is not as good as it sounded in my head when I started this comment :)

@fabiankaegy
Member

I also have one thing that I think causes a lot of pain points when it comes to working with the DEWP. And that is the inability to tree shake. If you use a singular function from lodash for example it adds the dependency lodash and therefore the entire library gets loaded. That is a lot of overhead for a small, simple function. And since the request is an external, a tool like WebpackBundleAnalyzer won't know about it, and therefore it can be hard to spot the actual cost of importing a package.

Regarding your comments, I actually like the idea of a runtime import map shim/resolver quite a lot. Core currently doesn't ship multiple versions of the packages, and I'm not sure whether it would be wise to do so. Of course, you could always bundle all your dependencies with your plugin, but as you mentioned, that leads to the same code being imported by multiple plugins and therefore a lot of overhead. So the idea of only loading a bundle if the version isn't a match is very intriguing. However, I have no clue how feasible it would actually be to implement something like it.

@gziolo
Member

gziolo commented Oct 18, 2021

There are also some related issues regarding WordPress dependencies and their relation to the published npm packages:

In practice, it is more complex because sites with the Gutenberg plugin installed will have different versions of the same script dependencies. In the case of the Gutenberg plugin, it not only changes every two weeks, but it is also acceptable to remove experimental and unstable APIs after 3 plugin releases (a few weeks). It's discouraged to use experimental/unstable APIs but we don't have control over what plugins use. The other challenge is that it isn't mandatory to publish to npm the version of WP packages that the Gutenberg plugin uses. In addition to that, during the WP major release cycle, we only cherry-pick bug fixes from Gutenberg releases and publish them to npm. Whatever this discussion lands on, it might only work with WordPress core, but is close to impossible to apply the same techniques based on the npm packages to sites using the Gutenberg plugin.

@jsnajdr
Member

jsnajdr commented Oct 18, 2021

I also have one thing that I think causes a lot of pain points when it comes to working with the DEWP. And that is the inability to tree shake. If you use a singular function from lodash for example it adds the dependency lodash and therefore the entire library gets loaded.

@fabiankaegy If the lodash package is externalized, it means that the compiled JS is going to run on a WordPress page and will be loaded with wp_enqueue_script() that declares a lodash dependency. In other words, lodash comes from the WordPress platform which provides it only as the entire library, not as individual functions. There's no opportunity for any tree shaking.

Tree shaking is available only for packages that are bundled into the compiled JS. That's the opposite of externalization. So, if you really want to bundle the one or two Lodash functions that your script is using, the solution is to opt-out from lodash externalization for that script.
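If I read the DEWP options correctly, that opt-out could look like the following webpack config tweak. `requestToExternal` is a documented DEWP option; I'm assuming here that a defined falsy return value skips externalization, while returning `undefined` falls through to the default handling:

```js
// Sketch: keep lodash in the local bundle (so webpack can tree-shake it)
// while leaving the default externalization in place for everything else.
new DependencyExtractionWebpackPlugin( {
	requestToExternal( request ) {
		// A defined, falsy return prevents externalizing this request;
		// returning undefined falls through to the default handling.
		if ( request === 'lodash' ) {
			return false;
		}
	},
} );
```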

@fabiankaegy
Member

I also have one thing that I think causes a lot of pain points when it comes to working with the DEWP. And that is the inability to tree shake. If you use a singular function from lodash for example it adds the dependency lodash and therefore the entire library gets loaded.

@fabiankaegy If the lodash package is externalized, it means that the compiled JS is going to run on a WordPress page and will be loaded with wp_enqueue_script() that declares a lodash dependency. In other words, lodash comes from the WordPress platform which provides it only as the entire library, not as individual functions. There's no opportunity for any tree shaking.

Tree shaking is available only for packages that are bundled into the compiled JS. That's the opposite of externalization. So, if you really want to bundle the one or two Lodash functions that your script is using, the solution is to opt-out from lodash externalization for that script.

@jsnajdr Yeah, I do understand the technical reason for it :) I only wanted to raise it here because, since it happens behind the scenes, it is not obvious while you are developing, and is therefore a pitfall you can very easily fall into.

So we could either look at better ways of reporting the impact that your externalized imports will have on the end user, or find a technical solution that would allow more granular imports (I know this would be a very tricky and maybe impossible goal :))

So maybe it's just an addition to the CLI output, showing an estimate of the externalized packages' bundle size, or some other sort of reporting :)

@jsnajdr
Member

jsnajdr commented Oct 18, 2021

And so either we can take a look at better ways of reporting the impact that your externalized imports will have on the end-user

This kind of reporting is most useful if the developer can do anything about the reported issues, but here I'm afraid they can't do anything. It's a shortcoming of the WordPress platform. So, my plugin uses the @wordpress/components package, and the tool tells me that this package is a big monolithic blob -- there is nothing actionable about this report.

find a technical solution that would allow for more granular imports

The only viable solution is to create small modular packages instead of big monolithic ones. I'm afraid that anything else, like true tree shaking, is at odds with having a plugin architecture and modularity, where multiple blocks and plugins live together on the same page and share stuff.

@tomalec
Contributor Author

tomalec commented Oct 18, 2021

In practice, it is more complex because sites with the Gutenberg plugin installed will have different versions of the same script dependencies. In the case of the Gutenberg plugin, it not only changes every two weeks, but it is also acceptable to remove experimental and unstable APIs after 3 plugin releases (a few weeks). […] The other challenge is that it isn't mandatory to publish to npm the version of WP packages that the Gutenberg plugin uses. In addition to that, during the WP major release cycle, we only cherry-pick bug fixes from Gutenberg releases and publish them to npm.

This kind of reporting is most useful if the developer can do anything about the reported issues, but here I'm afraid they can't do anything. It's a shortcoming of the WordPress platform. So, my plugin uses the @wordpress/components package, and the tool tells me that this package is a big monolithic blob -- there is nothing actionable about this report.

In my opinion, it adds even more importance to this issue. It's as if the README of DEWP stated:

This allows JavaScript bundles produced by webpack to leverage WordPress style dependency sharing ~~without an error-prone process of manually maintaining a dependency list~~ by replacing your dependency with an unknown blob of code coming from an unknown source, at an unknown version, leaving no guarantee on the API, making the end result error-prone and nondeterministic, and hard to inspect manually or automatically.

Then, as a plugin developer, I would do everything I can to avoid it, so that I can assure not only the quality of my product, but also its security and the integrity of its data.


Whatever this discussion lands on, it might only work with WordPress core, but is close to impossible to apply the same techniques based on the npm packages to sites using the Gutenberg plugin.

Having it solved just for WordPress Core is already a step forward. I agree that comparing npm packages when there are no npm packages doesn't make sense, so "it's impossible to solve it with npm-package techniques only". That's why I proposed using import maps, which do not involve npm at all.

Plus, I hope that the way Gutenberg processes a release, and the way it uses/publishes the packages, is not set in stone, but something we can still discuss and potentially improve, if we agree it is suboptimal and impairs the quality of the ecosystem.


I believe it's not an impossible thing to solve.

I understand that, with the way the WordPress platform works, many plugins may load/overwrite scripts. But that still does not block us from building a solution that would give a plugin some assurance about its dependencies.

We know that the current DEWP, as it is, does not solve that. Now we need to find out whether we can improve it, or need something more.

WordPress is not a unique platform when it comes to the problem of delivering a set of shared dependencies and allowing the individual parties (plugins) to add more and overwrite some.

@tomalec
Contributor Author

tomalec commented Oct 18, 2021

This kind of reporting is most useful if the developer can do anything about the reported issues, but here I'm afraid they can't do anything. It's a shortcoming of the WordPress platform. So, my plugin uses the @wordpress/components package, and the tool tells me that this package is a big monolithic blob -- there is nothing actionable about this report.

I think that kind of reporting still has some value ;) Having it clearly stated that "Hey, by using DEWP, your @wordpress/components could become anything" could help the developer make a conscious decision whether to use it or not. Personally, it took me ~6 months of using DEWP to realize that the package I externalized is not the same one I expected.


Speaking of tree shaking, I think it's another problem to solve, at a deeper level of complexity and ROI. So far we are struggling to de-duplicate packages ("package-shaking", so to speak); tree-shaking what's delivered by WP against what my plugin needs was out of my scope.

I have a gut feeling that first we need assurance that the package we actually import is the one we expected to import, before we start shaking the unwanted bits out of those packages.

However, I see that while improving import management, we could provide more insights like

So maybe it's just an addition to the CLI output, showing an estimate of the externalized packages' bundle size, or some other sort of reporting :)

So the developers themselves could decide whether it's worth taking the risk of externalizing a big library, if it could be tree-shaken to something small enough to be bundled locally.


Tackling tree-shaking of dependencies already delivered by the platform, according to plugins' usage, would probably require a build phase after the plugin is activated, which I believe would be a major architectural shift in the ecosystem.

@scinos
Contributor

scinos commented Oct 18, 2021

The other challenge is that it isn't mandatory to publish to npm the version of WP packages that the Gutenberg plugin uses

Could you elaborate on that? Does that mean that if I use a specific version of a @wordpress/* package in my plugin, DEWP can replace it with a version that is not published anywhere?

@tomalec
Contributor Author

tomalec commented Oct 18, 2021

Does that mean that if I use a specific version of a @wordpress/* package in my plugin, DEWP can replace it with a version that is not published anywhere?

AFAIK, yes. At least for WooCommerce woocommerce/woocommerce-admin#7628

But I'd appreciate a more elaborate explanation too. I'm curious what the rationale behind it is, and what value it brings. From a plugin developer's perspective, I see a lot of downsides.

@gziolo
Member

gziolo commented Oct 19, 2021

The other challenge is that it isn't mandatory to publish to npm the version of WP packages that the Gutenberg plugin uses
Could you elaborate on that? Does that mean that if I use a specific version of a @wordpress/* package in my plugin, DEWP can replace it with a version that is not published anywhere?

The webpack plugin (DEWP) doesn't replace import statements with different versions of the package in the build process controlled with webpack. Instead, it replaces import statements with references to wp.* globals:

import { ComponentA } from '@wordpress/components';

becomes something close to:

const { ComponentA } = wp.components;

So as long as the same public API is present in WordPress core, through the script registered under the wp-components handle that exposes the wp.components global, everything should be fine. If a new API gets introduced between versions, the most popular way to handle it is to run the code conditionally until the API is present in all versions of WP that the plugin targets:

 return ComponentA ? <ComponentA /> : null;

The same strategy is used in PHP code that plugins write for WordPress.

@nerrad
Contributor

nerrad commented Oct 19, 2021

WordPress is not a unique platform when it comes to the problem of delivering a set of shared dependencies and allowing the individual parties (plugins) to add more and overwrite some.

This really stood out to me. I'm inferring from your statement that you have some examples of other platforms in mind (on a similar scale as WordPress/Gutenberg) that have wrestled with this problem. If so, what do these other platforms do to solve the problem? Are there things we could learn from those examples?

@jsnajdr
Copy link
Member

jsnajdr commented Oct 19, 2021

"Hey, by using DEWP, your @wordpress/components could become anything"

I sense there is some deep misunderstanding about the purpose of DEWP and what it means to "import @wordpress/components".

When using it in a WordPress plugin, the @wordpress/components import doesn't really import anything from node_modules/@wordpress/components. That package doesn't even need to be there. It's convenient to have it there maybe for typechecking and unit testing, but then it's something like a mock implementation, not a real one.

The meaning of the import is that the code is running inside some environment that already contains the components package. These packages are provided by WordPress Core, or by any other plugin, like Gutenberg or WooCommerce. And yes, WordPress plugins have the capability to override packages and provide their own implementation.

The import means something like a global.getPackage( '@wordpress/components' ) function call, nothing else. There is some global registry that provides them. And the webpack build does exactly this kind of transformation to the imports.
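That transformation can be sketched as a simple request-to-global mapping; this is an illustration of the idea, not DEWP's actual source:

```javascript
// '@wordpress/html-entities' -> reference to the wp.htmlEntities global.
const camelCase = ( s ) =>
	s.replace( /-([a-z])/g, ( _, c ) => c.toUpperCase() );

function wordpressRequestToExternal( request ) {
	const prefix = '@wordpress/';
	if ( ! request.startsWith( prefix ) ) {
		return undefined; // not externalized; webpack bundles it
	}
	// webpack externals accept an "object path" form like this.
	return [ 'wp', camelCase( request.slice( prefix.length ) ) ];
}

console.log( wordpressRequestToExternal( '@wordpress/components' ) );
// → [ 'wp', 'components' ]
console.log( wordpressRequestToExternal( '@wordpress/html-entities' ) );
// → [ 'wp', 'htmlEntities' ]
```

The compiled bundle then reads the global at runtime instead of shipping the package, which is why nothing from node_modules ends up in the output.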

It's similar to doing a native import from 'fs' on Node.js, or using some native API in an iOS app, like, to choose a completely random function, AVCaptureDevice.focusPointOfInterest. Of course, if I run the app on an old or incompatible version of Node or iOS, these API calls "could become anything" and they might work incorrectly or simply crash the app. I don't know, is there really anything surprising about that? The situation with importing WordPress APIs is exactly the same.

Could help the developer make a conscious decision whether to use it or not.

The plugin developer doesn't really have that choice, to bundle the platform packages. Just like they don't bundle the native fs module or the iOS AVFoundationFramework. These packages often contain singletons, e.g., @wordpress/components contains various React context providers or the map that connects Slots to Fills, and if you bundle your own copy, the plugin won't work.

Declaring "@wordpress/components": "17.0.0" in a package.json means "I'd like to use version 17 of the API", but this has two problems:

  • the build doesn't really check any compatibility. Maybe there are TS type definitions in the package that might trigger some errors, maybe unit tests reveal that there is API mismatch, but that's all and it's unreliable. Other than types and unit tests, the package in node_modules is completely ignored.
  • it's very hard to give any meaning to the number 17. It's not related to anything observable on an actual WordPress site. Neither to the Core version installed, nor to the version of the Gutenberg or WooCommerce plugin installed.

One additional problem is the documentation of the DEW plugin. The name describes its internal workings rather than the actual value it provides, and the README is not particularly lucid either.

So, I think working on these three problems could help us move forward?

@gziolo
Copy link
Member

gziolo commented Oct 19, 2021

@jsnajdr, thank you for a more detailed explanation of my previous comment #35630 (comment). I love the reference to Node.js API. This is exactly how we should think about it.

Other than types and unit tests, the package in node_modules is completely ignored.

It's also helpful for linting or hints in IDEs. When using the build tools with DEWP configured to handle all default externals, from the production code perspective, you don't need those packages to be installed in your project.

@scinos
Copy link
Contributor

scinos commented Oct 19, 2021

I totally understand the package will be provided by the host environment in production. I think that point is quite clear.

My original question is whether I can pull in the same dependency for my dev environment. There are many, many benefits to it, some of them already mentioned (unit testing, typechecking, IDE hints, linting...)

I assumed that the answer was "yes", that there was a NPM package @wordpress/[email protected] that contains the exact same code that the host will provide at runtime via global.getPackage( '@wordpress/components' ) or similar. Some of the solutions proposed above assume @wordpress/[email protected] exists, and it is only a matter of deciding how to surface that info to plugin developers (for example, using dist-tags as proposed in #24376)

However, this comment (and this issue) made me doubt it:

The other challenge is that it isn't mandatory to publish to npm the version of WP packages that the Gutenberg plugin uses.

I'm not sure I fully understand the comment, but it seems to suggest that @wordpress/[email protected] may not exist. That's why I asked for an elaboration.

@gziolo
Copy link
Member

gziolo commented Oct 19, 2021

I assumed that the answer was "yes", that there was a NPM package @wordpress/[email protected] that contains the exact same code that the host will provide at runtime via global.getPackage( '@wordpress/components' ) or similar. Some of the solutions proposed above assume @wordpress/[email protected] exists, and it is only a matter of deciding how to surface that info to plugin developers (for example, using dist-tags as proposed in #24376)

It's possible when working with WordPress core to match the same list of package dependencies that get externalized. However, the applicability is limited to the cases that we covered with @jsnajdr.

However, this comment (and this issue) made me doubt it:

The other challenge is that it isn't mandatory to publish to npm the version of WP packages that the Gutenberg plugin uses.

I'm not sure I fully understand the comment, but it seems to suggest that @wordpress/[email protected] may not exist. That's why I asked for an elaboration.

The block editor in WordPress core gets updated through npm packages, so the publishing is tied to that. However, for the Gutenberg plugin, all the source is there, so there is no need to publish to npm whenever a new version of the plugin gets released to the WordPress plugin directory. In principle, we can't match npm releases with Gutenberg plugin releases. Sometimes we could, but it is completely unreliable, in particular during the beta/rc release cycle for a WordPress major release.

@scinos
Copy link
Contributor

scinos commented Oct 19, 2021

These packages are provided by WordPress Core, or by any other plugin, like Gutenberg or WooCommerce. And yes, WordPress plugins have the capability to override packages and provide their own implementation.

That would mean that the version of @wordpress/components provided by a host running WP 1.0.0 and GB 1.0.0 may be different from the version provided by a host running WP 1.0.0 and GB 2.0.0, right? (or any other plugin, of course). That's an interesting problem.

However, for the Gutenberg plugin, all the source is there so there is no need to publish to npm whenever a new version of the plugin gets released to the WordPress plugin directory

Is there anything stopping us from also publishing each package to npm at that point?

@jsnajdr
Copy link
Member

jsnajdr commented Oct 19, 2021

I totally understand the package will be provided by the host environment in production. I think that point is quite clear.

I also wanted to clarify that the plugin developer doesn't really have that much of a choice whether they want to bundle a @wordpress/* package or to use the one provided by the host environment. Bundling might work some of the time, but the only reliable approach is to externalize.

I'm not sure I fully understand the comment, but it seems to suggest that @wordpress/[email protected] may not exist.

If this really happens, and it happens often enough, it's just a matter of improving our NPM release discipline, isn't it? I don't know how often @gziolo and others publish packages to NPM and how that is synchronized with Gutenberg releases. In principle we can publish a matching set of NPM packages with every release, and have a perfect 1:1 version mapping between them.

That would mean that the version of @wordpress/components provided by a host running WP 1.0.0 and GB 1.0.0 may be different form the version provided by a host running WP 1.0.0 and GB 2.0.0, right?

Yes, that's exactly how the Gutenberg plugin works. WP 1.0.0 has a certain version of Gutenberg built-in (you can think of it as a LTS release) and installing the Gutenberg plugin overrides it completely, including all the wp-* scripts. Any plugin has the capability to override any script and register anything else under its symbolic name.

Then there's also the very complex WooCommerce plugin that uses the Core packages (potentially overridden by Gutenberg) and exposes its own set of packages that are available to plugins that extend Woo.

@tomalec
Copy link
Contributor Author

tomalec commented Oct 19, 2021

@nerrad

I'm inferring from your statement that you have some examples of other platforms in mind (on a similar scale as WordPress/Gutenberg) that have wrestled with this problem. If so, what do these other platforms do to solve the problem? Are there things we could learn from those examples?

My experience so far has been mostly with the native JS and HTML/Web Components world, but I believe that the principle of the problem is the same regardless of the content of the dependencies. And we don't have to limit ourselves to a ready-made product in the form of a webpack plugin. The case we have is:
A platform that delivers some functionality itself and allows multiple vendors to contribute their code, to be run within it. Code that consumes the features delivered by the platform, shares dependencies with it, but also provides its own: its own dependencies, or different versions of the same dependencies.

If we agree that this is the problem we are talking about, then yes, I did work on implementing a similar platform myself. It didn't reach the scale of WordPress when I was there. But I can name a few other examples of, IMHO, big enough scale.

  • AFAIK Salesforce delivers a platform, where multiple parties use the shared web components, but some parties can use different versions in their scope. (see scoped custom element registries below)
  • I believe a number of enterprise-grade platform providers share the same concern. The community of contributors to their platform may not be as open as ours, but I believe the number of internal and external parties brings the exact same problem.
  • Even Node, already mentioned here, could give some examples. I can use an API that's already delivered. Then I can easily check if a given feature is available in a targeted version (range) of Node, which addresses one of the pain points of the WordPress solution I stated here. If I want to specifically use something newer, I add a package or a polyfill. There are efforts to make it even more convenient by creating standard libraries to be imported, instead of relying on the problematic global scope. See https://github.com/tc39/proposal-built-in-modules, https://github.com/WICG/import-maps
    When I need something else, I add a package dependency, and then NPM covers its dependencies, its platform requirements, etc.
  • Web/HTML is such a platform. I hope the scope is now big enough to compare to WordPress ;). It delivers features, and allows overwriting them globally or locally, importing dependencies, re-mapping them, intercepting them, etc. Many parties can contribute together. And the adoption is enormous ;)
    As a web developer, I may say that my code targets a given spec/browser version. Still, delivering the code to the browser, I do not have full control over what actual version it is, what extensions are there, or what other code is running. But if I say I target Chrome 66+, at least I know which version of the Web Components API I can use. I can assert that, test it, etc. I can use the API that's delivered, or add a polyfill.
    Now, getting more granular: if I want to use an HTML (custom) element, I can either use what's already delivered by the application platform in the global registry, or provide my own definitions. I can use a custom version of something that's already delivered, by registering it in a scoped registry, and then use it in the scoped part. I can scope a part of my JS (with JS modules) and part of my HTML (with Shadow DOM), to be safe from the other parties.

We are writing in JS and HTML, and both languages give us the primitives to import dependencies, to create scopes, to run code within those scopes, or to overwrite higher scopes. They give us the primitives to control the scope as well as to invert this control. So, the way I see the problem, it's not about hard technical limitations, but about what we do, how we do it, and what we communicate back to plugin developers like me, or to automated tools.
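As a small, concrete illustration of the import-maps idea linked above: the page (i.e., the platform) can decide what a bare specifier resolves to. The path below is invented purely for illustration:

```html
<script type="importmap">
{
	"imports": {
		"@wordpress/components": "/wp-includes/js/dist/components.js"
	}
}
</script>
```

With such a map, `import '@wordpress/components'` in plugin code would resolve to whatever the platform ships, while remaining an ordinary, introspectable module import rather than a build-time global substitution.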

@tomalec
Copy link
Contributor Author

tomalec commented Oct 19, 2021

@jsnajdr

"Hey, by using DEWP, your @wordpress/components could become anything"

I sense there is some deep misunderstanding about the purpose of DEWP and what it means to "import @wordpress/components".

I think you're right. Probably, there is.

Initially, I thought DEWP was a convenient plugin to optionally use to reduce the bundle size.
My understanding came from reading its README

This webpack plugin serves two purposes:

  • Externalize dependencies that are available as script dependencies on modern WordPress sites.

I read it as, "If WP happens to have your dependency available, it will remove it from the bundle and use that instead". To me, @wordpress/components: 1.2.3 is such a dependency, as that's what I stated in my dependencies list. So the NPM package named @wordpress/components at version 1.2.3, not some other package at an unknown version.

  • Add an asset file for each entry point that declares an object with the list of WordPress script dependencies for the entry point. The asset file also contains the current version calculated for the current source code.

TBH, this one is still pretty unclear to me. But I read it as, "It will create a file that will list the WP script dependencies", whatever a "script dependency" is. I don't know why I'd need it, but it still looks hazardous. As for the "calculated version", I totally don't get what it is and why it would even matter to me. Is it calculated for the source code of my plugin, of its deps, of the "WP script deps"? For sure it's not the version I was looking for, one that matches my plugin dependencies in any way.

This allows JavaScript bundles produced by webpack to leverage WordPress style dependency sharing without an error-prone process of manually maintaining a dependency list.

Cool, so it will reduce the chance of error, not increase it, right? Especially when it comes to me manually maintaining the version I run on my development/test environment. 🤔

From what you stated right here, I was really far from the truth.
As it seems:

  • it's mandatory in some way, which wasn't stated to me before. And a number of plugins I contributed to are not using it, or are excluding a number of packages whose version we would like to assert.
  • It does not work on my plugin's dependencies, but on internal handles that happen to share the same name occasionally. (see The source of @woocommerce/settings processed by DEWP is confusing woocommerce/woocommerce-admin#7810)
  • It replaces the dependency with a global, which was surprising, and introduces a risk I usually don't think about at all when I use an import ModuleSpecifier statement.
  • It externalizes dependencies that are not even available, i.e., it externalizes [email protected] even when it's not available, but there is [email protected] instead.
  • "makes JS bundles produced by webpack require an error-prone process of manually inspecting a dependency list"

When using it in a WordPress plugin, the @wordpress/components import doesn't really import anything from node_modules/@wordpress/components […]
The import means something like a global.getPackage( '@wordpress/components' ) function call, nothing else. There is some global registry that provides them. And the webpack build does exactly this kind of transformation to the imports.

Then, to me, having dependencies: { '@wordpress/components': '1.2.3' } is irrelevant and only introduces confusion, as my plugin would most likely not use this version, and possibly not even this package.

Maybe we should really recommend using dist-tags, and state something that could at least happen: "The WordPress 5.8 global '@wordpress/components' package will be used", so dependencies: { '@wordpress/components': 'wp-5.8' }.
Or even go further, to make clear that import '@wordpress/components' will not import the '@wordpress/components' NPM package, but a WP instance's package from a run-time global registry: call it explicitly something like import 'WP-runtime/components' and do not make an NPM package of that name.


if I run the app on an old or incompatible version of Node or iOS, these API calls "could become anything" and they might work incorrectly or simply crash the app. I don't know, is there really anything surprising about that? The situation with importing WordPress APIs is exactly the same.

To me, the difference is that when I run an old version of Node, I can check what version of the API is there. The problem stated in the OP is that with WordPress, even though I know the (test/dev/customer) environment is running WP version 5.8.1 and Gutenberg 1.2.3, tracking down which version of @wordpress/components.Card is there requires a few hours of manual digging through releases and git history, sometimes to eventually find out that it is some cherry-picked git commit, which I could not easily install on my test env.


The plugin developer doesn't really have that choice, to bundle the platform packages. Just like they don't bundle the native fs module or the iOS AVFoundationFramework. These packages often contain singletons, e.g., @wordpress/components contains various React context providers or the map that connects Slots to Fills, and if you bundle your own copy, the plugin won't work.

That's something really new to me, as I stated above. In a year of developing a plugin that uses @wordpress/components, I didn't find any documentation that states: "You should not bundle this package; it's included in WP and has side effects that shouldn't be duplicated". Its README still suggests installing it locally and does not mention DEWP. I bet I'm not the only one, as there were a number of people coding and reviewing configs like:
https://github.com/woocommerce/google-listings-and-ads/blob/develop/webpack.config.js#L5-L16
https://github.com/woocommerce/pinterest-for-woocommerce/blob/develop/webpack.config.js#L4-L11


Declaring "@wordpress/components": "17.0.0" in a package.json means "I'd like to use version 17 of the API", but this has two problems:

  • the build doesn't really check any compatibility. Maybe there are TS type definitions in the package that might trigger some errors, maybe unit tests reveal that there is API mismatch, but that's all and it's unreliable. Other than types and unit tests, the package in node_modules is completely ignored.
  • it's very hard to give any meaning to the number 17. It's not related to anything observable on an actual WordPress site. Neither to the Core version installed, nor to the version of the Gutenberg or WooCommerce plugin installed.

And to me, that's the problem I think should be addressed: either on one end, by giving 17 a meaning, or on the other, by replacing the meaningless, confusing number with something applicable to this case.


One additional problem is the documentation of the DEW plugin. The name describes its internal workings rather than the actual value it provides, and the README is not particularly lucid either.
So, I think working on these three problems could help us move forward?

👍👍 I think I already expressed in this comment, how confused I get with the current docs :)

I'd love to find the value and reasoning in the README. I like to read in the docs why I need it and what it does for me/my project, rather than what it does to the code and how it does it.

@tomalec
Copy link
Contributor Author

tomalec commented Oct 19, 2021

for the Gutenberg plugin, all the source is there, so there is no need to publish to npm whenever a new version of the plugin gets released to the WordPress plugin directory. In principle, we can't match npm releases with Gutenberg plugin releases. Sometimes we could, but it is completely unreliable, in particular during the beta/rc release cycle for a WordPress major release.

Is there a reason/value for not releasing NPM packages when releasing Gutenberg? Or was there simply no need to put in the effort to sync those?

I hope the comments from @scinos and me here give at least some reason to consider it in the future.

@noahtallen
Copy link
Member

Yeah, compatibility is definitely a big issue with the current approach. There is clearly an expectation (from JS developers) that one can rely on package.json to define compatible versions, but this expectation is not met by DEWP (by design :)). In a way, @wordpress/ packages are more like peer dependencies, provided not by a consuming application but by the WordPress environment. Developers specifying peer deps would have the expectation of "I need to support multiple versions of this package." That expectation is just not the same when you write "I support only wordpress components v17".

Developers, I think, approach @wordpress/ dependencies first from the npm side of things. This makes sense, because DEWP makes you think about it from an npm-first point of view by design (especially devs who are starting to work on a pre-existing project). But this means that developers do not learn/understand that DEWP and npm are simply an enhancement on top of WordPress script enqueues. Ideally, a dev would learn about it from the WordPress point of view first, and then understand that DEWP enhances local development. But the default way to import and specify @wordpress/ packages means that devs come away with the incorrect concept that it works with npm primarily.

As a result, I think the way we specify which packages are provided is not ideal. We want developers to clearly understand they need to support a version range. (And even more specifically, a range of WordPress or Gutenberg plugin versions, not a range of npm package versions.) But the way DEWP interacts with npm/package.json makes that tricky/impossible.

Relatedly, it would be helpful to know at bundle time if your package is actually incompatible, such as when relying on an import only added in the latest version.

Obviously, the inherent design of DEWP makes it hard to solve those problems. But it's still worth thinking about these shortcomings, even if those shortcomings are inherent to the benefits of DEWP.

As a side note, the way package versions relate to the Gutenberg plugin + WordPress has always been confusing to me, especially if one is trying to use a package outside of the WordPress environment. There are several periods when you can't get any updates, because npm releases are frozen during the WP release cycle (though I've never been sure why that is).

@jsnajdr
Copy link
Member

jsnajdr commented Oct 26, 2021

(@tomalec sorry for the delayed reply, I've been sick for a few days last week)

Initially, I thought DEWP is a convenient plugin to optionally use to reduce the bundle size.

The trouble with this statement is that it's not completely false 🙂 When working with a dependency like lodash, then yes, you can bundle your own version, and due to tree shaking it can be even smaller and faster because you're not loading the entire library. Or it can end up bigger and slower, because another plugin on the page loads lodash anyway and our plugin is failing to reuse the shared version. And many @wordpress/components can be just like this, self-contained functions that can be duplicated without any danger.

Then there are other packages like @wordpress/hooks that are not libraries, but provide access to some shared global. Imagine that instead of writing window.localStorage.getItem() you need to import a system module like import { getItem } from '@browser/localStorage'. Having that bundled doesn't make sense, at best it will be a local storage mock, but never the "real thing". There are proposals to expose new browser APIs just like this, as system modules. I think I saw it in some datetime or i18n proposal.

@wordpress/components specifically is even more confusing because parts of it are like lodash, and parts of it are like @wordpress/hooks. WooCommerce has been bundling it, probably to not depend too much on the fast-moving Core and Gutenberg, and it mostly works. But sometimes it backfires.

It's all very similar to Windows DLL and their static or dynamic linking. And the "DLL hell" phenomenon associated with that 🙂

But I read it as, "It will create a file that will list the WP script dependencies", whatever a "script dependency" is. I don't know why I'd need it, but it still looks hazardous. As for the "calculated version", I totally don't get what it is and why it would even matter to me.

The *.asset.php file is a manifest that is used by the WordPress wp_register_script/wp_enqueue_script machinery to register the script correctly. A script is enqueued like this:

wp_enqueue_script( 'my-script', 'dist/app.js', array( 'wp-data' ), 'e9f0118ee9' );

And the last two arguments come directly from the .asset.php file. The page will first load the script registered as wp-data, as it is a dependency, and then will load the script itself from the dist/app.js?ver=e9f0118ee9 URL. The ver string acts as a cache buster.

In other words, it's an internal file that's used by the WordPress script loading system to do all the plumbing correctly. Similarly, Windows DLLs can have companion XML files called "assembly manifests" that tell Windows a lot of details about how to work with that DLL.
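For illustration, a generated *.asset.php is just a PHP file returning that manifest; one might look roughly like this (the values mirror the enqueue example above, the file name is hypothetical):

```php
<?php
// app.asset.php (generated by DEWP for the dist/app.js bundle).
// Consumed by wp_register_script()/wp_enqueue_script():
// 'dependencies' lists the script handles to load first,
// 'version' is the content-based cache-busting string.
return array(
	'dependencies' => array( 'wp-data' ),
	'version'      => 'e9f0118ee9',
);
```

Plugin PHP code typically requires this file and passes the two values straight into the enqueue call, so the dependency list never has to be maintained by hand.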

Again, this confirms that the DEWP README jumps too fast into explaining the internals.

It does not work on my plugins dependencies, but on internal handles that happen to share the same name occasionally.

I still don't understand the @woocommerce/settings issue very well. If you externalize this package, it will compile usages as wc.wcSettings references and will put a wc-settings item into the dependencies array in .asset.php, right? Then that script needs to be registered by some other plugin -- is that WooCommerce itself or is it some more specialized plugin like "WooCommerce Product Blocks"?

For the externalized reference to work, some plugin must call:

wp_register_script( 'wc-settings', 'dist/build.js' );

and that dist/build.js script must create a global:

window.wc.wcSettings = { ... };

Is the issue that some part of that is not happening or that some part of the process is confusing?

tomalec added a commit to tomalec/gutenberg that referenced this issue Oct 27, 2021
…ion of WordPress.

This is to mitigate the problems related to DEWP usage mentioned in WordPress#35630.

This allows a plugin developer to install to-be-extracted dependencies at the lowest version their plugin targets, to be able to run local tests and linters against that version, and to give the developer some insight into what the lowest anticipated version of an individual package is.
@gziolo
Copy link
Member

gziolo commented Nov 22, 2021

We landed the PR #35106 from @tomalec that adds an optional feature to @wordpress/dependency-extraction-webpack-plugin that reports all used script dependencies. It should help as a temporary improvement for navigating the complexity of core dependencies.

We now also have an issue #36716 for tracking all the work related to adding support for JavaScript Modules and Import Maps in the Gutenberg plugin (and later WordPress Core). It's one of the primary tasks of the newly formed WordPress Performance JavaScript focus group. You can check other initiatives discussed in https://docs.google.com/document/d/1GD0X3bNUa73Afsi8OZjrSDb0VfEgDg-PLlBvw5aL-sc/edit#.

It seems like import maps might help improve control over the dependencies shipped with WordPress Core, but as @jsnajdr pointed out, a good versioning strategy might be hard or impossible in some cases. Let's see how it evolves.

For the issue of the lack of matching npm releases for every possible Gutenberg plugin release, I'm going to propose a revised strategy for npm publishing that takes that into account. It's a more complex task, because we will need to automate npm publishing on CI and link it with the existing Gutenberg plugin releases that are handled by GitHub Actions.

@tomalec
Copy link
Contributor Author

tomalec commented Dec 13, 2021

Developers, I think, approach @wordpress/ dependencies first from the npm side of things. This makes sense, because DEWP makes you think about it from an npm-first point of view by design (especially devs who are starting to work on a pre-existing project). But this means that developers do not learn/understand that DEWP and npm are simply an enhancement on top of WordPress script enqueues. Ideally, a dev would learn about it from the WordPress point of view first, and then understand that DEWP enhances local development.

I think it's a pretty accurate description of the source/path that led (at least me) to those problems.
But IMO the fact "that developers do not learn/understand […] DEWP […]" is not because devs are doing something wrong, but because the platform is not doing something well enough, or is not described and documented well enough.

That's why the title of my OP is about impaired DevX. To this day, I haven't found any documentation page that states

  • how a hard-requirement DEWP is
  • what it is, what it does, why, and how

For example

Ideally, a dev would learn about it from the WordPress point of view first, and then understand that DEWP enhances local development.

The DEWP package docs should state and emphasize that. Also

But the default way to import and specify @wordpress/ packages means that devs come away with the incorrect concept that it works with npm primarily.

I think that confusion also originates in the fact that DEWP still uses npm-looking package names instead of script handles, and leaves no trace in the plugin code that would suggest it's not a regular npm package.

As a result, I think the way we specify which packages are provided is not ideal. We want developers to clearly understand they need to support a version range. (And even more specifically, a range of WordPress or Gutenberg plugin versions, not a range of npm package versions.) But the way DEWP interacts with npm/package.json makes that tricky/impossible.

☝️ that, IMHO, is purely a DEWP DevX problem, not related to the benefits it aims to deliver or the problem it is meant to solve.

Obviously, the inherent design of DEWP makes it hard to solve those problems. But it's still worth thinking about these shortcomings, even if those shortcomings are inherent to the benefits of DEWP.

I really don't agree that those shortcomings are inherent to the benefits of DEWP. Maybe I'm missing something. Could we please, for the benefit of this discussion, specify a clear, explicit list of the benefits & goals of DEWP?

So we would all know which are the problems to be addressed, and which are just obstacles that could be removed or addressed otherwise.

Having this list would itself address one DevX problem that, AFAIK, others (@puntope) faced too.

@tomalec
Copy link
Contributor Author

tomalec commented Dec 14, 2021

lodash […] And many @wordpress/components can be just like this, self-contained functions that can be duplicated without any danger.

I think one DevX improvement we could make is to explicitly state which dependencies absolutely MUST be externalized to avoid conflicts, and which are externalized simply to save traffic.

Currently, as plugin developers, we need to maintain local copies of a few @…/components or other package exports for months, because we need bug fixes sooner than the WP version catches up. And once, say, version 8.0.0 shipped with WP 5.5 is finally released, version 8.1.0 shipped with WP 5.8 has another bug. So we keep on maintaining local clones, which is a significant development cost. It's also bug-prone, affects UX, etc.

If we knew that we MUST NOT bundle @wordpress/data, but MAY (or may not) bundle @wordpress/html-entities or @woocommerce/number, it would save us a lot of maintenance effort, bugs, and time spent investigating whether we should or not.
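On the "MAY bundle" side, DEWP does expose an escape hatch. A sketch (assuming `@woocommerce/number` really is a stateless, safe-to-duplicate package — that's exactly the information this issue asks to have documented) of overriding extraction for a single package in `webpack.config.js`; if I read the plugin's options correctly, returning `false` from `requestToExternal` forces webpack to bundle the request, while `undefined` falls back to the defaults:

```javascript
// webpack.config.js — opting one package out of extraction (sketch).
const DependencyExtractionWebpackPlugin = require( '@wordpress/dependency-extraction-webpack-plugin' );

module.exports = {
	// …the rest of your webpack config…
	plugins: [
		new DependencyExtractionWebpackPlugin( {
			requestToExternal( request ) {
				// Bundle our local copy instead of using the site-provided one.
				if ( request === '@woocommerce/number' ) {
					return false;
				}
				// undefined → default behavior for everything else.
			},
		} ),
	],
};
```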

@wordpress/components specifically is even more confusing because parts of it are like lodash, and parts of it are like @wordpress/hooks

So maybe we could separate those. As a newcoming developer, I'd expect @wordpress/components to be a library of components to pick and choose from, not a state-keeping global/singleton.

The *.asset.php file is a manifest that is used by the WordPress wp_register_script/wp_enqueue_script machinery […]
In other words, it's an internal file that's used by the WordPress script loading system to do all the plumbing correctly.

Thanks for that explanation :)
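To illustrate the quoted explanation: the generated manifest is a plain PHP file returning the extracted script handles plus a content hash. A sketch of what a `build/index.asset.php` typically contains, and how a plugin feeds it to `wp_enqueue_script()` (the handle names and the `my-plugin-script` handle here are illustrative, not taken from any real plugin):

```php
<?php
// build/index.asset.php (generated by DEWP) — illustrative contents:
//
// return array(
//     'dependencies' => array( 'wp-data', 'wp-element', 'wp-i18n' ),
//     'version'      => 'a1b2c3d4e5f67890abcd', // content hash, for cache busting
// );

// In the plugin's PHP, the manifest wires up deps and version:
$asset = require plugin_dir_path( __FILE__ ) . 'build/index.asset.php';
wp_enqueue_script(
	'my-plugin-script', // hypothetical handle
	plugins_url( 'build/index.js', __FILE__ ),
	$asset['dependencies'],
	$asset['version']
);
```

WordPress then guarantees that the scripts behind `wp-data` etc. are loaded before `build/index.js`, which is what makes the externalized imports resolve at runtime.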

@tomalec
Contributor Author

tomalec commented Dec 14, 2021

I still don't understand the @woocommerce/settings issue very well.

In the Google Listings and Ads plugin, we started using import something from '@woocommerce/settings'; statements, either:

  • because we had to clone one of the @woocommerce/components, and therefore needed its dependencies, or
  • because we make our own components that we try to keep aligned with the way WooCommerce components are made.

Naturally, our ESLint instantly warned us to add @woocommerce/settings, so we did a simple npm i --save @woocommerce/settings.
Given there is an npm package of that name, we were sure that was correct.

But it turned out to be a source of many problems, as the @woocommerce/settings loaded by DEWP is a completely different thing. Then finding that thing also took us some time. I first thought it might be the unpublished @woocommerce/wc-admin-settings package maintained internally in WC-admin, but again, that was not the case. It turned out to be wp-content/plugins/woocommerce/packages/woocommerce-blocks/build/wc-settings.js.

The above cost many developers a lot of time, bugs, and confusion.

I believe such problems could be avoided with better care when it comes to package specifiers, documentation of which packages are actually externalized, and some tooling to help DEWP users (plugin developers) introspect what's happening.

@gziolo
Member

gziolo commented Jan 10, 2022

For the issue with the lack of matching npm releases for every possible Gutenberg plugin release, I'm going to propose a revised strategy for npm publishing that will take that into account. It's a more complex task because we will need to automate npm publishing on CI and link it with the existing Gutenberg plugin releases that are handled by GitHub actions.

I started a discussion related to that in #37820. Let's tackle this one separately from this issue.
