It’s bad because this leads to a standard React project having over 2,000 dependencies.
The real question is: do you -really- need an external lib with 20 dependencies just to show a freakin’ loading spinner? Reinventing the wheel is bad, but so is never making truly simple things yourself, or just doing without them.
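For what it’s worth, a loading spinner is exactly the kind of thing you can hand-roll in a dozen lines. A minimal sketch, assuming React is already a dependency of the project (the component name and frame interval are mine):

    import { useEffect, useState } from "react";

    const FRAMES = ["|", "/", "-", "\\"];

    export function Spinner({ label = "Loading" }: { label?: string }) {
      const [frame, setFrame] = useState(0);

      useEffect(() => {
        // Advance one frame every 100 ms; clear the timer on unmount.
        const id = setInterval(() => setFrame(f => (f + 1) % FRAMES.length), 100);
        return () => clearInterval(id);
      }, []);

      // role="status" lets assistive tech announce the loading state.
      return <span role="status">{FRAMES[frame]} {label}…</span>;
    }

Zero extra packages, and nothing new in the dependency tree to audit.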
What happens when a common package breaks? What happens if it gets hijacked and becomes a security vector that’s impossible to spot because it’s loaded as the 567th package in a dependency tree?
The answer here is to have a strong stdlib, where you don’t need to pull in 3rd-party packages all the time for trivial things, and to stop including a million small packages in every single project.
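To make that concrete (examples of mine, not exhaustive): plenty of once-popular micro-packages now have direct equivalents in the language or runtime itself:

    // What left-pad did:
    "5".padStart(3, "0");              // "005" (ES2017)

    // What flatten / arr-flatten did:
    [1, [2, [3]]].flat(Infinity);      // [1, 2, 3] (ES2019)

    // What clone-deep / lodash.clonedeep did:
    structuredClone({ a: { b: 1 } });  // deep copy (Node 17+, modern browsers)

    // What request / node-fetch did:
    fetch("https://example.com/");     // built into Node 18+ and browsers

The stronger the stdlib gets, the fewer of these trivial packages each project has to drag along.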
> The real question is: do you -really- need an external lib with 20 dependencies just to show a freakin’ loading spinner? Reinventing the wheel is bad, but so is never making truly simple things yourself, or just doing without them.
So the problem is the sheer number of dependencies? What is a reasonable upper limit?
Yes, JavaScript should continue to standardize commonly used features, but avoiding dependencies doesn't seem like a solution.
If anything, more dependencies are a good sign, because they imply that other people have spent more time and effort on a solution than you ever could on something hand-rolled for a single use.
It sounds like the root issue here is just dependency management. If our package managers were solving this issue well enough, there would be no practical difference between 2 big dependencies with significant functionality (and more code to review) and 20 tiny, easy-to-review dependencies.
From a security perspective, minimising dependencies is preferred. I have to review all our dependencies at least monthly for published vulnerabilities and new versions.
We don’t allow automatic upgrading of packages/dependencies due to the risk of malicious code making it in (see https://www.npmjs.com/advisories for examples). Yeah, there are companies that will help manage your vulnerability process, but it’s still a lot of overhead, and it only grows as the number of dependencies grows.
There’s also the whole left-pad mess from a few years back, which shows you always need locally archived copies of any dependencies you use.
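In practice, that kind of "no automatic upgrades" policy mostly comes down to pinning exact versions and installing from the lockfile. A rough sketch of the npm side (illustrative, not necessarily our exact config):

    # .npmrc — record exact versions instead of ^ ranges when adding packages
    save-exact=true

    # in CI: install exactly what package-lock.json specifies, and fail if it drifts
    npm ci

The lockfile, plus an archived copy or registry mirror of the tarballs it points at, is also what saves you when a left-pad-style removal happens.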
If you care about security, you should evaluate your dependencies.
This preferably means reading the code you're pulling in, but it's unrealistic that we're going to read 2,000 constantly changing dependencies for every deploy, so you need to establish trust somehow.
Maintainer reputation, dependency CVE scanning, SAST and protective monitoring can all add assurance, but they won't protect you from a random hijacked npm module, and the more packages you include, from more people, the more likely you are to be hit by a zero-day.
Having an entire community depend on tiny libraries that do nearly nothing exacerbates this problem, and if you use something that almost nobody else does, you're unlikely to be saved by npm audit.
I don't use Node as a daily driver, but I assume npm audit growing to include reputation, and frameworks owning more of the dependency tree, will help; the users of the system also need to be mindful of the risks they take and the trust they place.
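(For example, a common baseline is to fail CI on published advisories:

    # fail the build on known high-severity advisories in the dependency tree
    npm audit --audit-level=high

though, as noted, that only covers what is already publicly known.)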
I used to keep all npm modules in source control (SCM), until npm introduced tree shaking. I'm using ZFS, which has both de-duplication and compression, so I gain very little from tree shaking. I wish there were a way to disable tree shaking in npm; it's really the source of all evil.
Anyway, I reviewed all code diffs after each npm update, and very little changed, i.e. it wasn't that much work. But that's now impossible, as npm moves files around.
I also deleted a lot of unnecessary files. About 90% of the content of each npm package is not needed.
Another reason why I stopped hosting dependencies in SCM is native modules. I wish Node.js itself could become a platform layer so that I wouldn't have to use these native modules.
Another thing is compile-to-JS, where a tiny change in the source might cause a huge diff in the generated JS.
I committed my dependencies until recently too, and I've been trying to figure out a better alternative. Are you doing something else now rather than committing dependencies? (Right now I'm just using a hodgepodge of npm scripts to lock things down.)
There is an enormous difference between 2 large packages and 20 small ones. With two, I can see who the main authors are, and I don't have to worry about 20 different packages being compromised.
Your statement makes no sense. A higher package count does not, in any way, shape, or form, correlate with better quality. The sole thing it shows is that the author pulled in more packages. That’s it. Whether it’s good or bad, secure or not, is only determined by analysis. The author could easily have been super lazy and pulled in 18 small packages instead of writing small helper functions manually, etc.
There is no defensible argument for more packages and bloat. None. As I said above, the core lib should focus on providing functionality, so that pulling in a fancy spinner library means you pull in ONE library, not 20.