Here's my `npm` command these days.
It reduces the attack surface drastically.
alias npm='docker run --rm -it -v ${PWD}:${PWD} --net=host --workdir=${PWD} node:25-bookworm-slim npm'
- No access to my env vars
- No access to anything outside my current directory (usually a JS project).
- No access to my .bashrc or other files.
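For what it's worth, the same idea can be pushed a bit further with standard `docker run` hardening flags. A sketch (the DRY_RUN switch is only there so the assembled command can be inspected without Docker; quoting of paths with spaces is glossed over):

```shell
# Sandboxed npm wrapper: same idea as the alias, plus a few hardening flags.
npm_sandboxed() {
  cmd="docker run --rm -it -v $PWD:$PWD -w $PWD"
  cmd="$cmd --cap-drop=ALL"                    # drop all Linux capabilities
  cmd="$cmd --security-opt no-new-privileges"  # block setuid privilege escalation
  cmd="$cmd --network bridge"                  # NAT'd network instead of --net=host
  cmd="$cmd node:25-bookworm-slim npm $*"
  if [ "${DRY_RUN:-0}" = 1 ]; then
    echo "$cmd"    # print the command instead of running it
  else
    eval "$cmd"
  fi
}
```

Swapping `--net=host` for the default bridge network also stops installed code from reaching services listening only on the host's loopback interface.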
This folk-wisdom scapegoating of post-install scripts needs to stop, or people are going to get really hurt by the false sense of security it's creating. I can see the reasoning behind it, I really do; it sounds convincing, but it's only half the story.
If you want to protect your machine from malicious dependencies you must run everything in a sandbox all the time, not just during the installation phase. If you follow that advice then disabling post-install scripts is pointless.
The supply chain world is getting more dangerous by the minute and it feels like I'm watching a train derail in slow motion with more and more people buying into the idea that they're safe if they just disable post-install scripts. It's all going to blow up in our collective faces sooner or later.
There are so many vectors for this attack to piggyback on.
If I had malicious intentions, I would probably typosquat popular plugins/LSPs that execute code automatically when the editor runs. A compromised Neovim or VS Code gives you plenty of user permissions, a full scripting language, the ability to make HTTP calls, system calls, etc. Most LSPs are installed globally; it doesn't matter if you downloaded them via a Docker command.
The above simple alias may work for node/npm, but it doesn't generalize to many other programs available on the local system, with resources that would need to be mounted into the container ...
Not sure how secure this really is, because it's fairly easy to break out of a Docker container with the default settings (the kernel is shared between the containers and the host, unlike with VMs). Rootless Docker (or better, Podman) would improve security greatly.
I always wondered why people found it acceptable to just run npm on their systems for anything they do, and have it download anything on any build.
Coming from "make" with repeatable and predictable builds, I was appalled that you run this thing and you have no idea what it will download and what it will produce. Could be something different the next time you run it! Who knows!
I also found it bizarre that even for things that just generate CSS, you are expected to integrate it into your npm thing. I mean, generating CSS from a configuration is a function. It's a make recipe. Why does it need to be an npm library dependency with all the associated mess? I managed to work around it for a number of packages by freezing entire npm setups inside docker containers, which gave me reproducible builds — but it's a losing battle.
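Concretely, the frozen-setup trick is roughly the following (the image tag and the "build:css" script name are illustrative, not from any real project; DRY_RUN only exists so the command can be printed without Docker installed):

```shell
# "Freeze the npm setup" as a make-style recipe: a fixed image tag plus
# `npm ci`, which installs exactly what the lockfile says and fails otherwise.
build_css() {
  cmd="docker run --rm -v $PWD:$PWD -w $PWD node:25-bookworm-slim"
  cmd="$cmd sh -c 'npm ci --ignore-scripts && npm run build:css'"
  if [ "${DRY_RUN:-0}" = 1 ]; then
    echo "$cmd"    # print the command instead of running it
  else
    eval "$cmd"
  fi
}
```

For genuine reproducibility you'd want to pin the image by digest rather than by tag, since tags like this one are re-pointed to new builds over time.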
Downloading things that you don't know about is common to every modern package manager out there, from Maven to NuGet to pip and npm. This is the new norm, and there are reasons for it.
I don't think the old C/C++ way of relying on distro package managers would allow for the fast-moving ecosystems people work in nowadays.
Things are changing though, and people are pushing for more secure package managers with the same feature set as the old ones, which is possible.
I think this is the missing piece in the "I always wondered" part of your comment. But I don't think we should be bashing the means without understanding the reasons.
The entire front-end dev world seems like a "trust me bro" Wild West to me. It all feels like endless layers of duct tape. Which I guess in a way it is, given the evolution of browsers.
'Npm install thing' is easier than programming or even auditing imported code.
Too lazy to write, too cheap to pay for it, that's half of open source.
Now what's going to dominate Stack Overflow answers: thoughtful articles on how to program? Or gratis npm/uv-installable libraries that hide the actual solution details in between tests and polyfills and readmes, that end up just downloading another dependency while contributing enough lines of code that the author can safely put it on their resume and grift over the OSS ecosystem into a cushy 300k/yr job at an extended FAANG, where they're in charge of an ad pixel tracker or yet another one of those wallet things that are just an IO proxy for money, but they get to keep some commission if they spend enough money on ads to convince someone to use the thing once and it becomes their main neo(not) bank* forever.
*for regulatory reasons we must not call it a bank, but it's a bank
>When you run npm install, npm doesn't just download packages. It executes code. Specifically, it runs lifecycle scripts defined in package.json - preinstall, install, and postinstall hooks.
What's the legitimate use case for a package install being allowed to run arbitrary commands on your computer?
> What's the legitimate use case for a package install being allowed to run arbitrary commands on your computer?
The paradigm itself has been in package managers since DEB and RPM were invented (maybe even Solaris packages before that? It's been a minute since I've Sun'd). It's not the same as npm, though; a more direct comparison is the Arch Linux AUR, and the AUR has been attacked with attempted malware injections all year (2025), just like npm. As of the 26th (5 days ago), uploads are disabled as they get DDoSed again. https://status.archlinux.org/ (spirit: it's the design being attacked, a pain shared between the AUR and npm as completely disparate projects).
A more direct example: the Linux kernel package. Every Linux distribution I'm aware of runs a kernel post-install script which runs arbitrary (sic) commands on your system. This is, of course, to rebuild any DKMS modules, build a new initrd/initramfs, update GRUB bootloader entries, etc. These actions produce outputs unique to the destination system and can't be packaged ahead of time.
I have no data in front of me, but I'd speculate 70% (?) of DEB/RPM/PKG packages include pre / post install/uninstall scriptlets, it's very common in the space and you see it everywhere in use.
There's nothing inherently wrong with that. The problem is npm allows any random person to upload packages. It's completely untrusted. Contrast that with Linux distributions which have actual maintainers who take responsibility for their packages. They don't generally allow malware to make it into the official software repositories. In many cases they went as far as meeting each other in person just to set up a decentralized root of trust with PGP. It's so much better and more trustworthy.
The truth is npm, pip, rubygems, cargo and all the other programming language package managers are just fancier versions of the silly installation instructions you often find in README files that tell people to curl some script and pipe it into bash.
Easy example that I know of: the Mediasoup project is a library written in C++ for streaming video over the internet. It is published as a Node package and offers a JS API. Upon installing, it would just download the appropriate C++ sources and compile them on the spot. The project maintainers wanted to write code, not manage precompiled builds, so that was the most logical way of installing it. Note that a while ago they ended up adding downloadable builds for the most common platforms, but for anything else the expectation still was (and is, I guess) to build sources at install time.
Many front-end tools are written in a faster language: for example, the next version of the TypeScript compiler, Sass, SWC (minifier), esbuild (the bundler used by Vite), Biome (formatter and linter), Oxc (linter, formatter and minifier), Turbopack (bundler), dprint (formatter), etc.
They use a postinstall script to fetch pre-built binaries, or compile from source if your environment isn't directly supported.
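The pattern those postinstall scripts follow is roughly this (a hedged sketch; the paths, the tool name, and the platform string format are made up for illustration):

```shell
# Typical shape of a "fetch prebuilt binary, else build from source" postinstall.
platform_id() {
  # e.g. "linux-x86_64" or "darwin-arm64"
  printf '%s-%s' "$(uname -s | tr '[:upper:]' '[:lower:]')" "$(uname -m)"
}

install_native_tool() {
  plat=$(platform_id)
  if [ -e "prebuilt/$plat/tool" ]; then
    echo "using prebuilt binary for $plat"
    # a real script would download or copy the binary into node_modules here
  else
    echo "no prebuilt binary for $plat; building from source"
    # a real script would invoke cargo/cc/node-gyp or similar here
  fi
}
```

Either branch needs to run code at install time, which is exactly why these projects depend on lifecycle hooks.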
I feel super uneasy developing Software with Angular, Vue or any framework using npm. The amount of dependencies these frameworks take is absolutely staggering. And just by looking at the dependency tree and thousands of packages in my node_modules folder, it is a disaster waiting to happen. You are basically one phishing attack on a poor open source developer away from getting compromised.
To me the entire JavaScript ecosystem is broken. A typo in your "npm i" is sufficient to open yourself up to a supply-chain attack. Could the same happen with NuGet or Maven? Sure!
But at least in these languages and environments I have a huge Standard Library and very few dependencies to take. It makes me feel much more in control.
Go kind of solves this by using repo links instead of package names. This forces you to go through the repo and copy-paste the URL (instead of manually typing it out), but it's not bulletproof, I guess.
Given the recent npm attacks, is it even safe to develop using npm? Whenever I start a React project, it downloads hundreds of additional packages which I have no idea about. As a developer who has learnt programming as a hobby, is it better to stick to some other, safer ways to develop front end, like Thymeleaf or plain JS or something else?
When I build a backend in Flask or Django, I specifically type out the Python packages that I need. But front-end development seems like a Pandora's box of vulnerabilities.
Keep in mind that the vast majority of the 86,000 downloads are probably automated downloads by tools looking for malicious code, or other malicious tools pulling every new package version looking for leaked credentials.
When I iterate with new versions of a package that I’ve never promoted anywhere, each version gets hundreds of downloads in the first day or two of being published.
86,000 people did not get pwnd, possibly even zero.
As a hobbyist how do I stay protected and in the loop for breaches like this? I often follow guides that are popular and written by well-respected authors and I might be too flippant with installing dependencies trying to solve a pain point that has derailed my original project.
Somewhat related: I also have a small homelab running local services, and every now and then I try a new technology. Occasionally I'll build a little thing that is neat and could be useful to someone else, but then I worry that I'm just a target for some bot to infiltrate because I'm not sophisticated enough to stop it. Where do I start?
Because these are fetching dependencies in the lifecycle hooks, even if they are legitimate at the moment there is no guarantee that it will stay that way. The owner of those dependencies could get compromised, or themselves be malicious, or be the package owner waiting to flip the switch to make existing versions become malicious. It's hard to see how the lifecycle hooks on install can stay in their current form.
> Many of the dependencies used names that are known to be “hallucinated” by AI chatbots. Developers frequently query these bots for the names of dependencies they need. LLM developers and researchers have yet to understand the precise cause of hallucinations or how to build models that don’t make mistakes. After discovering hallucinated dependency names, PhantomRaven uses them in the malicious packages downloaded from their site.
I found it very interesting that they used common AI hallucinated package names.
Happy I keep a mirror of my deps that I have to "manually" update. But also, the download numbers are not really accurate for actual install counts; each test run could increment them, for example.
The npm ecosystem's approach to supply chain security is criminally negligent. For the critical infrastructure that underpins the largest attack surface on the Internet you would think that this stuff would be priority zero. But nope, it's failing in ways that are predictable and were indeed predicted years ago. I'm not closely involved enough with the npm community to suggest what the next steps should be but something has to change, and soon.
I wonder what one could do if they want to use npm with a very popular framework (like Angular or Vue) and stay safe. Is just picking a not-very-recent version of the top-level framework (Angular, etc.) enough? Is it possible to somehow isolate npm so the code it runs, like those postinstall hooks, doesn't mess with your system, while still letting you use it normally?
One option to make it a little safer is to add ignore-scripts=true to a .npmrc file in your project root. Lifecycle scripts then won't run automatically. It's not as nice as pnpm or Bun, though, since this also prevents your own postinstall scripts from running (not just those of dependencies), and there's no way to whitelist trusted packages.
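For reference, that opt-out is a one-liner in standard npm config syntax:

```ini
# .npmrc at the project root: npm skips all lifecycle scripts on install
ignore-scripts=true
```

The same setting can be passed per-invocation as `npm install --ignore-scripts` if you only want it for specific installs.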
These malicious npm packages often “phone home” during install or runtime, opening C2 channels, exfiltrating env vars, or beaconing quietly in the background. Static checks and SBOMs rarely catch that kind of dynamic behavior. With AI-generated or auto-installing code pipelines, that risk gets amplified since installs happen more often and less predictably. Watching what packages do at runtime feels like the next frontier here.
Cybersecurity professionals are police officers and detectives first, technologists second. They are supposed to catch criminals who abuse technology. I find it distasteful to blame a tool and an entire community for things that criminals are responsible for.
There is nothing about Go, or Cargo, or Nuget, or any of the other systems that prevents criminals from committing crimes. Some cleverness can slow them down, or block certain roads to exploiting these systems, but managing risk in a supply chain is essential no matter which technology ecosystem is used.
Let's stop name-calling and throwing shade at specific tools and communities, and get better at educating newcomers to the trade and at shaming and ostracizing the cyber-criminals who are causing the problems.
Is there a way to detect/filter dependencies that use HTTP URLs as dependency specifiers as part of an NPM install? Since you can send specific requesters different payloads, I can see how this would bypass most of the normal scanning tools.
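As a crude first pass, you can grep the manifest for URL specifiers before installing. This is only a heuristic sketch, not a replacement for real scanning tools, and it won't catch URL specifiers buried in transitive manifests you haven't downloaded yet:

```shell
# Flag dependencies declared as raw http(s) URLs instead of registry semver
# ranges. Works on any package.json-shaped file passed as the first argument.
check_url_deps() {
  grep -nE '"[^"]+"[[:space:]]*:[[:space:]]*"https?://' "${1:-package.json}"
}
```

Running it against a lockfile as well would catch URL specifiers that only appear after resolution.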
phiresky | 4 months ago
Also, I can recommend pnpm: it has stopped executing lifecycle scripts by default, so you can whitelist which ones to run.
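The whitelist pnpm uses lives in package.json (field name as used by recent pnpm versions; esbuild here is just an example of a package you might deliberately allow to run its install script):

```json
{
  "pnpm": {
    "onlyBuiltDependencies": ["esbuild"]
  }
}
```

Everything not on that list gets installed with its lifecycle scripts skipped.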
kernc | 4 months ago
I use sandbox-run: https://github.com/sandbox-utils/sandbox-run
gear54rus | 4 months ago
Did you then spend a month or two inspecting that setup? Did you do that after every dependency upgrade?
Also, both npm and pnpm already freeze dependencies for you by default via lockfiles, without any arcane Docker setups.
alt227 | 4 months ago
How does that work when, inevitably, those npm packages are shown to have vulnerabilities?
crtasm | 4 months ago
Quote is from the researchers' report: https://www.koi.ai/blog/phantomraven-npm-malware-hidden-in-i...
edit: I was thinking of this other case that spawned terminals, but the question stands: https://socket.dev/blog/10-npm-typosquatted-packages-deploy-...
squidsoup | 4 months ago
https://github.com/orgs/pnpm/discussions/8945
ChrisMarshallNY | 4 months ago
I assume that it's heavily sandboxed, though, so it may be difficult to leverage.
[0] https://docs.swift.org/swiftpm/documentation/packagemanagerd...
[1] https://developer.apple.com/documentation/packagedescription
zahlman | 4 months ago
It pains me to remember that the reason LLMs write like this is because many humans did in the training data.
worik | 4 months ago
I have used Node; I would not go near the npm auto-install spyware service.
How is it possible that people keep this service going when it has been compromised so regularly?
How is it possible that people keep using it?