Many modern JavaScript development tools are distributed through NPM and tell you to install them globally. For some packages it is okay to install them globally, but others create a future headache for you. I will explain the problem with global NPM packages, how to tell when global installation is okay, and how to use locally installed packages instead.
Update: NPM 5.2.0 introduced `npx`, which is a great solution to this problem.
Background
When you run `npm install --save <pkg>` or `npm install --save-dev <pkg>`, the package is installed in the `node_modules` folder inside your project, and your `package.json` is modified to list the package as a dependency. This gives you two benefits:
- Isolation: Each project on your computer can have a different version of the same package
- Reproducibility: The next person to download and set up the project will get a compatible version of the package, so the project should work the same way.
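For example, after `npm install --save-dev eslint`, your `package.json` records the dependency with an entry like the following (the version number here is illustrative):

```json
{
  "devDependencies": {
    "eslint": "^4.19.1"
  }
}
```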
This works really well for libraries, but there's a problem when working with a command line tool installed through NPM. When you run a command line tool like `eslint`, your shell looks in all the folders listed in `$PATH` to find it. The wrinkle is that NPM installs the executable in your project's `node_modules/.bin/` folder, which is almost never in your `$PATH`, so you can't just type `eslint` and have it work. This is why I believe many projects advocate installing their packages globally (AKA `npm i -g`).
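To make the gap concrete, here is a minimal sketch with a stub script standing in for the real `eslint` binary; typing the bare command fails, but spelling out the `node_modules/.bin/` path works:

```shell
# Simulate a project that has a CLI tool installed locally.
# (A stub script stands in for the real eslint binary.)
proj=$(mktemp -d)
mkdir -p "$proj/node_modules/.bin"
printf '#!/bin/sh\necho "eslint ran"\n' > "$proj/node_modules/.bin/eslint"
chmod +x "$proj/node_modules/.bin/eslint"

cd "$proj"
# Typing plain "eslint" would fail: node_modules/.bin is not in $PATH.
# Spelling out the path works:
out=$(./node_modules/.bin/eslint)
echo "$out"   # eslint ran
```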
The Problem
When you install a package globally, NPM puts the executable in the same folder
as npm
. This folder is probably in your $PATH
, which means you can type
eslint
into your shell and it just works. Another difference is that the
package is not added to your package.json
file, so there's no record of your
project depending on it. It's great to have a working program, but it throws
away the isolation and reproducibility that local package installation gives you:
- All projects on your computer now have to use the same version of the package. If a new version comes out that breaks compatibility, will you have time to upgrade all your projects at once?
- The next person to download the project will have to guess at the version of the package that works. If you try to set up the project in a year, and many versions of the package have been released, how long will it take to find a working version? How will you even know the package is needed, since it's not listed as a dependency?
Solutions
So how do you maintain isolation and reproducibility, while still having easy access to your command line tools? I achieve this with two different solutions:
npm run-script
NPM has a built-in command called `npm run-script` (shortened to `npm run`) for executing shell commands for you. This is a great way to document common commands and alias complex commands to something shorter. The other, lesser-known benefit is that `npm run` automatically adds your project's `node_modules/.bin/` folder to the `$PATH` before it runs the command.
If you edit your `package.json` to include a `scripts` section like:

```json
{
  ...
  "scripts": {
    "eslint": "eslint lib"
  }
  ...
}
```

Then you can run `npm run eslint`, and it works!
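The `$PATH` manipulation that `npm run` performs can be sketched in plain shell (again with a stub standing in for the real `eslint`):

```shell
# Roughly what "npm run" does: prepend the project's
# node_modules/.bin folder to PATH before running the script.
proj=$(mktemp -d)
mkdir -p "$proj/node_modules/.bin"
printf '#!/bin/sh\necho "linting lib"\n' > "$proj/node_modules/.bin/eslint"
chmod +x "$proj/node_modules/.bin/eslint"

cd "$proj"
PATH="$proj/node_modules/.bin:$PATH"
out=$(eslint)   # the bare name now resolves to the local copy
echo "$out"     # linting lib
```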
npm-exec
The `npm run-script` solution doesn't always work. For example, you might have a plugin in your text editor that runs `eslint` and expects to find it in `$PATH`. I wrote a bash script called `npm-exec` to automate running local NPM executables as if they were installed globally.
```bash
#!/bin/bash
# Finds and executes a node program installed locally.
# Makes it so you don't have to install it globally and risk having the wrong
# version for your project. This is not meant to be run directly, but through
# a symlink named after the program you want to run, e.g. "eslint".
PROGRAM=$(basename "$0")
DIR=$(pwd)
while [ ! -x "$DIR/node_modules/.bin/$PROGRAM" ]; do
  DIR=$(dirname "$DIR")
  if [ "$DIR" = "/" ]; then
    echo "Local node program not found: $PROGRAM" 1>&2
    exit 1
  fi
done
exec "$DIR/node_modules/.bin/$PROGRAM" "$@"
```
To install this:
- Download the script, and make it executable with `chmod a+x npm-exec`
- Create a symlink to it in a folder in your `$PATH`, but name it after the NPM package's executable you want to work globally: `ln -s /path/to/npm-exec /path/to/bin/eslint`
This script looks in your current working directory for a `node_modules/.bin/` folder containing an executable with its own name, then runs it. If it doesn't find one, it tries the folder above the current folder, then the folder above that, and so on.
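You can check the upward search yourself with a small harness: it writes the script above into a temp directory, symlinks it as `eslint`, plants a stub executable in the project's `node_modules/.bin/`, and then invokes the symlink from a nested subdirectory (the stub and directory names are illustrative):

```shell
# Build a fake project with a locally "installed" eslint stub.
proj=$(mktemp -d)
mkdir -p "$proj/node_modules/.bin" "$proj/src/deep" "$proj/bin"
printf '#!/bin/sh\necho "found local eslint"\n' > "$proj/node_modules/.bin/eslint"
chmod +x "$proj/node_modules/.bin/eslint"

# Write npm-exec and symlink it under the name of the tool we want.
cat > "$proj/bin/npm-exec" <<'EOF'
#!/bin/bash
PROGRAM=$(basename "$0")
DIR=$(pwd)
while [ ! -x "$DIR/node_modules/.bin/$PROGRAM" ]; do
  DIR=$(dirname "$DIR")
  if [ "$DIR" = "/" ]; then
    echo "Local node program not found: $PROGRAM" 1>&2
    exit 1
  fi
done
exec "$DIR/node_modules/.bin/$PROGRAM" "$@"
EOF
chmod +x "$proj/bin/npm-exec"
ln -s "$proj/bin/npm-exec" "$proj/bin/eslint"

# Run from a nested folder: the script walks up until it finds the stub.
cd "$proj/src/deep"
out=$("$proj/bin/eslint")
echo "$out"   # found local eslint
```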
Why Do Some Projects Recommend Global Installation?
It's certainly quick and easy to start using a new tool that's globally installed, which is why a lot of "getting started" guides recommend it. Also, the problems it causes might take weeks or months to surface.
I'd like to see the community switch documentation to specify local installation of packages whenever practical.
Packages that Advocate Global Installation
These packages tell you to install them globally. Since these packages are tightly coupled to your project, I would say it's a bad idea to install them globally, and you should install them locally instead.
Packages that Advocate Local Installation
These packages have documentation showing local installation and how to work around the `$PATH` problem. They get a gold star!
Packages with a Hybrid Approach
These packages take a hybrid approach: the global module is mostly a thin stub that fixes the `$PATH` problem, while the rest of the code is installed in a local module. This is a pretty good workaround.
Tools that Should Be Global
These packages ask you to install them globally, and that's okay! The reason it's okay for these packages is that they aren't tightly coupled to your project. Most of them are scaffolding tools, which create a new project and don't mess with it afterwards. Another category where global installation seems okay is utility packages like `nodemon`, which don't know anything about your project.
Summary
Installing modules globally is a headache that can be avoided with `npm run-script`. We need to improve documentation to show local installation whenever practical.