
Rust's package management (cargo) is the best thing I have ever seen of its kind. The most basic thing you can do is: cargo new funkyproject

Which creates a new barebones Rust project called "funkyproject". Every dependency specified in its Cargo.toml will be automatically downloaded at build (if there is a new version).

When a build is successful, the versions of said dependencies will be saved into a Cargo.lock file. This means if it compiles for you, it should compile on every other machine too.

A Cargo.toml also allows you to use (public or private) repositories as a source for a library, specify version ranges to select only, e.g., versions newer than 1.0.3 and older than 1.0.7, etc.
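As a sketch of what such a manifest section can look like (the crate names `funkylib` and `otherlib` and the repository URL are made up for illustration):

```toml
[dependencies]
# caret requirement (the default): any semver-compatible 1.x release
serde = "1.0"
# explicit range: newer than 1.0.3 and older than 1.0.7
funkylib = ">1.0.3, <1.0.7"
# a (public or private) git repository as the source
otherlib = { git = "https://example.com/otherlib.git" }
```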

Because the compiler will show you unused dependencies, you never really end up including anything you don't use. In practice this system not only works incredibly well, but is also very comfortable to use, and it isolates itself from the system it is running on quite well.

I really wish Python also had something like this. Pipenv is sort of going into that direction, but it is nowhere near cargo in functional terms.



> Every dependency specified in its Cargo.toml will be automatically downloaded at build (if there is a new version).

Why do people want this? The builds are no longer reproducible, security and edge-case issues can come out of nowhere, API changes from an irresponsible maintainer can break things, network and resource failures can break the build; it's just a terrible idea.

The proper use of a semver system is entirely optional and unenforceable, and I've seen people bitten countless times by some developer breaking their package, with everyone complaining ... If the tool didn't do the stupid thing of just randomly downloading crap from the internet, none of this would be a problem.

I presume all my dependencies are buggy... I just know that the current ones don't have bugs that I have to deal with now. You swap in new code and who the heck knows; it becomes my job again. It's more work because of a policy that doesn't make sense.

Newer code isn't always better. People try new ideas that have greater potential but for a while the product is worse. That's fine, I do it all the time. But I sure as hell don't want software automatically force updating dependency code to the latest lab experiment.

Cities, power plants, defence systems, satellites, and airplanes run on software from the 80s; they don't break because a new version of some library had bugs and it automatically updated, no. They fucking work.

There's a huge, irreplaceable value in predictability, and this approach ignores all those lessons.


Reproducibility was a core concern for cargo. Your parent is incorrect. A lock file means that your dependencies are never updated unless you explicitly ask for an update.


There is also cargo vendor to download the dependencies locally. I’m using just that at work to ensure builds without network access work.

Rust is no worse here than, say, Haskell with cabal or stack, or Swift with whatever they're using (I forget), or Go for that matter.


You're misreading your parent. The download only happens for the first build using a new dependency. As they mention, once the version is written into the Cargo.lock file, that is the exact version that is used until there is an explicit update step run.


What does "if there is a new version" mean then? If it's a new dependency, there's no old version.


Sorry, English is not my first language. I meant this: when you build initially, the used dependencies get downloaded. New things are only downloaded if you [A] update, [B] add a new dependency, or [C] clean your project and build it again.

If you update, the versions in your Cargo.lock are ignored and updated if the build is successful.

If you add a dependency, only that dependency is downloaded; the rest is kept as you had it.

If you clean, it is as if you cloned the project fresh with git, and you will have to download all dependencies. If there is a lockfile, the exact versions from it will be used.

To me this is extremely flexible and works very well, AND you get precise control over versions if you want it. By the way, it is also possible to clone all dependencies and keep a local copy of them, so you can be really 100% sure that nothing could ever change with them. Although I am quite sure crates.io doesn't allow changes without a version number change, which means you should be safe as long as you rely on the version number.
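The three cases above roughly correspond to these commands (a sketch of the behaviour being described; `cargo vendor` is the mechanism mentioned upthread for keeping local copies):

```shell
cargo update                 # [A] ignore Cargo.lock, re-resolve to the newest allowed versions
cargo build                  # [B] after editing Cargo.toml, fetch only the new dependency
cargo clean && cargo build   # [C] rebuild from scratch; Cargo.lock still pins exact versions
cargo vendor                 # optional: copy all dependencies into ./vendor for offline builds
```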


Yes, I suppose that's rather misleading, and that sentence contradicts the actual behaviour described later in the original comment. For a fixed set of dependencies, versions are only checked and changed on an explicit 'cargo update' run.


It's a good thing Cargo has lockfiles!


npm (and yarn) literally does exactly all of this, via `npm init funkyproject` and `package-lock.json`.


Except that npm will gladly update your lock file when you run npm install, which is insane.


Npm hasn't done this in over a year.


The current version of npm does this and this is "correct behaviour". I got bitten by this a few weeks ago.

For the passers-by, the only way to make npm behave as expected in this specific case is to use "npm ci" instead of "npm install". If you do not, npm will assume you want to update the packages to the latest version at all times, at all costs, even if you have a lock file in place, and even if you have your package file and lock file locked to exact versions (i.e. 2.0.0 exact, not ^2.0.0).

This is a new addition; it was added a couple of months ago. Before that, you had to check your dependencies into your source control. That might still be the best practice, and likely the only trustworthy way to get reproducible builds consistently over a longer time horizon.
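The distinction this comment is drawing, in short (a sketch of the two commands' claimed behaviour, which is disputed further down the thread):

```shell
npm ci        # install exactly what package-lock.json specifies; fails without a lockfile
npm install   # resolve per package.json, and may rewrite package-lock.json
```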


> and even if you have your package file and lock file locked to exact versions. (i.e. 2.0.0 exact, not ^2.0.0)

Wait what? Are you sure about that part? That's a violation of npm's semver constraints https://semver.npmjs.com

(I agree with you that "npm ci" should be the default behavior, and "npm install" should be called something different, like update-and-install)


Yes - to be more specific, you can lock your own package's dependencies to an exact version, but you cannot lock dependencies of your package's dependencies. You can't do anything about them. They will get updated because their package lock specifies the lock in the form of ^2.0.0. The fact that a package lock can resolve to multiple versions is counterintuitive. One would think the whole point of a package lock is to lock packages.
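A sketch of the situation described above (package names are hypothetical, and the comments are for illustration only, since JSON itself does not allow them):

```jsonc
// your package.json: your direct dependency is pinned exactly
{ "dependencies": { "some-lib": "2.0.0" } }

// some-lib's own package.json: a caret range, so this
// transitive dependency can still float to a newer 2.x
{ "dependencies": { "nested-dep": "^2.0.0" } }
```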

As a result, when you do an npm install in your oblivious and happy life, npm naturally assumes you want to summon Cthulhu. If you didn't want to summon Cthulhu, why did you call the command that summons Cthulhu? Yes, the default command summons Cthulhu, because we believe in agile Cthulhu. If you don't want to summon Cthulhu, try this undocumented command with a misleading name that we silently added a few weeks ago for weird people like you who don't want to summon Cthulhu when they do an npm install. But seriously, why do you not want to summon Cthulhu?

Unfortunately, this was the impression I got of the npm folks' position when I read a few threads about this. I've moved to npm ci for now and moved on. Npm's package lock is many things; however, none of the things it is, is a package lock.


Or use Yarn.


…so would Cargo? If you install a new package, why wouldn’t you expect it to show up in your lock file?


No, you guys don't understand: npm updates the package lock even when not adding a new package, i.e. on the initial `npm install`. It's insane; I'm thinking of going back to yarn again.


I'm with you; the default behavior is so counterintuitive.


You can use ‘npm ci’ for actually sensible install behaviour.


Hmm, that's pretty stupid. What is the rationale behind this? That you check before you run an install?


Why is that insane? What else is supposed to happen when you install a package?

EDIT: I misunderstood and thought you were talking about installing a package. If you're running `npm install` to just reinstall dependencies then yes the lockfile should not be modified. However it seems like that is indeed the case and you may be talking about a prior bug with NPM.


`npm install` is what you the developer would run when you first clone a project; it should install exactly what's in the package-lock.json file. Unfortunately, it sometimes doesn't do that.


Well just like many other languages with sane environment (dependencies, building, etc.) management. I think this is the norm nowadays (D, Clojure, and so on).


I think Poetry [1] is the most promising in the python build/dependency space. I've used pipenv and left dissatisfied.

[1] https://github.com/sdispater/poetry


I gave poetry a try and liked it a lot as well, but I really miss the virtualenv handling from pipenv. How do you handle virtualenvs with poetry?


That sounds pretty similar to NPM, as well as NuGet and Paket for .NET. TBH, it's the 'obvious' way for a package manager to work, so I'd be a little surprised if they didn't all work more or less the same?


You have to run npm ci instead of npm install to get npm to respect the lock file. I don’t consider that remotely obvious. And this feature was just added to npm last year, 8 years after npm was invented!


That is incorrect. Both `npm install` and `npm ci` respect the lock file, and if a lock file is present, will make the `node_modules` tree match the lock file exactly.

`npm ci` is optimized for a cold start, like on a CI server, where it's expected that `node_modules` will not be present. So, it doesn't bother looking in `node_modules` to see what's already installed. So, _in that cold start case_, it's faster, but if you have a mostly-full and up-to-date `node_modules` folder, then `npm install` may be faster, because it won't download things unnecessarily.

Another difference is that `npm ci` also won't work _without_ a `package-lock.json` file, which means it doesn't even bother to look at your `package.json` dependencies.


Thanks for the reply Isaac! This doesn’t match my first-hand experience unfortunately. Are there any circumstances under which npm install with a lockfile present deviates from the lockfile where npm ci does not?

For example, why did this person experience the changing lockfile? https://github.com/npm/npm/issues/17101

Or why do these docs say this?

> Whenever you run npm install, npm generates or updates your package lock https://docs.npmjs.com/files/package-locks

Oh, this seems like what I experienced: https://stackoverflow.com/a/45566871/283398

It does appear that npm works somewhat differently than the "obvious" way we would expect package managers to work vis-à-vis lockfiles :(

At least npm ci gets the job done for my use case :)


If you run `npm install` with an argument, then you're saying "get me this thing, and update the lock file", so it'll do that. `npm install` with no argument will only add new things if they're required by package.json, and not already satisfied, or if they don't match the package-lock.json.

In the bug linked, they wanted to install a specific package (not matching what was in the lockfile), without updating the lockfile. That's what `--no-save` will do.
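In other words, the three behaviours being distinguished (sketch; `some-pkg` is a hypothetical package name):

```shell
npm install                     # no argument: make node_modules match package-lock.json
npm install some-pkg            # explicit package: install it and update the lockfile
npm install some-pkg --no-save  # install it without touching package.json or the lockfile
```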

The SO link is from almost 2 years ago, and a whole major version back. So I honestly don't know. Maybe a bug that was fixed? If this is still a problem for you on the latest release, maybe take it up on https://npm.community or a GitHub issue?


> Both `npm install` and `npm ci` respect the lock file

This is not correct. `npm install` will update your dependencies, not install them, disregarding the package versions defined in the lock file.

It feels like you are not getting the point of having a lock file in the first place. It should be obvious that you can't do an install (which npm calls ci) if you don't have a lock file.

The lock file represents your actual dependencies. Package.json should only be used to explicitly update said dependencies.


If you run `npm install` with no arguments, and you have a lockfile, it will make the node_modules folder match the lockfile. Try it.

    $ json dependencies.esm < package.json
    ^3.2.5
    # package.json would allow any esm 3.x >=3.2.5
    
    $ npm ls esm
    tap@12.5.3 /Users/isaacs/dev/js/tap
    └── esm@3.2.5
    # currently have 3.2.5 installed
    
    $ npm view esm version
    3.2.10
    # latest version on the registry is 3.2.10
    
    $ npm install
    audited 590 packages in 1.515s
    found 0 vulnerabilities
    # npm install runs the audit, but updates nothing
    # already matches package-lock.json
    
    $ npm ls esm
    tap@12.5.3 /Users/isaacs/dev/js/tap
    └── esm@3.2.5
    
    # esm is still 3.2.5
    
    $ rm -rf node_modules/esm/
    # remove it from node_modules
    
    $ npm i
    added 1 package from 1 contributor and audited 590 packages in 1.647s
    found 0 vulnerabilities
    # it reinstalled the removed package this time, at the locked version
    
    $ npm ls esm
    tap@12.5.3 /Users/isaacs/dev/js/tap
    └── esm@3.2.5
    # oh look, matches package-lock.json!  what do you know.
Now, if you do `npm install esm` or some other _explicit choice to pull in a package by name_, then yes, it'll update it, and update the package-lock.json as well. But that's not what we're talking about.

I often don't know what I'm talking about in general, but I do usually know what I'm talking about re npm.


Everything has evolved to get to that point. I suppose if you start with something modern like npm then it's not obvious how bad the earlier ones are. Compare the good ones with composer, dpkg, rpm, apt or dnf, to name a few examples.


Dub definitely does this (pretty much exactly the same: dub.json = Cargo.toml, dub.selections.json = Cargo.lock), and afaik CPAN does something similar.


Python does have something like this: conda [0].

It allows specifying dependencies with much of the same freedom you mentioned, in an environment.yaml file and other config files. You can provide arbitrary build instructions via shell scripts, use a community-led repository of feeds for stable and repeatable cross-platform builds of all libraries [1], generate the boilerplate automatically for many types of packages (not just Python) [2], and handle compiled version specifics with build variants / "features". You can also use pip as the package installer, inside a pip section in the conda yaml config file.

[0]: https://github.com/conda/conda

[1]: http://conda-forge.org/#about

[2]: https://conda.io/projects/conda-build/en/latest/source/resou...



