@ev in #lifewithoutnpm

Imagine what would happen if npm went down tonight, and never came back up.

The npm apocalypse finally happens.

What will you do?

There are all sorts of reasons why this might happen. npm might run out of funding, decide they don't want to do it anymore, or maybe they just unpublish your favorite module because someone doesn't 'like' it anymore.

While there's nothing inherently wrong with npm, and it's got us this far, centralization is not resilient.

Many folks have brought up that npm may be the weakest link in the Node ecosystem. I agree.

I had this crazy idea tonight: why not just stop using npm?

So I just went and sudo pacman -Rns npm'ed. It's gone. Adios, npm!

I'm alone in the universe, without npm. Of course I still have git, pacman, and the Internet.

[screenshot: 2016-08-28-224125_1366x768_scrot.png]

This is a partial view of the node_modules folder for Patchbay. It's a lot. Some of the modules I recognize, many of them I don't. It shouldn't be too hard to update, but how will I figure out how to bring all of this down again without npm?

I guess I'm going to have to figure that out.

The goal is one month, September 2016, without npm. If I can't figure it out, I'll go back.

Does anyone have ideas about how to accomplish this? Is anyone already doing this?

@Du5t in #lifewithoutnpm
  1. string-replace all of the version strings with github URLs suffixed with #<release_version_you_want>. now you depend on github the same way you depend on npm, which might be dicey if you're in russia (sketch below).
    • you need to do this recursively as npm walks the dependency tree. probably it is best to flatten the dependency tree by pulling in all of the sub-dependencies into your package.json (wow such transparency).
  2. write a script that pushes all of the git repos you need to ssb-git as you fetch them. now you've caused us to subsidize all of the effort expended by CDNs.
  3. profit

i joke, you can spread things out by mirroring repos at gitorious and gitlab et cetera. you need to write a bunch of jobs that do this regularly with each git repo tho.
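
a rough sketch of step 1 (untested; the OWNERS map is hypothetical and you'd build it from each dep's "repository" field, and it assumes release tags look like v1.2.3):

  const fs = require('fs')

  // hypothetical map from package name to github repo; in real
  // life you'd scrape each dep's "repository" field to build it
  const OWNERS = { 'pull-stream': 'pull-stream/pull-stream' }

  const pkg = JSON.parse(fs.readFileSync('package.json', 'utf8'))
  Object.keys(pkg.dependencies || {}).forEach(function (name) {
    // npm accepts the shorthand owner/repo#committish, so point
    // each dep at a git tag instead of the registry
    const version = pkg.dependencies[name].replace(/^[\^~]/, '')
    pkg.dependencies[name] = OWNERS[name] + '#v' + version
  })
  fs.writeFileSync('package.json', JSON.stringify(pkg, null, 2) + '\n')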

@celly in #lifewithoutnpm

string-replace all of the version strings with github URLs suffixed with #<release_version_you_want>.

could also replace the version strings with URLs to tarballs hosted at ipfs.io, localhost:something, etc
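
npm also treats any https URL to a .tgz as a valid version spec, so the rewrite above could just as well produce something like this (hash and port are placeholders):

  // placeholder hash/port, just to show the shape
  pkg.dependencies = {
    'pull-stream': 'https://ipfs.io/ipfs/<hash>/pull-stream-3.4.5.tgz',
    'some-module': 'http://localhost:8080/some-module-1.0.0.tgz'
  }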

@Dommy in #lifewithoutnpm

lol, well when San Francisco is the epicenter of a zombie outbreak, github is gonna be a problem too.

@ev I have npm set up to always use the cache unless I say otherwise.

npm config set cache-min 999999999

then it doesn't go online unless it must.
if you want to force an update, use

npm install ... --cache-min 0

Imagining how @cel would fix npm: maybe something like a script to insert modules published over ssb into the local cache, so they can then be installed with the standard npm client.
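
Something like this, maybe (an untested sketch: it assumes the npm 2/3 cache layout of ~/.npm/<name>/<version>/package.tgz, and tarballFromSsb is a made-up stand-in for whatever pulls the blob out of your log):

  const fs = require('fs')
  const path = require('path')

  // drop an ssb-fetched tarball where npm's cache expects it; with
  // cache-min set high, npm install should then find it offline
  function insertIntoCache (name, version, tarball) {
    const dir = path.join(process.env.HOME, '.npm', name, version)
    fs.mkdirSync(dir, { recursive: true })
    fs.writeFileSync(path.join(dir, 'package.tgz'), tarball)
  }

  // tarballFromSsb is hypothetical
  insertIntoCache('example-module', '1.0.0', tarballFromSsb('example-module'))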

@kemitchell in #lifewithoutnpm

It's fairly common to mirror the npm public registry. Hence all those unused packages with >0 downloads on npmjs.com. Not just companies do it! You can fit all the public tarballs on one external hard drive.

As for the meta, npm recently set up an additional, stable, authenticated CouchDB-style replication server at replicate.npmjs.com. You can search for "npm follower tutorial" to get a taste. I'm experimenting with precomputing flat package trees here. So far, so good.
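
The skeleton of a follower is tiny. A deliberately naive sketch (a real one would use the follow module, buffer lines split across chunks, and checkpoint since itself; the query params are stock CouchDB, which I haven't verified against replicate.npmjs.com):

  const https = require('https')

  // tail the registry's CouchDB-style _changes feed; each line is a
  // JSON doc whose id is a package name, enough to drive a mirror.
  // (naive: a line can arrive split across two chunks.)
  https.get(
    'https://replicate.npmjs.com/_changes?feed=continuous&since=now',
    function (res) {
      res.on('data', function (chunk) {
        chunk.toString().split('\n').filter(Boolean).forEach(function (line) {
          try { console.log(JSON.parse(line).id) } catch (e) {}
        })
      })
    }
  )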

Full disclosure: npm's a client. I can't talk about specific takedown requests or policies that are still rolling around internally. But the good news is a great deal of what there is to know is online anyway: on the blog, in the community policies, and in CLI GitHub issues. And it's still a really small company.

As for changes, they are very much on-board with CAS as a strategy, though Forrest's been pretty tied up dealing with an FS bug on Node 6 of late. As for distro, they lean heavily on a CDN ... Fastly, IIRC. The CLI team's been very supportive of experiments with IPFS and other systems, but it's hard to make an argument for the perf effects on "typical" users. I've heard something about dep management via Nix, too, but haven't looked into it yet.

Hacking package systems is loads of fun. Just don't read too much about CPAN. Spoils all the surprises ;-P

@the_linker in #lifewithoutnpm

I would love to test this with all of you...!!

I had to deal with a lot of broken/questionable networks recently, and a couple of times it stopped me from doing maintenance/reinstalls of my npm packages... :-/

@ev in #lifewithoutnpm

@kemitchell Thanks for weighing in on this. I'm not sure that I want to mirror npm for my experiment.

I've tried the tool someone (mixu?) built that catches all of the modules you download from npm and caches them locally (I used it once when I had really bad Mexican Internet), but that isn't quite what I want either.

Mirroring all of npm seems to be a lot of data, and I only want to deal with the modules that I actually use -- which as you mentioned is a tiny fraction of what is actually on npm.

Most of all I don't want to depend on npm, because what if they go away someday? Plus the point of my experiment is to try to live without npm!

@ev in #lifewithoutnpm

@cel I have thought about, and in some cases experimented with, using IPFS for this, but I kept running into roadblocks that weren't supposed to be happening with IPFS.

My idea with IPFS was just to ipfs add -r node_modules, get the hash, and then bring it down again. However, everything went off the rails when IPFS refused to bring files down to my computer from my VPS.

From what I can tell IPFS either didn't, or doesn't, handle merkle dags over a certain size.

IPFS would be ideal for this, because two different people who merkle-dag the same content get the same hash. This means you can merkle-dag your node_modules folder, and if you have the same modules as someone else, you can both host the same files even if you're not working on the same project.

I just wish it worked though... I can try it again and see if IPFS has fixed whatever the issue was; it's been a month or two.

@ev in #lifewithoutnpm

Ok, I'm trying this with the latest version of IPFS, because if it worked that'd be very cool.

If you want to get ALL of the node_modules for evbogue.com try

% ipfs get QmaNqEZ5P7bAvh9nyL8Ka8CUsGWKtg1MFHw4kP5542cJab

I merkle-dagged them on my server, so they should be available on your network in an ideal world. And from Mars, of course.

@ev in #lifewithoutnpm

ipfs.io works...

http://ipfs.io/ipfs/QmaNqEZ5P7bAvh9nyL8Ka8CUsGWKtg1MFHw4kP5542cJab

@ev in #lifewithoutnpm

This is a good sign on my local machine:

Saving file(s) to QmaNqEZ5P7bAvh9nyL8Ka8CUsGWKtg1MFHw4kP5542cJab
 2.00 MB / 28.16 MB [===>----------------------------------------------------]   7.10% 11m47s

@ev in #lifewithoutnpm

But then it just gets stuck at

Saving file(s) to QmaNqEZ5P7bAvh9nyL8Ka8CUsGWKtg1MFHw4kP5542cJab
 5.00 MB / 28.16 MB [==========>-------------------------------------------------]  17.75% 0s

Forever, and it never finishes. I guess IPFS only lets my server upload 5 MB and then cuts me off?

@ev in #lifewithoutnpm

Ok, looks as if it might finish

26.00 MB / 28.16 MB [===================================================>----]  92.32% 2m59s

When all is said and done it took more than an hour to download 28.16 megabytes over IPFS.

@ev in #lifewithoutnpm

Uh oh, this isn't good! Lol.

 31.00 MB / 28.16 MB [=======================================================] 110.07% -3m57s

@the_linker in #lifewithoutnpm

Forever, and it never finishes. I guess IPFS only lets my server upload 5 MB and then cuts me off?

This is the rate limiting I complained about in %fy97NuN.... If only one peer has the data, it's cumbersome to get it out... sometimes I trick it by requesting from other peers (public gateways), but most of the time it's too tiresome.

@ev in #lifewithoutnpm

@cryptix I'm happy to say that it finished eventually, however it did seem to stall for 15-25 minutes every 2 megabytes.

Restarting my IPFS daemon on my VPS may have helped it complete faster?

I can't figure out if this is a fundamental flaw with IPFS or if it's a temporary hurdle they're trying to overcome. I hope that it's temporary...

@kemitchell in #lifewithoutnpm

@ev Thanks for your note. I'm a fan of your experiment!

One approach you might consider, which a few folks (including me) have played around with, is "flattening" out your dependency tree, adjacency list style. @mikola first turned me onto this!

Instead of:

node_modules/
  a/
    package.json for 1.0.0
    index.js
    node_modules/
      b/
        package.json for 1.0.0
        index.js

Try:

node_modules/
  a@1.0.0/
    package.json
    index.js
    node_modules/
      b --(symlink)--> ../../b@1.0.0
  b@1.0.0/
    package.json
    index.js
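
In code, the bookkeeping might look like this (a sketch; resolve() is a hypothetical helper that semver-matches a range against the versions already unpacked in the root):

  const fs = require('fs')
  const path = require('path')

  // lay a package down as node_modules/<name>@<version> and wire
  // its deps up as relative symlinks, adjacency-list style
  function linkPackage (root, name, version, deps) {
    const nm = path.join(root, name + '@' + version, 'node_modules')
    fs.mkdirSync(nm, { recursive: true })
    Object.keys(deps).forEach(function (dep) {
      // resolve() is hypothetical
      const target = path.join('..', '..', dep + '@' + resolve(root, dep, deps[dep]))
      fs.symlinkSync(target, path.join(nm, dep))
    })
  }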

I've heard a rumor that ied does something similar, but can't confirm.

The approach has a few nice properties:

  1. node_modules is as little-nested as possible.
  2. It's fairly easy to clone or unpack modules into the system.
  3. I find it much, much easier to reason about.

There are also some side effects.

  1. If you symlink each package to the latest versions that match its dependencies, and update symlinks as you discover newer versions of its dependencies, you can always point to the same directory for one package@version. No need to install a package (or its dependencies) many times for many projects.
  2. However, if you follow that approach, you may end up with more versions of a single dependency than npm install would give you. That can mean extra bloat in browser bundles and require() delays in large dep graphs.

I'm sure there are more surprises. I'm learning as I go.

@Dommy in #lifewithoutnpm

I've discussed alternative architectures for a faster, more replicated npm with CJ (who is now npm's CTO), and it sounds like they are now actually about to build something like what we talked about.

there are now several alternative npm clients, but this one is mine.

https://github.com/dominictarr/npmd

@celly in #lifewithoutnpm

@dominic
nice tip on the cache - i am going to try using that.

Imagining how @cel would fix npm: maybe something like a script to insert modules published over ssb into the local cache, so they can then be installed with the standard npm client.

that's an interesting idea. i was thinking instead to have a wrapper command for the npm client that runs an ephemeral local registry server and then calls the client telling it to use that server. that would allow for making all the npm commands work. then implement the registry api backed by ssb.
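
roughly this shape (a sketch; packumentFromSsb is made up, it would assemble the registry document for a package out of ssb messages):

  const http = require('http')
  const { execFileSync } = require('child_process')

  // minimal registry: the npm client asks GET /<name> for the
  // package document (versions + tarball urls), so serve just that.
  // the tarball urls inside could point back at this same server,
  // backed by ssb blobs.
  const server = http.createServer(function (req, res) {
    const name = decodeURIComponent(req.url.slice(1))
    res.setHeader('content-type', 'application/json')
    res.end(JSON.stringify(packumentFromSsb(name))) // hypothetical
  })

  server.listen(0, function () {
    const registry = 'http://localhost:' + server.address().port
    execFileSync('npm', ['install', '--registry', registry], { stdio: 'inherit' })
    server.close()
  })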

looking at the npm source now... maybe it would be easier to try to hack npm to run while replacing its npm-registry-client dependency with the custom registry implementation

@Dommy in #lifewithoutnpm

@cel the only tricky bit is to remember that it's enabled: when install can't find a required version, you need to turn it off for that install with --cache-min 0

This would be much easier if npm had a way to represent multiple registries that was better than just an http or git url.

I think it might work out cleaner to have an ssbpm that could be federated to npm, so you could install scuttlebot via either, but it would be ssb-first, and the npm publish would be generated from that. An ssb-npm is gonna need to keep feed ids & message hashes in the package.json.
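
Pure speculation about what those extra fields might look like (the truncated ids are placeholders; none of these keys exist anywhere yet):

  {
    "name": "scuttlebot",
    "version": "9.0.0",
    "ssb": {
      "feed": "@EMovhfIr...=.ed25519",
      "blob": "&Pe5kTo/V...=.sha256"
    }
  }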

I'm not sure what the simplest way to start would be. If we made a system that installed node_modules that was purely ssb, we could shift everything we need into it. But to put everything on npm onto ssb is a different question, just because npm is getting really quite large now.

@ev in #lifewithoutnpm

I don't mind having access to only the necessary node modules that I need, so ssbpm makes a lot of sense to me.

@slothbag in #lifewithoutnpm

Have you looked into https://github.com/whyrusleeping/gx ? It's an extensible package manager written in Go that uses IPFS under the hood.

Rather than creating a single MerkleDAG with the entire NPM repo in it, devs would create their own gx packages for each of their modules, they can then add dependencies to other gx packages from other devs. gx can then resolve dependencies and download only what you need.
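
For flavour, a gx package.json looks roughly like this (reconstructed from memory of go-ipfs's own, so treat the exact field names as unverified; the hash is a placeholder):

  {
    "name": "my-module",
    "version": "1.0.0",
    "language": "js",
    "gxDependencies": [
      { "name": "other-module", "hash": "Qm...placeholder", "version": "2.1.0" }
    ]
  }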

It's used primarily for IPFS development, but you can extend it to support any language. I'm using it for C#, to replace NuGet and its centralized repo.

@Dommy in #lifewithoutnpm

@slothbag how does it handle semver? it looks like you have to work with hashes?

@slothbag in #lifewithoutnpm

The readme says 'Built-in semantic versioning', so I guess it's supported somehow. I've primarily just been using direct hashes and updating packages as I need to.

@slothbag in #lifewithoutnpm

My project has many mixed dependencies, like compiled C# binaries, js libs, etc. I can install them all with a single 'gx i' command, and the hooks will apply for each different language. So C# DLLs might automatically be referenced in my solution, and js files might automatically be uglified on any install/update command.

@ev in #lifewithoutnpm

@slothbag I've researched gx, but haven't quite been able to grok it.

What is your approach for js dependencies?

@slothbag in #lifewithoutnpm

I'll copy a js module either via git or npm, create a custom gx package.json for it with the same version string as the original, and then publish it. Then I can reference it from any of my projects on any machine anywhere (as long as they have access to IPFS).

Obviously if/when the jquery team start to publish their own gx package i'll reference that instead :)

Here's my package for jquery for example https://ipfs.io/ipfs/QmdVSi27zJ8dLs6Yf272cYJxKvmk3vBwqWGrMqiXEgaGL1

@ev in #lifewithoutnpm

@slothbag How do you reference a non-go object such as this in gx? How would you put it into a node_modules folder? I think that's where I got stuck, but it's been a while.

@slothbag in #lifewithoutnpm

gx creates its own datastore/folder for all its packages, so I guess you could move away from node_modules and use the gx folder instead.

But you could easily create a gx-js hook that copies the contents into node_modules.

Here's a starting point: https://github.com/sterpe/gx-js

@bret in #lifewithoutnpm

an {ssb,dat,ipfs}pm would be fantastic. Just as long as it doesn't fill my pub's hard disk immediately.

@ev in #lifewithoutnpm

This issue may be resolved

@kek 🐸 in #lifewithoutnpm

Just weighing in on the IPFS side of things: There is this problem that the blob size is limited. For files, clever chunking algorithms have been found and are in use, but not yet for directories.

This feature is required to be able to put all those modules in a single folder. People trying to mirror Wikipedia on IPFS ran into the same problem (idk, maybe they are the ones who filed the first bugs on that?)

Here is the relevant issue: https://github.com/ipfs/go-ipfs/pull/3042

There has been quite some work on that. I believe it will ship in the next release or the release after that.