@cel %Bt/X5dbcP0XIJZkMteI7P7H0vw5X+9YMm9OnetOSoTI=.sha256
Re: %VU1C/HpB0
Content warning: thought process

@mixmix %VU1C/Hp... %iyzqSg4...

Currently, viewer.scuttlebot.io (ssb-viewer) and git.scuttlebot.io (git-ssb-web) serve blob requests (with or without URL-encoding). Another subdomain could be added just for blobs. (Side question: for missing blobs, should it want them or return 404?)
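For illustration, both of these forms should resolve to the same blob on the existing viewer host (using the installer blob id from below; just a sketch, assuming the blob route accepts both):

curl -s -o /dev/null -w '%{http_code}\n' 'https://viewer.scuttlebot.io/&ErUlQYegJprhEZUNO/HwHQ2UES+XUpT1XIb27GTGjT0=.sha256'
curl -s -o /dev/null -w '%{http_code}\n' 'https://viewer.scuttlebot.io/%26ErUlQYegJprhEZUNO%2FHwHQ2UES%2BXUpT1XIb27GTGjT0%3D.sha256'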

For the installer, I then wonder how the script picks which host to fetch subsequent blobs from. &e6TJkl1... is hard-coded to use localhost:8989. &ErUlQYe... lets that be overridden with an env var, $BLOBS_PREFIX (but it doesn't verify the hash of the received data).
So you can use the existing script like this (trusting the remote server to send the correct blobs):

export BLOBS_PREFIX='https://viewer.scuttlebot.io'
curl -s "$BLOBS_PREFIX/&ErUlQYegJprhEZUNO/HwHQ2UES+XUpT1XIb27GTGjT0=.sha256" | sh
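A rough, untested sketch of how the script could verify the fetched blob against its id before running it, given that the blob id is the base64 of the sha256 of the content:

blob_id='&ErUlQYegJprhEZUNO/HwHQ2UES+XUpT1XIb27GTGjT0=.sha256'
expected="${blob_id#&}"; expected="${expected%.sha256}"
curl -s "$BLOBS_PREFIX/$blob_id" > /tmp/install-sbot.sh
# sha256sum prints hex; convert to base64 to compare against the blob id
actual="$(sha256sum /tmp/install-sbot.sh | cut -d' ' -f1 | xxd -r -p | base64)"
if [ "$actual" = "$expected" ]; then sh /tmp/install-sbot.sh; else echo 'blob hash mismatch' >&2; fi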

I experimented with making the script detect the URL it is being fetched from, using pgrep -fa curl, but that did not work: curl has already exited by the time the script runs. I don't suppose there is a way to recover the exited process's name from the file descriptor. So it seems the thing to do is to hard-code a server URL (or URLs), maybe take some input from the environment, or from something in ~/.ssb… or maybe not. Something like the fallback order sketched below is what I have in mind.
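(The ~/.ssb/blobs-prefix file here is hypothetical; nothing reads it today.)

# Hypothetical fallback order: env var, then a file under ~/.ssb, then the local blobs port
BLOBS_PREFIX="${BLOBS_PREFIX:-$(cat ~/.ssb/blobs-prefix 2>/dev/null)}"
BLOBS_PREFIX="${BLOBS_PREFIX:-http://localhost:8989}"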

It looks to me, then, that this has to be a separate installer script from the localhost one: if fetching via localhost:8989 is viable, I think that should be used, but I don't see a way for the script to make that decision.
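One possible (if imperfect) approach would be to probe the local blobs port and fall back to a public host, e.g. (blob_id as in the sketch above):

# Sketch: use localhost:8989 if it answers for the installer blob, otherwise a public viewer
if curl -sf -o /dev/null "http://localhost:8989/$blob_id"; then
  BLOBS_PREFIX='http://localhost:8989'
else
  BLOBS_PREFIX='https://viewer.scuttlebot.io'
fi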
