You are reading content from Scuttlebutt
@Christian Bundy %mWyghJajllFNbZb8UrZH1nME4/xemdectKntxnMAnpo=.sha256
Re: %Rm7Bo44sE

@juul @cel

I've been thinking about the robots.txt solution for a few days, and I think that limiting web interactions is a good idea, but I wonder whether something like send.firefox.com would work for us.

Instead of the current "copy external link" mechanism in the clients I've used, it seems to me that we could instead have a viewer that publishes the private messages it receives to the web. For example, let's say Alice is a user, Bob is a viewer, and Carol is the recipient.

  1. Alice wants to share a cat photo with Carol.
  2. Alice sends a private message to Bob with the cat photo.
  3. Bob replies with a link to the private message on the web viewer.
  4. Alice sends the link to Carol.
  5. Carol opens the link in a web browser.
  6. Bob retrieves the content and sends it to Carol.
  7. After 5 downloads or 24 hours, Bob deletes the blob(s).
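Step 7's expiry policy could be sketched as a small counter-plus-TTL store. This is just a sketch of the idea, not anything real: all names here are hypothetical, and an actual viewer would delete the underlying blobs through its blob store rather than an in-memory dict.

```python
import time

# Hypothetical expiry policy for Bob, the web viewer: a blob is served
# until it has been downloaded 5 times or 24 hours have passed,
# whichever comes first.

MAX_DOWNLOADS = 5
TTL_SECONDS = 24 * 60 * 60

class BlobStore:
    def __init__(self, now=time.time):
        self._now = now       # injectable clock, so expiry is testable
        self._blobs = {}      # blob_id -> [data, created_at, downloads]

    def put(self, blob_id, data):
        self._blobs[blob_id] = [data, self._now(), 0]

    def get(self, blob_id):
        entry = self._blobs.get(blob_id)
        if entry is None:
            return None
        data, created_at, _ = entry
        # Expire on age first...
        if self._now() - created_at >= TTL_SECONDS:
            del self._blobs[blob_id]
            return None
        # ...then count this download and expire after the fifth.
        entry[2] += 1
        if entry[2] >= MAX_DOWNLOADS:
            del self._blobs[blob_id]
        return data
```

The injectable clock is only there so the 24-hour rule can be exercised without waiting; the policy itself is exactly the two limits from step 7.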

This means that Carol's view of the network depends on what Alice can see, not what Bob can see, and it also means that randos can't search or scrape the network from the web. I'm not sure whether this is a good idea, but something with a similar shape has been banging around in my head for a few days. Regardless, it seems like there are only a few important things:

  1. Alice should be able to share content with Carol.
  2. Bob should be able to act as an intermediary.
  3. Randos shouldn't be able to scrape the whole network from the web.