@Lucas I think we should definitely be exploring this. Some thoughts:
Implicit vs. Explicit: we can interpret something like Twitter or Facebook or the web as a "reputation network", but most people don't actually think about that while they use it. So: build an implicit reputation network. People will use it because it works for them; you don't need to pitch them on "reputation network".
For example, Google traverses the links on all the websites it finds, analyzes that graph, and uses it to determine a ranking for every website. That ranking score can be interpreted as "reputation". The authors of those links may not be thinking "here is a reputable site" or "I want to increase this website's score", but you can interpret a link as an expression of trust; not so much because of the link itself, but because of all the sites they chose not to link to. Linking is an action that expresses trust (it involves a risk), not just a statement "I trust X", which is pretty cheap.
Another really interesting thing about PageRank is that the graph represents a mathematical model of how humans "surf" the internet. Refresher: before search engines were any good, you'd just click around from link to link, go to some site, follow its links, etc. PageRank simulates this, calculating the probability that you'd land on a given page if you spent your life just randomly clicking on links. I think the key here is that PageRank is based on a model of what humans already do, it just makes it faster.
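To make that concrete, here's a minimal sketch of the "random surfer" idea (the toy pages, link structure, and parameters are all made up for illustration; real PageRank uses the same model but computes it analytically over the whole web graph):

```python
import random

# A tiny toy link graph: page -> pages it links to (all made up).
links = {
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
    "d": ["c"],  # d links out, but nothing links to d
}

def random_surfer(links, steps=100_000, damping=0.85, seed=42):
    """Estimate PageRank by simulating a surfer who usually follows a
    random outgoing link, but sometimes gets bored (probability
    1 - damping) and jumps to a random page anywhere on the web."""
    rng = random.Random(seed)
    pages = list(links)
    visits = {page: 0 for page in pages}
    page = rng.choice(pages)
    for _ in range(steps):
        visits[page] += 1
        outlinks = links[page]
        if outlinks and rng.random() < damping:
            page = rng.choice(outlinks)  # follow a link
        else:
            page = rng.choice(pages)     # bored: jump anywhere
    # Fraction of a "lifetime" of clicking spent on each page.
    return {page: count / steps for page, count in visits.items()}

ranks = random_surfer(links)
```

In this toy graph "c" ends up with the highest score (everyone links to it) and "d" with the lowest (nobody links to it), which is exactly the "reputation from links" interpretation above.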
Gameability - how easy is it to create the appearance of trust/value?
Systems like ebay have "feedback", which seems easy to game, but works in ebay's case because ebay charges a transaction fee, which makes it expensive to create a fake reputation. For example: I could create a bunch of ebay accounts and have my sock puppet accounts "buy" from me, then give me good feedback, so it would appear I'd made many transactions. However, even though I don't post the goods (to myself), I'd still have to pay the ebay transaction fee! So if you see someone on ebay who has made thousands of transactions, the fees to fake those may add up to more than they could make by ripping me off on one transaction, so I can be pretty confident in making that purchase.
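The back-of-envelope version of that argument (all the numbers here are made-up assumptions, not real ebay fees):

```python
# Cost to fake a reputation vs. payoff from one scam.
# Every number below is an illustrative assumption.
fee_rate = 0.10          # assume a ~10% transaction fee
fake_sale_price = 20.00  # price of each fake "sale" to my sock puppets
num_fake_sales = 1000    # feedback count I want to appear to have

# Even though no goods ship, the fee is paid on every fake sale.
cost_to_fake = num_fake_sales * fake_sale_price * fee_rate  # 2000.0

scam_payoff = 500.00     # what I'd gain by ripping off one buyer

# Faking the reputation costs more than the scam pays out, so a
# seller with ~1000 transactions probably isn't a sock puppet farm.
worth_it = scam_payoff > cost_to_fake  # False
```

The point is just that the fee turns each unit of fake reputation into a real cost, so trust scales with how much the identity has provably spent.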
In a decentralized system like ssb, an identity is just a key pair, and I could generate thousands of those a second, so it would be really easy to create sock puppets that appear to "trust" me. Any realistic reputation system would need a way to rule this out. A straight sum of all the feedback probably wouldn't work, but something that traced a social graph might.