Opening a Can of Worms

Yes, it's another 10Cv5-related blog post. This time I get to ramble incoherently about one of the first functions that was coded into the API but has not yet appeared in the beta, thanks to the can of worms that was opened when the feature was first written, a can of worms that ultimately led me to create v5 rather than build on top of v4: following websites.

One of the first thoughts that I had when reading through a lot of the IndieWeb documentation was something like: "This would all be a whole lot simpler if our RSS readers were also our blogging tools". With that thought firmly implanted, I started sketching out how an RSS reader might look if it also had facilities like those you'd find in a blogging application such as MarsEdit. Images of late-90s-era Netscape UIs bounced around in my head, where single applications tried really hard to do five things but accomplished none of them. A reading application is no more a place to write than a novel … right?

The more I thought about this, the more I saw the RSS reader as belonging to the social elements more than anything else. A syndication feed of "informal posts" like one would find on any modern social network is not that different from an RSS application where every update is shown in the same scrollable view. This is essentially how the big social networks operate today, with personal items from friends and family being broken up by a blog post from a website we're interested in. On App.Net, a person could have syndication feeds from websites appear in the timeline by creating an account, hooking up an RSS feed via PourOver, and following that "bot" account. This works to a certain degree, but it is annoying as heck for anyone who wants to follow a bunch of websites.

As someone who has created social networks, I can say that hosting a bunch of bot accounts is a lose-lose proposition. These things consume far more resources than even the most expressive fans of the platform and give nothing of value back to the people who use the system. Oh, sure, many people can follow an account named @BBC for posts from various BBC feeds, but there's no interaction possible beyond "comments to nowhere". Then there's the problem of "who owns the ever-growing collection of content in the database?" Is it the person who created the syndication account? Is it the website owner? How can a website owner even know that someone on a tiny social network like 10C created an account in their name?

Looking at the notebook where I first started sketching out how a feature like this might work, I asked 14 hard questions about content ownership, ancillary distribution, and phantom archival. At the end of the day, I did not want these bot accounts on 10C. The risk of copyright problems was just too great, and this was before the recent EU legislation on copyright was passed into law.

Having RSS feeds from external sites appear in our timelines is an attractive idea, though. So why not give people the ability to "follow" websites? The RSS feed would be read on a per-account basis and kept only for that account. As the RSS feed cycled, older items would fall off the end and newer items would appear at the front. Nothing would be saved in the database and, if a website disappeared from the Internet or otherwise stopped distributing their content via the old standards, the information held by 10C would naturally expire and disappear. Content ownership would continue to be 100% controlled by the content owner, as it should be. As these posts would appear in a social timeline, they could be commented on. IndieWeb standards make it possible to send a WebMention to the content owner's site, letting them know that someone has written a comment. If that site follows IndieWeb standards, a handshake will take place and the comment will be recognized. The 10C database would hold onto my comment, along with a link to the source content that I commented on, but not the other person's post data, as it does not belong to me.
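For anyone curious what the ephemeral part of this might look like in practice, here is a rough sketch in Python. The function names, the in-memory cache, and the use of the `feedparser` and `requests` libraries are all my own assumptions for illustration rather than what v5 actually does, but the general shape is the same: re-read the feed periodically, keep the items only as long as the publisher does, and send a WebMention when a comment is written.

```python
import time

import feedparser  # lenient feed parser; an assumption, not necessarily what 10C uses
import requests    # HTTP client, also an assumption

FEED_TTL_SECONDS = 15 * 60   # how long items are held before re-reading the source
_feed_cache = {}             # (account_id, feed_url) -> (fetched_at, items)


def followed_feed_items(account_id, feed_url):
    """Return the current items for one account's followed website.

    Nothing is written to a database: when the publisher drops an entry
    from their feed, it simply stops appearing here on the next refresh.
    """
    key = (account_id, feed_url)
    cached = _feed_cache.get(key)
    if cached and time.time() - cached[0] < FEED_TTL_SECONDS:
        return cached[1]

    parsed = feedparser.parse(feed_url)
    items = [{"title": e.get("title", ""), "url": e.get("link", "")}
             for e in parsed.entries]
    _feed_cache[key] = (time.time(), items)
    return items


def send_webmention(source_url, target_url):
    """Tell the content owner that a comment (source) links to their post (target)."""
    # Discover the WebMention endpoint from the target's Link header, if present.
    head = requests.head(target_url, timeout=10, allow_redirects=True)
    endpoint = head.links.get("webmention", {}).get("url")
    if not endpoint:
        # A complete client would also look for <link rel="webmention"> in the HTML.
        return False

    reply = requests.post(endpoint,
                          data={"source": source_url, "target": target_url},
                          timeout=10)
    return reply.status_code in (200, 201, 202)
```

The point of the cache is that it is the only place the followed content lives: let it expire and the content is gone, exactly as it should be.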

This is how people using self-hosted versions of 10C will be able to communicate with each other in a decentralized fashion, and this is how people using 10C will be able to communicate with people using a self-hosted WordPress blog, or micro.blog, or TheGuardian.co.uk … if they were to ever embrace some IndieWeb interactivity.

The can of worms is still very much open, and there are a number of issues that still need to be handled, such as parsing really bad XML and JSON syndication feeds. That said, the technical hurdles are always the easiest to overcome. The complexity is always in how people respond and react. 10C's approach to RSS is rather conservative and perhaps a bit over-cautious, but it may be the most logical way forward given how governments around the world are slowly closing off the web.
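As a taste of that particular worm, below is a minimal sketch of the kind of defensive parsing involved, using only Python's standard library. The field names assume a JSON Feed-style "items" array and an RSS 2.0-style `<item>` layout, which is an assumption on my part; anything that doesn't fit is swallowed and returned as an empty list rather than being allowed to break someone's timeline.

```python
import json
import xml.etree.ElementTree as ET


def parse_feed(raw_text):
    """Best-effort parsing of a syndication feed that may be JSON, RSS, or garbage."""
    # Try JSON Feed first: a well-formed document has an "items" array.
    try:
        data = json.loads(raw_text)
        if isinstance(data, dict) and isinstance(data.get("items"), list):
            return [{"title": i.get("title", ""), "url": i.get("url", "")}
                    for i in data["items"] if isinstance(i, dict)]
    except ValueError:
        pass

    # Fall back to XML. Real-world feeds are often malformed, so any parse
    # error is treated as "nothing usable" instead of an exception.
    try:
        root = ET.fromstring(raw_text)
    except ET.ParseError:
        return []

    items = []
    for item in root.iter("item"):   # RSS 2.0-style entries
        title = item.findtext("title", default="")
        link = item.findtext("link", default="")
        items.append({"title": title.strip(), "url": link.strip()})
    return items
```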

This feature will be going live on the v5 Social Beta site in the next few days, and it will be interesting to see how well it operates when there is more than a single person using it.