The Trouble With Social Media

Social platforms offer a quick and easy way for sports and social groups to set up a web presence without any technical expertise, leveraging services that your members are often already using.

Facebook Pages make it simple to advertise your club’s activities and provide free, highly functional photo and video hosting. Facebook Groups effectively offer a forum facility and extend that rich media hosting to include documents.

What more could you ask for?

The problem, of course, comes when a large platform needs to moderate its content - when ne’er-do-wells start using Private Groups or other closed spaces to sell contraband (such as narcotics or black-market firearms), or to distribute filth such as child pornography.

With 2.2 billion users, it is inevitable that moderating such content will involve more than a modicum of automation. For what it’s worth, Facebook is not the only platform wrangling with this - Twitter struggles unendingly (and publicly) to decide what constitutes hate speech in different countries, whilst Tumblr recently banned all “adult” content (and with it some 25% of its userbase), having concluded that it was far too difficult to programmatically distinguish regular smut from child abuse. Whatever your feelings on “blue” content, the fact is that such content is in itself legal. Moreover, Tumblr had become a central point for the LGBTQ community, who felt more than a little aggrieved at their excommunication.

It’s not hard to see how the legitimate target shooting community can run into similar challenges when a platform finds itself trying to weed out illicit arms dealing, or to impose child-friendly filtering on “Weapons”, often conflating the topic with “Violence”. It’s the Scunthorpe problem all over again: automated filters tripping over innocent words that merely resemble forbidden ones. Don’t let the talk of AI and advanced heuristics fool you. Filtering content en masse is naive, dumb and still in its infancy as a discipline - the toy example below shows just how crude such matching can be.
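To make the point concrete, here is a toy sketch in Python of the kind of naive substring matching that produces these false positives. The blocklist is entirely made up for illustration - no platform is known to use these exact terms:

```python
# A deliberately naive content filter: flag any text containing a
# blocklisted term as a substring (the blocklist is illustrative only).
BLOCKLIST = ["gun", "pistol"]

def is_flagged(text: str) -> bool:
    lowered = text.lower()
    return any(term in lowered for term in BLOCKLIST)

# An innocent club announcement trips the filter, because "begun"
# happens to contain "gun" - the Scunthorpe problem in miniature.
print(is_flagged("Work on the new clubhouse roof has begun"))  # True
```

Real moderation pipelines are far more elaborate than this, but the failure mode - matching surface features rather than meaning - is the same one that sweeps legitimate shooting clubs up with the bad actors.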

And to compound the issue, it is easy to lose sight of the fact that these ubiquitous platforms are in fact private companies. Offering a service for free, they are at liberty to set any lawful Terms of Service. It’s very easy to feel aggrieved at being excluded from a widely used platform. But it’s their toy, and they can take it home whenever they like.

Human review will see common sense prevail. Right?

Well, you’d like to think so. Unfortunately not. Facebook and Twitter won’t Verify the NRA of Great Britain, despite the fact that it is a bona fide Sports Governing Body. By contrast, its US political namesake is blue-ticked. You would think that verifying obviously-legitimate organisations would make their lives easier in the long run, but they don’t see things that way.

The real recurring issues, however, seem to lie with Groups and “Community” functions. Facebook Pages for shooting-related organisations are rarely challenged, but Groups (particularly Closed Groups) are frequently hit - presumably because the algorithms struggle to differentiate discussion of legal activity from discussion of illegal activity. Over the New Year, Comber Rifle Club saw their Facebook Group suspended, although it was later reinstated. Not something you would want to rely on, then.

So, Social Media is more trouble than it’s worth?

Social Media is in puberty. Myspace, Facebook and Bebo launched in 2003, 2004 and 2005 respectively, putting them all in their early, troublesome teens. We’ve gone through the innocent youth of start-up and are now watching the difficult process of the (surviving) platforms maturing to deal with the real world. They are working out how to comply with disparate legal environments around the world on an internet which transcends borders - and, being Californian-based, they were founded on and remain informed by a particular world view. What would Facebook look like if it were run by Indians, Czechs or Finns?

And here perhaps is the problem - not one of evil corporations. The issue is not that one company is bad and another good (or at least less bad); it is a problem of centralisation - of control and of values. Some have proposed moving to platforms like MeWe, which promises a privacy-first approach. But MeWe is still just a proprietary platform run by a company in California. They talk a good talk as far as privacy goes, but if they ever gain traction at Tumblr or Facebook scale, they will undoubtedly start to run into the same problems of filtering and content moderation. It’s entirely possible that they could handle it better and more even-handedly than Facebook and Twitter, but sports shooters will inevitably be swept up in automated reports and flags as innocent victims.

This doesn’t mean that such platforms are not useful, but it seems unlikely that shooters can ever bet the farm on a proprietary platform. It’s too easy for a minor change in the T&Cs to result in communities being left high and dry.

It would be silly to say that there is no value in having a Facebook Page for your club - Facebook remains ubiquitous. But that Page should be considered expendable - something that backs up and drives traffic to a club website you control, and that can easily be replaced or disposed of should a new platform overtake Facebook.

Federation

One strong possibility is that federated networks will come to dominate. Federated Social Networks hark back to the origins of the internet - a decentralised service which anyone can join. Communities can cluster together whilst still being able to communicate with other communities, like the IRC networks of yore. This is far more powerful than traditional web forums built on vBulletin or phpBB (which are “siloed” and can’t talk to one another) and usually offers better media sharing - something traditional forums are notoriously bad at. Communities can interact (or not) whilst retaining control of their local environment, and even if a server or community is blacklisted by others in the federation, the service remains up for its local users. Diaspora represents an early attempt to replicate many elements of Facebook, but it built its own protocol from the ground up.

Mastodon and Pleroma are Twitter-a-likes, but they use the ActivityPub standard, meaning that a Mastodon instance can interact not only with other Mastodon servers but also (in a more restricted fashion) with other ActivityPub services such as Pleroma. These include messaging systems as well as applications like PixelFed (an Instagram-a-like image-sharing system) and PeerTube (a self-hosted video-sharing/YouTube replacement).
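For the curious, here is a minimal sketch of what that shared plumbing looks like. Every ActivityPub service publishes each account as an “actor” document in a common JSON format, which any other server can fetch. The account URL below is a hypothetical example, and the Python requests library is assumed:

```python
# Fetch an ActivityPub "actor" document - the common account format that
# lets Mastodon, Pleroma, PixelFed and PeerTube interoperate.
# The URL is a hypothetical example, not a real account.
import requests

ACTOR_URL = "https://mastodon.social/users/example"

resp = requests.get(ACTOR_URL, headers={"Accept": "application/activity+json"})
resp.raise_for_status()
actor = resp.json()

# The same core fields appear whichever software serves the account -
# this is what makes cross-service federation possible.
print(actor["type"])    # e.g. "Person"
print(actor["inbox"])   # where other servers deliver activities
print(actor["outbox"])  # the actor's public activity stream
```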

The way forward

There is a rapidly emerging imperative for clubs not to rely solely on a centralised third-party service. If you don’t control your store-front, you can’t control your public image. If your club doesn’t have its own website, it probably should.

There is also a case to be made for our Governing Bodies to take a lead. One problem with Mastodon, for instance, is that with no central governance there is no concept of blue ticks or validated accounts. A private NRA-run Mastodon instance could host accounts for affiliated clubs, effectively accrediting “comberrc@nra.org.uk” as the “legit” account in the Fediverse for Comber Rifle Club - the sketch below shows why.
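This accreditation would fall naturally out of how Fediverse handles resolve. Lookups go through WebFinger (RFC 7033), and only the web server at the handle’s domain can answer them - so whoever controls nra.org.uk controls which accounts exist under it. A rough sketch, using the hypothetical handle from above (neither the handle nor the server is real):

```python
# Resolve a Fediverse handle via WebFinger (RFC 7033). Only the web
# server at the handle's domain can answer this query, so the domain
# itself acts as the accreditation.
import requests

HANDLE = "comberrc@nra.org.uk"       # hypothetical handle from the text
domain = HANDLE.split("@")[1]        # the part after the @ does the vouching

resp = requests.get(
    f"https://{domain}/.well-known/webfinger",
    params={"resource": f"acct:{HANDLE}"},
)
resp.raise_for_status()

# The response points at the account's ActivityPub actor document.
for link in resp.json()["links"]:
    if link.get("rel") == "self":
        print("Actor document:", link["href"])
```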

Individual users would have to pick a server to join, whether that’s a generic public server like @mastodon.social, perhaps an @shooting.uk server, or indeed their own private server if they were feeling keen.