Understand Social Media's Role in Democratic Backsliding
It is one thing to say social media is bad; it is another thing to understand how social media is affecting our democracy and to respond to it.
It is quite fashionable among certain academics to stay away from social media, saying “Oh, it is not for me”, “I’m not tech-savvy”, “I find it very toxic”, and so on. While these reasons may all be true, anyone working in sociology, politics, economics, or the humanities has absolutely no way to justify their ignorance of social media. That is because the most important driver of information, misinformation, and disinformation in today’s democracies is social media (including WhatsApp, Twitter (X), Facebook, Instagram, and Reddit).
I watched While We Watched yesterday, and while it was sad and inspiring at the same time, it focused exclusively on Ravish Kumar’s journey through the downfall of mainstream media. There is a larger story to be told about the downfall of democracy in the era of internet-assisted media.
In this week’s Ann Friedman newsletter there are two very interesting links. One is to an account by Naomi Klein of her doppelganger’s slide into the world of conspiracies. Quoting from it:
This, obviously, is gonzo stuff, the kind of thing that makes those of us outside the Mirror World feel smug and superior. But here is the trouble: many of Wolf’s words, however untethered from reality, tap into something true. Because there is a lifelessness and anomie to modern cities, and it did deepen during the pandemic – there is a way in which many of us feel we are indeed becoming less alive, less present, lonelier. It’s not the vaccine that has done this; it’s the stress and the speed and the screens and the anxieties that are all byproducts of capitalism in its necro-techno phase. But if one side is calling this fine and normal and the other is calling it “inhuman”, it should not be surprising that the latter holds some powerful allure.
Then there’s the other: a study by a BBC journalist in which five profiles were created on social media and used to like, follow, and subscribe to different kinds of content according to the nature of each profile. Not surprisingly, they all turned out to be filter bubbles:
She hasn’t been surprised to see her right-leaning voters recommended content that supports Trump in the face of the indictments, and condemns them as a witch hunt, while left-leaning voters are shown content condemning Trump. Content about Joe Biden’s age has generally cut through the feeds of voters on both the left and right — but while on the right, content tends to portray him as “old and incompetent,” on the left she sees some praise of policies and (of course) the Dark Brandon meme (which has been co-opted by Biden’s team).
We have heard about filter bubbles for a long time. What’s quite surprising is that we’re not doing anything about them! We’re letting the world burn itself down, with one section of society believing in a completely different truth than another section of society. And we just keep pouring oil onto the fire. Where are the solutions? Who’s even bothered about the solutions?
There are different kinds of things people try.
One set of people does fact checking — people like Mohammed Zubair and Alt News. While this is important, does the fact check reach across the filter bubbles? And when governments get into fact checking themselves, that is an even bigger problem for democracies.
Another set of people are fighting against hate speech directly. This is, in my opinion, slightly more valuable. It helps to dampen the fire by making it unacceptable to cross certain boundaries.
Yet another group of people (especially the “I’m above this shit” class) stays out of social media. This, I’m sure, helps nobody except those who stay out.
So, what more can we do? I have my own suggestions.
For individuals
- Switch to chronological feeds wherever possible. On Twitter, that is the “Following” feed. On YouTube, turn off watch history and use the Subscriptions feed. On Instagram, tap the logo on the home screen to switch.
- Twitter allows private lists (at least for now). You can use them to get chronological feeds from accounts you curate. Create a private list of accounts from political opponents and see what narrative takes shape there.
- Do not engage with toxic content; engagement only gives it more reach.
- Follow and amplify content that you want to give reach. Be intentional!
For democracies
Regulate platform design
Many people scoff at the idea of holding platforms (intermediaries) accountable.
I think Elon Musk has given them ample reason to rethink. Platforms can become more toxic through small changes, like amplifying certain kinds of posts. Conversely, platforms can become more helpful to democracies if we force them not to make such changes.
For the unaware, the quote feature on Twitter is a simple example of how platform design affects behaviour. Think about it: you disagree with a post by amplifying it, and this gives more reach to exactly that kind of content. How badly must a platform be designed to promote this kind of user behaviour?
We should be thinking about things like:
- human-to-algorithm recommendation ratio: In any timeline, the content chosen by me (by following people, etc.) should be x times more than the content the algorithm suggests for me.
- mandatory negative flags (downvotes): Every platform should include an option to flag a piece of content as problematic that is just as easy to use as the option to amplify it. For every “retweet” feature there should be a “depromote” feature.
- Holding platforms to media standards if they’re doing content curation.
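To make the first idea concrete, here is a minimal sketch of how a feed could enforce a human-to-algorithm recommendation ratio. All names and the ratio value are hypothetical illustrations, not any platform’s actual implementation: the idea is simply that for every algorithmically suggested post, at least `ratio` posts must come from accounts the user explicitly follows.

```python
def mix_feed(followed_posts, suggested_posts, ratio=3):
    """Interleave posts so that each algorithmic suggestion is
    preceded by at least `ratio` posts the user chose to follow.
    (Hypothetical sketch of the human-to-algorithm ratio idea.)"""
    feed = []
    followed = iter(followed_posts)
    suggested = iter(suggested_posts)
    while True:
        # Take up to `ratio` posts from followed accounts...
        chunk = [post for _, post in zip(range(ratio), followed)]
        if not chunk:
            break  # no followed content left; stop rather than fill with suggestions
        feed.extend(chunk)
        # ...then allow at most one algorithmic suggestion.
        feed.extend(post for _, post in zip(range(1), suggested))
    return feed
```

For example, with a ratio of 2, four followed posts and two suggestions would interleave as two followed posts, one suggestion, two followed posts, one suggestion — so the user’s own curation always dominates the timeline. A regulator could audit such a ratio directly, which is what makes it an attractive design rule.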
Above all, people need to think seriously about the impact of social media on democracy and how we can change it.