I feel like the more likely risk is that the lies and misinformation will continue to be spread mainly on existing networks and through a wide range of different commentators, rather than through one single network or leader.
Losing Trump as the lightning rod may have some impact, and some of the recent actions by social media sites may have some impact, but I’m not sure they’ll be enough to kill it off. I suspect it’ll be like a hydra and new heads will just keep springing up.
Hopefully it’ll just weaken and splinter over time to the point where it’s no longer easily accessible by many people.
But, to me at least, it’s kinda unbelievable how far and wide things like Qanon have spread. Some of it must have spread through Trump’s 100M-ish followers on Twitter, but that would have just been the entry point; people must have been getting the detailed stuff from other sources.
Some of the more obvious central points have been blocked recently (such as Reddit, etc…), but I suspect that with the way social media site recommendations work, if you’re the right kind of person you won’t even have to look too far to find it. The platform will surface like-minded content for you.
For example, last year I was watching a fair number of Star Wars reviews and opinions on YouTube. One of the ones I watched started reasonably, before descending into pretty right wing ‘SJWs ruined Star Wars!’ stuff. But once I’d seen that one (which I didn’t even finish), I had about a month of YouTube recommending me questionable-looking content from the same channel and similar channels.
So anyone who’s watching Fox News or similar content on YouTube, or other similar sites, is likely to get exposed to similar content, and then conspiracy content, and then more extreme views.
My feeling is that the way social media recommendation algorithms work can be very similar to the way indoctrination works. It can start small and innocently, and then some small part can appeal, and then it’ll add similar things that reinforce that part, and then it’ll add more extreme views, and then it’ll reinforce those, and so on.
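To make that loop concrete, here’s a deliberately simplified toy model (my own sketch, not any real platform’s algorithm). Each piece of content gets a made-up “extremeness” score from 0.0 (mainstream) to 1.0 (fringe); the recommender only offers items similar to what you just watched, and the simulated viewer always clicks the edgiest thing on offer. Neither step looks dramatic on its own, but together they walk the viewer steadily toward the fringe:

```python
# Toy model, assumed for illustration: content is just a number in [0, 1]
# representing how extreme it is; similarity is distance between scores.

def recommend(pool, last_watched, k=5):
    """Return the k unwatched items most similar to the last thing watched."""
    return sorted(pool, key=lambda item: abs(item - last_watched))[:k]

def simulate(n_items=200, n_sessions=40, start=0.1):
    pool = [i / n_items for i in range(n_items)]  # content evenly spread across the spectrum
    last = start                                  # viewer starts near the mainstream end
    history = [last]
    for _ in range(n_sessions):
        options = recommend(pool, last)
        last = max(options)   # viewer clicks the most extreme of the "similar" suggestions
        pool.remove(last)     # a watched video isn't recommended again
        history.append(last)
    return history

history = simulate()
# Every individual recommendation was "similar" to the last watch,
# yet the history ratchets steadily upward toward more extreme content.
```

The point of the sketch is that no single recommendation looks alarming; the drift only shows up when you look at the whole trajectory.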
At least with something like Parler, people had to actively go out and find it and install it, so it didn’t have the same risk of people being gradually indoctrinated into that world. If it wasn’t for the fact it was being used to actively plan violence, I’d have been kinda happy to let them have it as it would have kept them separated.
I see that social media platforms are getting some flak for banning Trump, especially Twitter. A couple of UK politicians criticized it, Angela Merkel criticized it, etc…
It is a thorny issue.
I think they made the right decision, and in fact I think they should have done it earlier.
I also think it’s a bit weird that people consider Trump losing his Twitter account to be somehow silencing him. He’s the frickin President of the USA; if he wants to make public statements he has many ways to do that. He has an entire department to do that. He has an entire press corps to do that. Previous US presidents have managed to make statements without resorting to Twitter!
But I do get why people are worried, as tech companies do have more and more influence and control over elements of our lives, and their decisions can have big impacts.
I also have some sympathy for them, given that they’re damned whatever they do. They’re expected to follow the laws in the west, but criticized if they follow the laws in authoritarian countries.
They’ve made some good decisions and some terrible ones.
Personally, I think there needs to be some kind of independent group that sets guidelines for how to handle these kinds of issues.
We have global bodies like the WHO, and working groups like the W3C; we need something like that to set guidance on how social media is handled.
That said, I personally think that body would also need to make rules on how the surfacing algorithms work. There needs to be some requirement to provide a mix of content: if someone watches a video by a Fox News pundit, they shouldn’t only get recommendations for similar content that similar people have watched; there should also be a minimum of, say, 30% of content from widely differing sources.
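As a rough sketch of the kind of rule I mean (the 30% figure, the function names, and the “similar”/“differing” split are all my own invented example, not a real policy or API), the requirement could be as simple as reserving a minimum share of recommendation slots for content from unlike sources:

```python
# Hypothetical illustration: a feed builder that guarantees a minimum share
# of slots for content from sources unlike the ones the user already follows.

def build_recommendations(similar, differing, n_slots=10, min_diverse_share=0.3):
    """Fill n_slots, reserving at least min_diverse_share of them for differing sources."""
    n_diverse = max(1, round(n_slots * min_diverse_share))
    # Reserved slots come from differing sources; the rest stay engagement-driven.
    return differing[:n_diverse] + similar[:n_slots - n_diverse]

feed = build_recommendations(
    similar=[f"similar-{i}" for i in range(20)],    # what the algorithm would normally pick
    differing=[f"differing-{i}" for i in range(20)],  # candidates from widely differing sources
)
```

The enforcement itself is trivial; the hard (and contestable) part is who gets to define which sources count as “widely differing”, which is exactly why I think an independent body would need to own that definition.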
The media companies themselves have no financial incentive to do something like this: they want to keep users engaged, and the best way to do that is to keep giving them what they want. But always getting only what you want is rarely good for you.