In the understandably horrified response to last week's livestreamed massacre of Muslims peacefully worshiping in New Zealand, a certain consensus has emerged that Big Tech's platforms need to police hateful content more aggressively.
Since the Christchurch shooter specifically used Facebook and YouTube during his crime, those two platforms in particular are getting flogged in the mainstream press. Take this recent discussion on MSNBC for example, including remarks about the platforms starting ~2 minutes in:
I don't necessarily disagree with a word in this entire video. Far be it from me to absolve Big Tech of cluelessly looking the other way on hate speech, to the detriment of society.
That said, I think the solution to this problem should go beyond just calling for the platforms to moderate and take down hate speech more aggressively. In particular, two other characteristics that YouTube and Facebook share, listed below, also affect how they handle hate – one for obvious reasons, the other perhaps less so. Both deserve a place in the broader conversation about Big Tech:
1) Their workforces have a glaring, persistent lack of diversity. Perhaps if these companies employed more people of color, women, and members of other underrepresented groups, they'd be more attentive to – and quicker to anticipate – white supremacist threats like the New Zealand shooter.
2) They're virtual monopolies. YouTube essentially owns user-generated video, and Facebook owns social networking in general, spread across its flagship platform, Messenger, Instagram, and WhatsApp. When Mark Zuckerberg testified to Congress last year that the average mobile user has eight messaging apps installed on their phone, he neglected to mention that his company owns four of them.
This is obviously an antitrust issue, as governments are increasingly realizing. But in terms of hate, we should also recognize that tech concentration makes it easier for bad guys to propagate their poison quickly to the public in one convenient place.
To be sure, even if the audience's attention were spread more widely online, supremacist miscreants would still find somewhere to cluster. It's notable, for instance, that the New Zealand shooter also posted a manifesto on 8chan – essentially a hive of hateful trolls – prior to last week's tragedy.
But a more competitive market for content distribution, with less of a role for the YouTubes and Facebooks, would make it harder for supremacists to terrorize the rest of us merely by distributing their propaganda in just one or two major company-controlled spaces.
Look, I'm all for getting the big platforms to use their considerable power more wisely to combat hate online. But I also see that as a short-term fix. While we're at it, let's also remember to address the bigger question of whether they deserve such power in the first place.
Thanks for spending some time with Indizr today. For more regular updates about Web 3.0, subscribe to our email newsletter.