If it’s for “fun,” is this version of the PizzaGate conspiracy harmless?
It’s not. We’ve seen repeatedly that some people fall so deep into conspiracy theories that they take them seriously and commit real-world harm. And for survivors of sexual abuse, it can be painful to see the subject discussed all over social media.
Have the internet companies gotten better at stopping false conspiracies like this?
They have, but people who want to spread conspiracy theories are finding workarounds. Facebook banned the PizzaGate hashtag, for example, but the hashtag is not banned on Instagram, even though Facebook owns it. People have also migrated to private groups, where Facebook has less visibility into what’s going on.
Tech companies’ automated recommendation systems can also pull people further into false ideas. I recently tried to join QAnon conspiracy groups on Facebook, and Facebook immediately recommended that I join PizzaGate groups, too. On TikTok, what you see is largely decided by computerized recommendations. After I watched one video about PizzaGate, the next videos the app showed me were all about PizzaGate.
TikTok is a relatively new place where conspiracies can spread. What is it doing to address this?
TikTok is not proactively looking for videos with potentially false and dangerous ideas and removing them. Videos with PizzaGate-related hashtags have drawn more than 80 million views on TikTok.
The New York Times reached out to TikTok about the videos, pointing out the spike. After we sent our email, TikTok removed many of the videos and appeared to limit their spread. Facebook and Twitter often do this, too: they frequently remove content only after journalists flag it.
Article source: https://www.nytimes.com/2020/06/29/technology/pizzagate-tiktok.html