Another example is that the foundation put together a task force ahead of the U.S. elections, again, trying to be very proactive. [The task force supported 56,000 volunteer editors watching and monitoring key election pages.] And the fact that there were only 33 reversions on the main U.S. election page shows what it looks like to stay very focused on key topics where misinformation poses real risks.
Then another example that I just think is really cool is there’s a podcast called “The World According to Wikipedia.” And on one of the episodes, there’s a volunteer who is interviewed, and she really has made it her job to be one of the main watchers of the climate change pages.
We have tech that alerts these editors when changes are made to any of the pages so they can go see what the changes are. If there’s a risk that misinformation may be creeping in, there’s an opportunity to temporarily lock a page. Nobody wants to do that unless it’s absolutely necessary. The climate change example is useful because the talk pages behind those articles host massive debate. Our editor is saying: “Let’s have the debate. But this is a page I’m watching and monitoring carefully.”
One big debate that is currently happening on these social media platforms is this issue of the censorship of information. There are people who claim that biased views take precedence on these platforms and that more conservative views are taken down. As you think about how to handle these debates once you’re at the head of Wikipedia, how do you make judgment calls with this happening in the background?
For me, what’s been inspiring about this organization and these communities is that there are core pillars that were established on Day 1 in setting up Wikipedia. One of them is this idea of presenting information with a neutral point of view, and that neutrality requires understanding all sides and all perspectives.
It’s what I was saying earlier: Have the debates on talk pages on the side, but then come to an informed, documented, verifiable, citable conclusion in the articles. I think this is a core principle that, again, could potentially offer something for others to learn from.
Having come from a progressive organization fighting for women’s rights, have you thought much about misinformers weaponizing your background to claim it may influence the calls you make about what is allowed on Wikipedia?
Article source: https://www.nytimes.com/2021/09/23/technology/wikipedia-misinformation.html