Parler Insists It Would Not Knowingly Tolerate Criminal Activity On Its Site
STEVE INSKEEP, HOST:
Parler is still seeking a way back onto the Internet. The company promoted itself as a free-speech site. It attracted a mixture of Republicans, self-identified conservatives, extremists and racists who found Twitter and Facebook too restrictive. Numerous media reports revealed Parler users explicitly calling for violence and then posting videos as they participated in last week's attack on the Capitol. Amazon and other services then cut ties with Parler, which sued. So what is or was the company's responsibility? Parler Chief Policy Officer Amy Peikoff insists free expression was the goal.
AMY PEIKOFF: We are trying to allow for maximum freedom of expression consistent with the law.
INSKEEP: Parler also promised privacy, saying it would not analyze users' posts. Noted conservative figures were among the company's investors. And when President Trump was knocked off Twitter last week, some of his supporters called on people to migrate to Parler. But that site itself was on the way to being shut down, accused of failing to moderate extremist content. Peikoff maintains that all social media companies were flooded with calls to violence in recent months.
What responsibility, if any, does Parler take for the content on your site?
PEIKOFF: Our community guidelines were clear that we would not knowingly tolerate criminal activity on the site. We were trying to avoid using a system in which we would scan every piece of content with an automated algorithm. And so what we had was a community jury system in which any person on Parler could, of course, report a piece of content. We had a reporting mechanism. The report goes into a jury portal. And we had a, you know, bunch of volunteer jurors who were adjudicating these cases. And then the verdicts would come down, and the content would get removed as appropriate.
INSKEEP: We should be frank that a lot of people migrated to Parler because they felt they could not lie as freely as they wanted to on the other social media platforms. What do you think about that?
PEIKOFF: I wouldn't put it that way. I wouldn't put it that way. Not because they said they want to lie. Now, maybe there are some people. Of course, we've had some people come over who were bad actors and then would just tell all kinds of lies and everything else. I think everybody's got that. But people came over - some of them, not all of them - but some of them came over because they thought that they were being treated disproportionately unfairly on other sites and then, yeah, did come over.
INSKEEP: Do you take as a company any responsibility for not just calls for violence but just obvious inciting lies about a stolen election?
PEIKOFF: No. You know, I don't think that lies in and of themselves are inciting, right? So within a certain context, you could say that certain lies are. We could talk about, for example, that one speech that President Trump gave while the events at the Capitol were still going on - you know, he ended the speech with, go home in peace, but a lot of us found it not very convincing, given all of the preamble at the beginning. You could say, OK, in that context, he's telling certain lies and that that could be seen as a further incitement, given the ongoing activities on the ground that day. So I see what you're getting at. But in terms of just lies themselves - can you say that lies themselves are inciting in the real world? No. When you're dealing with misinformation, we think the best antidote is more information.
INSKEEP: In Amazon's response to your company's lawsuit, they quote a number of posts on Parler. And I'm reluctant to quote them here - I just don't care to spread the messages. But they are obvious lies about a stolen election. There are specific calls to violence, calls for a civil war starting on Inauguration Day, urging people to form militias, urging people to, quote, "shoot the police," urging people to hang specific public officials. This is just a partial list. When you read that list and know that it came across to people on your company's platform, what do you think about?
PEIKOFF: I mean, I don't want it there, obviously. But again, the question is, what mechanism do you use to detect and then remove that content? And with the model that we had, as I said, as time went on through November and into December, when we started making these changes, we realized that we needed to do more. And we were making those changes, and we were in discussions with Amazon. You know, they dropped this on us on Friday afternoon. And we were telling them what we were doing and that we were willing to do more. We were even starting, over the weekend, to program a bit of AI to figure out how we could use AI consistent with our mission. And we had started tagging some content that way.
So we're definitely amenable to this. Nobody wants this on their platform. There is plenty of this content - or at least there was - on Facebook and Twitter, as well. You know, I've heard from ex-policy people from other platforms saying that the challenges are everywhere. Even when you do use AI, it's not going to be 100% perfect. No, we don't like to see it. It expressly violates our guidelines. And then the challenge is how best to effectively remove it and to make sure that our platform is designed not to encourage the sorts of sentiments that would lead to that type of content being posted in the first place.
INSKEEP: Now, in our conversation, Amy Peikoff maintained that the Capitol attackers used other sources of disinformation, too. And that is true. NPR investigated the case of Ashli Babbitt, an attacker who was killed in the Capitol. Her social media feed showed that Babbitt gathered false claims from Fox News TV personalities and guests and repeated them on Twitter in the months before she went to her death. Even after Babbitt was killed, most Republicans in Congress voted, against all the evidence, to object to a democratic election, following the lead of the defeated president. Whether Parler survives or not, disinformation is widespread.
(SOUNDBITE OF PHELIAN'S "INTRO") Transcript provided by NPR, Copyright NPR.