Parler Pas: Fringe Social Network Offline
9 minute read | Last updated 13-Jan-2021 | Culture Change, Social Media, Society, CBC
My fellow Canadians and other French speakers will get the title joke 😉
I discussed this issue with Robyn Bresnahan on CBC Ottawa Morning on 12-Jan-2021. Have a listen…
Parler[.]com, a social network that billed itself as a “nonpartisan Public Square,” went offline on Sunday, the tenth of January. AWS disabled the site’s accounts and removed it from their platform. As the dust settles, the following companies have stopped doing business with Parler:
- Apple (which removed the Parler mobile app from its App Store)
- Google (which removed the Parler mobile app from its Play Store)
- Parler’s legal representation
Furthermore, Parler CEO John Matze has stated that they are struggling to find new service providers, as companies are refusing to do business with Parler.
Since its inception in 2018, Parler has been a soft landing spot for people who were either kicked off mainstream social media networks or felt “constrained” on those platforms.
This echo chamber culminated in the storming of the Capitol in Washington, D.C. Numerous posts on Parler not only encouraged the riot but actively fomented it.
Is There Evidence?
Before Parler was taken offline, several articles in a number of publications showed examples of posts full of hate speech, posts planning explicitly criminal actions, and other posts that go well beyond “discussion.” Of course, these could be cherry-picked examples.
During the final weekend of Parler being online, there was an organized effort to archive all of its content for further study and possible forwarding to law enforcement. While the legality of this activity is in question, the data shows that these example posts were not outliers. The network was full of speech that would be unacceptable almost anywhere.
What About Moderation?
A question that came up regularly around this issue was, “What about other networks? They have hate speech too!”
Yes. Yes, they do.
However, the approach to moderation is what’s critical here. Section 230 of the Communications Decency Act in the United States provides a lot of protection to platforms. Even so, those platforms still have legal teams that advise their leadership on how to navigate the question of “how far is too far?”
Parler had a content moderation approach. On the surface, their community guidelines and general approach might be a bit more strict than other networks. The difference is in how it was applied.
Most social networks check posts for inappropriate content and also allow users to report content for review. This sandwich approach isn’t perfect, but it has both a proactive and a reactive element.
Parler did not check posts for inappropriate content. It allowed users to report content, but reports were then reviewed by a jury pool of other users on the platform. Given that the main criterion for removing content was whether it appeared to be “illegal,” a jury of laypeople with no guidance is completely unqualified to make that evaluation.
For other social networks, the network’s legal team helps define the rubric content is evaluated against. That rubric covers not only legal liability but also areas that would impact the network’s ability to conduct business: reputation, advertiser satisfaction, and so on.
Again, not perfect, but a more balanced approach.
Parler’s jury system didn’t just create an echo chamber. It created an unstoppable one.
There has been a ton of coverage on Parler lately. Here are some articles that will help you get a more rounded perspective on what happened:
- How Parler, a Chosen App of Trump Fans, Became a Test of Free Speech - The New York Times
- Tech giants put the squeeze on Parler as Trump seeks new online megaphone | CBC News
- Parler CEO says even his lawyers are abandoning him - The Verge
- Parler Forced Offline After Amazon Pulls Hosting Services
- Every Deleted Parler Post, Many With Users’ Location Data, Has Been Archived
- An Absurdly Basic Bug Let Anyone Grab All of Parler’s Data | WIRED
Transcript Of My Discussion With Robyn Bresnahan
[00:00:00] Robyn: The social media hub popular with the far right has gone dark. It’s called Parler. Google Play and Apple deleted the app from their stores for the role it played in planning last week’s attack on the Capitol. Amazon has also yanked the site’s web hosting services. Parler became the number one free downloaded app on Apple late last week after other social media sites de-platformed the U.S. president, Donald Trump. Our technology columnist, Mark Nunnikhoven, is here with more. Good morning, Mark.
[00:00:29] Mark: Good morning, Robyn.
[00:00:30] Robyn: For those who have never opened up Parler, just tell us what it is and how it works and who uses it.
[00:00:38] Mark: Sure. In a nutshell, it is essentially a Twitter clone. It’s the same idea, where people can share short posts and then like other people’s posts and interact and have discussions. It was formed about two years ago and it quickly grew with a very specific far-right audience, mainly after people were kicked off other social media platforms. And that’s because Parler has positioned itself as a non-partisan public square, even though that doesn’t hold up when you look beneath the covers.
[00:01:10] Robyn: Is it fair to say that Parler marketed itself to that far-right demographic from the start?
[00:01:15] Mark: Absolutely. It started with a few people with prominent extreme far-right viewpoints being de-platformed. This was well before this recent issue with the president. And they were pushed to that platform because it was saying, “Hey, come to us. We’re not going to censor you. You’re allowed to freely express yourself here.” And it also reflects that the founders and the financing behind this platform are very much associated with that viewpoint as well.
[00:01:40] Robyn: Freely express yourself to what end? Did Parler have any community guidelines?
[00:01:40] Mark: They did. And the interesting thing, and this is where, if you can step back from the reprehensible content on the platform, it’s an interesting approach to trying to build an online community. As we know from being on Facebook and Twitter and the other social networks, content moderation kind of goes two ways. Users can report and say, “Hey, I think this is bad.” But also, the companies themselves are sweeping according to these community guidelines, so things like no hate speech, no threats of violence, no harassment, that kind of thing.
[00:02:14] When it comes to Parler, everything goes through the community. The community has to report, and then a set of approved users acts as the jury for those reports. So even though they do have community guidelines that essentially say anything that’s illegal is not going to be tolerated, they’re not really enforced, because the same people who are making those posts are the same people who are reviewing them to say whether or not they’re appropriate.
[00:02:40] Robyn: Mark, what’s known about the role that Parler played in last week’s riots?
[00:02:45] Mark: And this is where it gets challenging, because it’s very difficult to point to one particular thing. However, the evidence from the posts on this network shows that a really high percentage of people who were at the rally and at the riot were discussing their actions openly on Parler in the lead-up: gathering people, planning their travel to the event, and coordinating actions during the day in Washington. All of this was out in the open. On other platforms, as we’ve seen with Facebook and Twitter, they still have some of this content, but they’ve been actively trying to take it down, whereas on Parler it was freely spread, if not encouraged.
[00:03:27] Robyn: What questions does it raise for you about the power of these big companies like Apple, Amazon, and Google? The power that they have to yank down a site like this?
[00:04:01] Mark: So every company within the US has the right to say who they’re going to do business with, and they all have these policies, whether they’re clear or not, as to what they’ll tolerate. The challenge is that when Apple and Google take you off of their app stores, there is no alternative, so you can only reach smartphone users through the web now. And when a company like AWS takes your hosting services away, there aren’t that many alternatives at that scale that could host a platform with 8 million users. So the challenge isn’t so much one of free speech as the reduction of options in the technology arena; we get into anti-trust discussions around monopolies and duopolies. It really gets complicated quickly.
[00:04:39] Robyn: There was one tech expert on The National last night saying this poses a problem, because the extreme right will go elsewhere to voice their views, somewhere that might be harder to keep tabs on. Where do they go from here, do you think?
[00:04:52] Mark: And that’s a very valid point. This isn’t going to stop this community from sharing their views and from gathering with each other. But I read a really good analysis from some psychology and social researchers, and they said every time this kind of de-platforming happens, you shake off the people who weren’t that committed. So you reduce the overall level, and you reduce the number of people who were just being drawn into these extremist views, so there’s an overall benefit there.
[00:05:18] But the nature of the internet is that you’re never going to stop these people completely. It’s just a matter of removing their amplification, and I think there’s definitely value in that.
[00:05:28] Robyn: Mark, thank you for your time this morning.
[00:05:31] Mark: Appreciate it. Have a good day.
[00:05:32] Robyn: You, too. That’s Mark Nunnikhoven. He’s our technology columnist here on Ottawa Morning.