My fellow Canadians and other French speakers will get the title joke.
I discussed this issue with Robyn Bresnahan on CBC Ottawa Morning on 12-Jan-2021.
What Happened?
Parler[.]com, a social network that billed itself as a "nonpartisan Public Square," went offline on Sunday, January 10. AWS disabled the account's access to the outside world and removed Parler from its platform. As the dust settles, the following companies have stopped doing business with Parler:
- AWS
- Apple (which removed the Parler mobile app from its App Store)
- Google (which removed the Parler mobile app from its Play Store)
- Okta
- Twilio
- Parler's legal representation
Furthermore, Parler CEO John Matze has stated that it is proving very difficult to find new service providers, as companies are refusing to do business with Parler.
Why?
Since its inception in 2018, Parler has been a soft landing spot for people who were either kicked off mainstream social media networks or felt "constrained" on those platforms.
This echo chamber resulted in the storming of the Capitol in Washington, D.C. Numerous posts on Parler not only encouraged but also fomented the riot.
Is There Evidence?
Before Parler was taken offline, articles in a number of publications showed examples of posts full of hate speech, posts planning explicit actions that appear to be criminal, and other posts that go well beyond "discussion." Of course, these could have been cherry-picked examples.
They weren't.
During the final weekend of Parler being online, there was an organized effort to archive all of its content for further study and possible forwarding to law enforcement. While the legality of this activity is in question, the data shows that these example posts were not outliers. The network was full of speech that would be unacceptable almost anywhere.
What About Moderation?
A question that came up regularly around this issue was, "What about other networks? They have hate speech too!"
Yes. Yes, they do.
However, the approach to moderation is what's critical here. Section 230 of the Communications Decency Act in the United States provides a lot of protection to platforms. Even so, those platforms still have legal teams that advise their leadership on how to navigate the question of "how far is too far?"
Parler had a content moderation approach. On the surface, its community guidelines and general approach might even be a bit stricter than other networks'. The difference is in how they were applied.
Most social networks check posts for inappropriate content and also allow users to report content for review. This sandwich approach isn't perfect, but there's both a proactive and a reactive element.
Parler did not check posts for inappropriate content. It allowed users to report content, but reports were then reviewed by a jury pool of other users on the platform. Given that the main criterion for removing content was whether it appeared to be "illegal," a jury of laypeople with no guidance is completely unqualified to make that evaluation.
For other social networks, the network's legal team helps define the rubric content is evaluated against. That rubric includes not only legal liability issues but also areas that would impact the network's ability to conduct business: things like reputation, advertiser satisfaction, and so on.
Again, not perfect, but a more balanced approach.
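The difference between the two pipelines can be sketched in a few lines of Python. This is purely illustrative: the function names, the banned-terms "rubric," and the majority-vote threshold are invented for the example, not drawn from any real platform's moderation system.

```python
# Illustrative sketch only: names, rubric, and thresholds are invented,
# not drawn from any real platform's moderation system.

BANNED_TERMS = {"hate", "violence"}  # stand-in for a legal team's rubric


def mainstream_moderation(post: str, user_reports: int) -> bool:
    """Proactive scan plus reactive review against a defined rubric.
    Returns True if the post should be removed."""
    # Proactive: every post is checked on submission.
    if any(term in post.lower() for term in BANNED_TERMS):
        return True
    # Reactive: reported posts are re-reviewed against the same rubric.
    if user_reports > 0:
        return any(term in post.lower() for term in BANNED_TERMS)
    return False


def parler_style_moderation(user_reports: int, jury_votes_remove: int,
                            jury_size: int) -> bool:
    """No proactive scan; reported posts go to a jury of fellow users,
    with no rubric beyond 'does this look illegal?'"""
    if user_reports == 0:
        return False  # nothing is ever checked unprompted
    # A jury drawn from the same community decides by simple majority.
    return jury_votes_remove > jury_size / 2


# In an echo chamber, the jury rarely votes to remove anything:
print(mainstream_moderation("a post full of hate speech", 0))  # True
print(parler_style_moderation(user_reports=5, jury_votes_remove=1,
                              jury_size=7))  # False
```

The structural point the sketch makes: in the second pipeline there is no check at all unless the community objects, and the community itself is the judge, so a like-minded user base can nullify the guidelines entirely.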
Parler's jury system didn't just create an echo chamber. It created an unstoppable one.
More Reading
There has been a ton of coverage on Parler lately. Here are some articles that will help you get a more rounded perspective on what happened:
- How Parler, a Chosen App of Trump Fans, Became a Test of Free Speech - The New York Times
- Tech giants put the squeeze on Parler as Trump seeks new online megaphone | CBC News
- Parler CEO says even his lawyers are abandoning him - The Verge
- Parler Forced Offline After Amazon Pulls Hosting Services
- Every Deleted Parler Post, Many With Users' Location Data, Has Been Archived
- An Absurdly Basic Bug Let Anyone Grab All of Parlerâs Data | WIRED
Transcript Of My Discussion With Robyn Bresnahan
[00:00:00] Robyn: The social media hub popular with the far right has gone dark. It's called Parler. Google Play and Apple deleted the app from their stores for the role it played in planning last week's attack on the Capitol. Amazon has also yanked the site's web hosting services. Parler became the number one free downloaded app on Apple late last week after other social media sites de-platformed the U.S. president, Donald Trump. Our technology columnist, Mark Nunnikhoven, is here with more. Good morning, Mark.
[00:00:29] Mark: Good morning, Robyn.
[00:00:30] Robyn: For those who have never opened up Parler, just tell us what it is and how it works and who uses it.
[00:00:38] Mark: Sure. In a nutshell, it is essentially a Twitter clone. It's the same idea, where people can share short posts and then like other people's posts and interact and have discussions. It was formed about two years ago, and it's quickly grown with a very specific far-right audience, mainly after people were kicked off other social media platforms. And that's because Parler has advocated or positioned itself as a non-partisan public square, even though that doesn't hold up when you look beneath the covers.
[00:01:10] Robyn: Is it fair to say that Parler marketed itself to that far-right demographic from the start?
[00:01:15] Mark: Absolutely. It started with a few prominent people with extreme far-right viewpoints being de-platformed. This was well before this recent issue with the president. And they pushed to that platform because it was saying, "Hey, come to us. We're not going to censor you. You're allowed to freely express yourself here." And the founders and the financing behind this platform are very much associated with that viewpoint as well.
[00:01:40] Robyn: Freely express yourself to what end? Did Parler have any community guidelines?
[00:01:40] Mark: They did. And the interesting thing, and this is where, if you can step back from the reprehensible content on the platform, it's an interesting approach to trying to build an online community. As we know from being on Facebook and Twitter and the other social networks, content moderation kind of goes two ways. Users can report and say, "Hey, I think this is bad." But also, the companies themselves are sweeping according to these community guidelines, so things like no hate speech, no threats of violence, no harassment, that kind of thing.
[00:02:14] When it comes to Parler, everything goes through the community. The community has to report, but then a set of approved users are the jury for those reports. So even though they do have community guidelines that essentially say anything that's illegal is not going to be tolerated, they're not really enforced, because the same people who are making those posts are the same people who are reviewing them to say whether or not they're appropriate.
[00:02:40] Robyn: Mark, whatâs known about the role that Parler played in last weekâs riots?
[00:02:45] Mark: And this is where it gets challenging, because it's very difficult to point to one particular thing. However, the evidence from the posts on this network shows that a really high percentage of people who were at the rally and at the riot were discussing openly on Parler the actions leading up to it. So gathering people, planning their travel to the event, actions during the day in Washington. All of these were out in the open, whereas on other platforms, as we've seen with Facebook and with Twitter, they still have some of this content, but they've been actively trying to take it down. On Parler it was freely spread, if not encouraged.
[00:03:27] Robyn: What questions does it raise for you about the power of these big companies like Apple, Amazon, and Google? The power that they have to yank down a site like this?
[00:03:37] Mark: Yeah, and this is where it gets really challenging, because if you look at the discussion that's happening on the other social networks about this, the claims of free speech and censorship keep popping up. And the challenge is that free speech protections within the United States are against the government censoring things, and this is private companies saying that this network has violated their terms of use or their acceptable use. Where the challenge really lies is in the lack of alternatives.
[00:04:01] So every company within the US has the right to say who they're going to do business with, and they all have these policies, whether they're clear or not, as to what they'll tolerate. The challenge is when Apple and Google take you off of their app stores, there is no alternative, so you can only reach smartphone users through the web. And when a company like AWS takes away your hosting services, there aren't that many alternatives at that scale that could host a platform with 8 million users. So the challenge isn't so much one of free speech as the reduction in options. In the technology area, we get into antitrust discussions around monopolies and duopolies. It really gets complicated quickly.
[00:04:39] Robyn: There was one tech expert on The National last night saying this poses a problem because the extreme right will go elsewhere to voice their views, somewhere that might be harder to keep tabs on. Where do they go from here, do you think?
[00:04:52] Mark: And that's a very valid point. This isn't going to stop this community from sharing their views and from gathering with each other, but I read a really good analysis from some psychology and social researchers, and they said every time this kind of de-platforming happens, what ends up happening is you shake off the people who weren't that committed. So you reduce this level, and you reduce the number of people who were just being brought into these extremist views, so there's an overall benefit there.
[00:05:18] But the nature of the internet is that you're never going to stop these people completely. You're never going to be able to completely remove people. It's just a matter of removing their amplification, and I think there's definitely value in that.
[00:05:28] Robyn: Mark, thank you for your time this morning.
[00:05:31] Mark: Appreciate it. Have a good day.
[00:05:32] Robyn: You, too. That's Mark Nunnikhoven. He's our technology columnist here on Ottawa Morning.