Professionalism/Parler and the Capitol Insurrection of 2021

On January 6th, 2021, the United States Capitol in Washington, D.C. was stormed by right-wing protesters and supporters of Donald Trump in an attempt to overturn the results of the 2020 presidential election. The rioters aimed to disrupt the joint session of Congress that had convened to count electoral votes and officially certify Joe Biden's victory. Parler, an alt-tech social media platform and a hub for far-right content, played a key role in the planning and execution of the attack.

Growing Unrest
There was increasing unrest among conservative groups after the 2020 presidential election, in which Joe Biden was declared the winner. The race was close, and it took over a week for Biden's victory to become clear due to the large number of mail-in ballots cast because of the COVID-19 pandemic. Throughout election week, Trump and other Republicans attempted to overturn the results, claiming widespread voter fraud in the five key swing states that Biden secured.

Donald Trump's campaign continued its efforts to subvert the election long after the results were final. He held multiple rallies and filed over 60 lawsuits, two of which reached the Supreme Court. He also pressured key Republican governors to nullify their states' results on the basis of manufactured evidence of fraud. Trump's frequent tweets galvanized his followers to call for reversing the election, an outcome with no legal or constitutional basis. This growing unrest led his followers to plan the Capitol Insurrection.

Planning the Attack
As Donald Trump began to gather support and funding from various organizations, his team started putting together large rallies planned for January 6th. He gained funding from groups such as the Rule of Law Defense Fund and the Tea Party Patriots. The Women for America First non-profit organized a rally called the March to Save America, and far-right radio host Alex Jones's media company donated $500,000 to book the Ellipse park for the event.

The FBI had received warnings of violence and armed protests and was also worried about attacks at every state capitol that day. Despite this knowledge, and despite the National Guard being deployed to the site, the severity of the attack was largely unforeseen.

Alongside this, "alt-tech" platforms, sites that position themselves as alternatives to mainstream social networks, were used to help plan the insurrection. These sites largely attract far-right extremists, conspiracy theorists, Donald Trump supporters, and conservatives. Among them was Parler.

The Parler Social Network
Parler is a microblogging service similar to Twitter. Founded by John Matze and Jared Thomson, it launched in August 2018. The app received funding from conservative investor Rebekah Mercer and promotion from right-wing influencers and activists such as Ivanka Trump and Candace Owens, who touted it as a place to escape the censorship of big tech. Parler claims that it values true free speech above all else and that it silences no one on the basis of opinion. However, this stance led the app to become a cesspool of misinformation, Holocaust denialism, Nazism, white supremacy, and QAnon conspiracies. Matze said in an interview with CNBC, "if you can say it on the street of New York, you can say it on Parler".

Parler gained traction as a social media platform following a tweet from right-wing commentator Candace Owens saying she had joined the platform. After companies like Facebook and Twitter began moderating political content, Parler experienced another surge in new users. Anti-hate groups criticized Parler for creating a space with "the potential for extensive and worrying commingling of extremists and non-extremists", which can lead to radicalization.

Despite its hands-off approach to content moderation, Parler reports that it "referred violent content... to the FBI for investigation over 50 times, and... alerted law enforcement to specific threats of violence being planned at the Capitol". Furthermore, the company's community guidelines specifically cite threats of violence and advocacy of imminent lawless action as violations. Parler went on to say that the widespread backlash it received following the riots was "a coordinated and widespread disinformation campaign designed to scapegoat Parler for the riots at the U.S. Capitol on January 6, 2021".

Two days after the insurrection, Apple and Google both announced they would pull the app from their respective app stores after labeling it a public safety threat. It was soon removed from Amazon Web Services' servers as well.

Professionalism
The case of Parler and the insurrection at the Capitol provides a unique opportunity to examine the interface of professionalism and corporate responsibility. Parler kept true to its word of being "a free speech-focused... unbiased alternative," as users with far-right ideologies were allowed to organize the riots. In the aftermath, critics were quick to blame Parler for permitting users to organize an intentionally non-peaceful riot with impunity. Parler's case for free expression, despite the backlash following the riots, is not absurd at first glance. It raises the question of "How far is too far?" when it comes to upholding company values, and it reveals the failure of regulators such as the FCC to effectively govern social media within the limits of existing law. One such law, Section 230 of the Communications Decency Act, has recently come under scrutiny for this very reason: it effectively shields social media platforms from liability for what their users post.

Responsibility
Parler may also not be entirely to blame for the riots at the Capitol. The policies and practices of other social media companies may have bolstered the extremism that found its voice through Parler. One of the driving factors behind Parler's rise was the growing sentiment that larger platforms like Facebook and Twitter were censoring political viewpoints. These practices, along with the banning of former President Trump's accounts, led many far-right Americans to the platform. As these individuals found a place where they could speak freely without their messages being flagged for potential misinformation, it can be argued that Facebook and Twitter effectively funneled their far-right users into an echo chamber that likely exacerbated their discontent and frustration.

Section 230 of the Communications Decency Act
A cornerstone of governing speech online is Section 230 of the Communications Decency Act, enacted in 1996. It states: "No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider." In effect, ISPs and social media sites are not legally responsible for the content their users post.

President Biden, while on the 2020 campaign trail, criticized Section 230 and argued for its revocation. Deceitful ads had been run on behalf of the Trump campaign, stating without evidence that Biden paid the Ukrainian attorney general $1 billion not to investigate Biden's son. Biden argued that Facebook should be held responsible for false and damaging statements it allows, the same way traditional publishers are. Section 230 is controversial because it can grant social media companies immunity from fact-checking misleading or false information. However, Section 230 is also commonly referred to as "the 26 words that created the internet." The growth of sites like Twitter, Facebook, and YouTube can be attributed to it, as it allowed these platforms to host user-posted content without being liable for exactly what information is posted. The internet would probably not look the same without this legislation, which is why talks of repealing it are contentious.

Mail-In Ballots
Preceding the 2020 presidential election, there was a consensus to shift toward mail-in ballots because of the potential for voting locations to facilitate COVID-19 transmission. Donald Trump tweeted unsubstantiated claims that mail-in ballots would lead to a fraudulent election. Twitter, for the first time ever, included a hyperlink below his tweet providing information about the topic, and it continued to flag his tweets that might have been spreading misinformation about subjects like COVID-19. Facebook CEO Mark Zuckerberg responded to Twitter's strategy by saying that Facebook is not an "arbiter of truth," essentially echoing Section 230: he provides the platform but is not responsible for all of the speech on it.

The Responsibility of Social Media Companies
While Facebook can't be expected to verify that every post on its platform is factually correct, it curates much of the content that users see in their feeds. Social media sites generate revenue by keeping users on their apps, and thus serve content they think will keep users engaged. This can lead to dangerous and misleading echo chambers. By driving users into their own unchecked realities, social media sites can become breeding grounds for extremism. Facebook later implicitly acknowledged this by banning conspiracy-theory-driven accounts across all of its platforms.

Section 230 provides useful protections that keep social media companies from being complicit in everything posted to their sites, but when their own algorithms lead to violent insurrections, the line blurs between what is merely being posted on a site and what the site itself is curating for its users.