YouTube didn’t violate Constitution by removing QAnon videos, federal judge rules

A federal judge has dismissed a lawsuit against YouTube for allegedly violating the Constitution by banning videos that promoted QAnon conspiracy theories. 

U.S. District Judge Beth Freeman issued her order Tuesday in the U.S. District Court for the Northern District of California, San Jose Division, ruling that the plaintiffs had failed to state a valid claim. 

Fifteen identified and unidentified "conservative content creators" first filed suit against YouTube’s parent company, Google, in October 2020. They sought an emergency injunction, saying "YouTube abruptly deleted conservative content from its platform and terminated the accounts and channels that had hosted that content," according to court documents. 

QAnon centers on the baseless belief that former President Donald Trump is waging a secret campaign against enemies in the "deep state" and a child sex trafficking ring run by satanic pedophiles and cannibals. What started as an online obsession for the far-right fringe has grown beyond its origins in a dark corner of the internet. QAnon has been creeping into the mainstream political arena for more than a year.

RELATED: House Jan. 6 panel refers Steve Bannon for criminal contempt over subpoena defiance

The group claimed YouTube’s actions violated their First Amendment rights and were politically motivated, as the removals and suspensions came weeks before the November 2020 election. It also claimed YouTube’s actions "worked to the severe detriment of both conservative content creators and American voters who seek out their content."

In her decision, Freeman noted that YouTube is a private company and that the plaintiffs "failed to plead a proper First Amendment claim due to their failure to sufficiently allege that Defendants’ conduct constituted state action."

Freeman dismissed the First Amendment violation claim with prejudice, which bars the group from refiling that cause of action in federal court, though the dismissal itself can be appealed.

RELATED: Trump sues to block release of documents to Jan. 6 committee

However, lawyers for the plaintiffs said they’re not giving up.

"Their case raises the question of whether the First Amendment protects them and you and all Americans from the government using private companies as a cat’s paw to achieve a goal—the censorship of dissenting views—that virtually everybody agrees the government could not achieve on its own," Attorney Cris Armenta said in a statement to FOX Television Stations. "We look forward to taking this case to the Ninth Circuit and beyond."

FOX Television Stations has also reached out to YouTube for comment. 

Meanwhile, many tech companies have come under scrutiny for not cracking down enough on misinformation. 

Following the permanent suspension of former President Donald Trump’s personal Twitter account in January, Twitter deactivated more than 70,000 accounts tied to the QAnon conspiracy theory. The company said the users were banned for being engaged "in sharing harmful QAnon-associated content at scale and were primarily dedicated to the propagation of this conspiracy theory across the service."

RELATED: Twitter removes more than 70K QAnon-associated accounts following Trump ban

Facebook also removed several groups, accounts and pages linked to QAnon, taking action for the first time in May 2020 against the far-right U.S. conspiracy theory circulated among supporters of Trump.

However, earlier this month, ex-Facebook employee Frances Haugen told "60 Minutes" that whenever there was a conflict between the public good and what benefited the company, the social media giant would choose its own interests. She said Facebook prematurely turned off safeguards designed to thwart misinformation and rabble-rousing after Joe Biden defeated Donald Trump last year, alleging that contributed to the deadly Jan. 6 invasion of the U.S. Capitol.

"Facebook, over and over again, has shown it chooses profit over safety," she said.

Nick Clegg, Facebook’s vice president for global affairs, shot down the claim after being grilled by various media outlets about Facebook’s use of algorithms as well as its role in spreading harmful misinformation ahead of the Jan. 6 Capitol riots.

RELATED: Some Capitol riot defendants turning away defense lawyers

"We are constantly iterating in order to improve our products," Clegg told CNN. "We cannot, with a wave of the wand, make everyone’s life perfect. What we can do is improve our products, so that our products are as safe and as enjoyable to use as possible."

Regarding the COVID-19 vaccine, YouTube announced last month a sweeping crackdown on misinformation that booted popular anti-vaccine influencers from its site and deleted false claims that have been made about a range of immunizations. The video-sharing platform said it will no longer allow users to baselessly speculate that approved vaccines, like the ones given to prevent the flu or measles, are dangerous or cause diseases.

RELATED: YouTube bans content containing misinformation about all vaccines

In March, Twitter also began labeling content that made misleading claims about COVID-19 vaccines and said it would ban accounts that repeatedly share such posts.

Nearly all Americans agree that the rampant spread of misinformation is a problem.

A poll from The Pearson Institute and The Associated Press-NORC Center for Public Affairs Research showed 95% of Americans identified misinformation as a problem when they’re trying to access important information. About half put a great deal of blame on the U.S. government, and about three-quarters point to social media users and tech companies. Yet only 2 in 10 Americans say they’re very concerned that they have personally spread misinformation.

The Associated Press and Austin Williams contributed to this report. This story was reported from Los Angeles. 
