Factbox: What is QAnon and how are social media platforms handling it?

(Reuters) – Social media companies Facebook Inc and Twitter Inc have announced crackdowns on content related to the baseless and growing QAnon conspiracy theory.

WHAT IS QANON?

QAnon followers espouse an intertwined set of beliefs based on anonymous web postings from "Q," who claims to have insider knowledge of the Trump administration.

A core tenet of the conspiracy theory is that U.S. President Donald Trump is secretly fighting a cabal of child-sex predators that includes prominent Democrats, Hollywood elites and "deep state" allies.

QAnon, which borrows elements from the false "pizzagate" theory about a pedophile ring run out of a Washington restaurant, has become a "big tent" conspiracy theory encompassing misinformation on topics ranging from alien landings to vaccine safety.

QAnon followers say a so-called Great Awakening is coming to bring salvation.

HOW IS IT BROADCAST ONLINE?

The "Q" posts, which began in 2017 on the message board 4chan, are now published on 8kun, a rebranded version of the 8chan board. QAnon has spread on Twitter, Facebook, Instagram and YouTube, Google Inc.'s video streaming service.

Media investigations have shown that social media recommendation algorithms can drive people who show an interest in conspiracy theories toward more such material.

A report by the Institute for Strategic Dialogue (ISD) found that the number of users engaging in discussion of QAnon on Twitter and Facebook has surged this year, with membership of QAnon groups on Facebook growing 120% in March.

Researchers say Russian government-backed organizations are playing a growing role in amplifying the conspiracy theories.

QAnon supporters helped organize real-world protests against child trafficking in August and were visible at a pro-police demonstration in Portland, Oregon.

QAnon appears poised to gain a foothold in the U.S. House of Representatives, with at least one Republican candidate who espouses its beliefs on track to win in the November election.

WHAT ARE SOCIAL PLATFORMS DOING ABOUT IT?

Twitter said in July it would stop recommending QAnon content and accounts as part of a crackdown it said would affect some 150,000 accounts. It also said it would block QAnon URLs and permanently suspend QAnon accounts that coordinate abuse or violate its rules.

In August, Facebook removed nearly 800 QAnon groups for posting content that celebrated violence, showed intent to use weapons, or attracted followers with patterns of violent behavior. It also imposed restrictions on the remaining 1,950 public and private QAnon groups it found. Facebook said it aims to ban ads that promote or reference QAnon, and it does not allow QAnon pages to run shops on the platform.

A spokesperson for short-form video app TikTok said QAnon content "frequently contains misinformation and hate speech" and that the platform has blocked dozens of QAnon hashtags.

A Reddit spokeswoman told Reuters the site has removed QAnon communities that violated its rules since 2018, when it took down forums such as r/Greatawakening.

A YouTube spokeswoman said the platform has removed tens of thousands of Q-related videos and terminated many Q-related channels for violating its rules since updating its hate speech policy in June 2019.

YouTube also said it limits recommendations of certain QAnon videos that "could misinform users in harmful ways." It does not expressly prohibit monetizing QAnon content. ISD researchers found that roughly 20% of all QAnon-related Facebook posts contained YouTube links.

Searches on major e-commerce sites Amazon.com Inc and Etsy Inc surface QAnon-branded merchandise from third-party sellers, ranging from books to T-shirts and face masks.

(Compiled by Elizabeth Culliford, Joseph Menn and Ted Hesson; editing by Greg Mitchell and Grant McCool)