It’s too late to stop QAnon with fact checks and account bans

Twitter is a perfect megaphone for the far right: its trending topics are easy to game, journalists spend too much time on it, and, if you’re lucky, the president of the United States might just retweet you.

QAnon, the ever-evolving pro-Trump conspiracy theory, is savvy on Twitter in the same way as other successful internet-born ideologies: it games the platform for information, attention, and distribution all at once. On Tuesday, Twitter took steps to limit QAnon’s reach, including shutting down around 7,000 accounts promoting the conspiracy, designating QAnon as “coordinated harmful activity,” and blocking related terms from appearing in trends and search results.

“We will permanently suspend accounts Tweeting about these topics that we know are engaged in violations of our multi-account policy, coordinating abuse around individual victims, or are attempting to evade a previous suspension,” Twitter announced. The company added that it had seen an uptick in this activity in recent weeks.

The New York Times reported that Facebook plans to take steps to limit the reach of QAnon content on its own platform next month, citing two company employees who spoke anonymously. On Friday, TikTok blocked several QAnon-related hashtags from appearing in search results.

This new push to limit QAnon’s reach follows two high-profile campaigns driven by the conspiracy theory. First, model and celebrity Chrissy Teigen, who has more than 13 million followers on Twitter, was targeted by an intense harassment campaign. Then, more recently, QAnon accounts played a key role in spreading a false conspiracy theory alleging human trafficking through the furniture retailer Wayfair. The claims spread from Twitter’s trending bar to Instagram and TikTok accounts that promoted the theory to their followers.

“That activity has raised the profile of the very long-standing problem of coordinated brigading. That kind of mass harassment has a significant impact on people’s lives,” said Renee DiResta, research manager at the Stanford Internet Observatory and an expert in online disinformation.

But mastering Twitter is only a small part of why QAnon wields influence, and it is just one example of how platforms amplify fringe ideas and harmful activities. Stopping QAnon, experts say, would take far more work and coordination. That is, if stopping it is possible at all.

QAnon was born in 2017, after a joke by President Donald Trump at a press conference about a “calm before the storm” spawned a series of cryptic posts on 4chan attributed to “Q,” predicting Hillary Clinton’s imminent arrest. Although that never happened, “Q” kept posting, claiming to know everything about a secret plan led by Trump to stop his enemies.

“QAnon originates from a cross-platform conversation that began on social media, in a pseudonymous setting, where there are no consequences for speech,” says Brian Friedberg, a senior researcher on the Technology and Social Change project at Harvard’s Shorenstein Center. The posts have moved from site to site as a result of bans and now appear on a message board called 8kun.

The posts have attracted followers who spend their time interpreting these messages, drawing conclusions, and campaigning to make them more visible. Some QAnon adherents have run coordinated harassment campaigns against journalists, rival online communities, celebrities, and liberal politicians. Others have shown up at Trump rallies with Q-themed merchandise. The president has retweeted Q-affiliated Twitter accounts or conspiracy theories dozens of times, though it remains an open question how aware he is of what Q is beyond an online movement backing his presidency. And there have been several incidents of real-world violence linked to QAnon supporters.

The conventional wisdom about QAnon was that its ideas spread through a relatively small number of adherents who are exceptionally savvy at manipulating social media for maximum visibility. But the pandemic complicated that picture, as QAnon began to merge more deeply with health misinformation and expanded its presence on Facebook.

At this point, QAnon has become an omniconspiracy theory, says DiResta: it is no longer just a few posts on forums, but a broad movement promoting many loosely related ideas. Researchers know that belief in one conspiracy theory can lead to acceptance of others, and social media recommendation algorithms have effectively accelerated that process. For example, according to DiResta, research showed that members of anti-vaccine Facebook groups were being recommended groups promoting the Pizzagate conspiracy theory back in 2016.

“The recommendation algorithm appeared to have identified a correlation between users who shared the belief that the government was hiding a secret truth. The details of that secret truth varied,” she says.

Researchers have known for years that different platforms play different roles in coordinated campaigns. People coordinate in a chat app, message board, or Facebook group; direct their messaging (including harassment and abuse) at Twitter; and host videos on YouTube.

In this information ecosystem, Twitter functions more like a marketing campaign for QAnon: content is created to be seen and interacted with by outsiders. Meanwhile, Facebook is a powerhouse for coordination, especially in closed groups. 

Reddit was once a hub of QAnon activity, until the site cracked down in 2018 over incitement to violence and repeated violations of its terms of service. But rather than losing its power, QAnon simply migrated to other mainstream social media platforms where it was less likely to be banned.

All of this means that when a platform acts alone to ban QAnon or reduce its impact, it attacks only one part of the problem.

Friedberg said it feels as if social media platforms are “waiting for an act of mass violence” to coordinate a more aggressive takedown effort. But QAnon’s potential for harm is already evident if you stop seeing it as a pro-Trump interest group and see it instead for what it is: “a distribution mechanism for misinformation of all varieties,” Friedberg said, one that its members are willing to amplify and identify with, regardless of the consequences.

Steven Hassan, a mental health counselor and cult expert who escaped Sun Myung Moon’s Unification Church, whose members are known as “Moonies,” says that talking about groups like QAnon merely in terms of misinformation or algorithms is not enough.

“I look at QAnon as a cult,” Hassan says. “When you get recruited into a mind control cult, and get indoctrinated into a new belief system … a lot of it is motivated by fear.” 

“People can be deprogrammed,” Hassan says. “But the people who will be most successful at that are family members and friends.” People who are already close to a QAnon believer may be able to have “multiple interactions over time” in the hope of drawing the person out.

If platforms wanted to seriously address ideologies like QAnon, they would do much more than they are doing now, he says.

First, Facebook could teach users not only how to spot misinformation, but also how to recognize when they are being manipulated by coordinated campaigns. Coordinated social media pushes have been a major factor in QAnon’s growing success on mainstream platforms, as The Guardian recently documented. The movement explicitly embraces “information warfare” as a tactic for gaining influence. In May, Facebook removed a small cluster of QAnon-affiliated accounts for inauthentic behavior.

Second, Hassan recommends that platforms stop people from falling down algorithmic rabbit holes of QAnon recommendations, and instead surface content from people like him: those who have survived and escaped cults, especially those who were drawn into and then left QAnon.

Friedberg, who has studied the movement in depth, says he believes it is “absolutely” too late for mainstream social media platforms to stop QAnon. There are a few things they can do, for example, to limit its followers’ ability to evangelize on Twitter.

“They’ve had three years of almost unfettered access to grow and develop outside the gates of a few platforms,” Friedberg says. What’s more, QAnon believers have an active relationship with the source of the conspiracy theory, which constantly posts new content to decrypt and references Q supporters’ social media posts in its messages. Breaking QAnon’s influence would require breaking the trust between “Q,” an unnamed figure with no defining features, and his followers. Since Q’s long history of failed predictions hasn’t dented that trust, this is difficult, and critical media coverage or debunking has so far done little on this front. If anything, it simply convinces QAnon believers that they are onto something.

The most effective ideas for limiting QAnon would require radical change and introspection on the part of the people who run the companies where it has thrived. But even this week’s announcements are less dramatic than they may seem at first: Twitter has made clear that it won’t automatically apply its new policies to politicians promoting QAnon content, including several candidates for office in the United States.

And, Friedberg said, QAnon supporters are “willing to test those limits, and are testing those limits now.” For example, Twitter has banned the sharing of certain URLs affiliated with the conspiracy, but followers already have other URLs to use.
