A new survey by the Pew Research Center found that 15 percent of prominent accounts on those seven platforms had previously been banished from others like Twitter and Facebook.

The F.B.I. raid on Mar-a-Lago thrust his latest pronouncements into the eye of the political storm once again.

A study of Truth Social by Media Matters for America, a left-leaning media monitoring group, examined how the platform had become a home for some of the most fringe conspiracy theories. Mr. Trump, who began posting on the platform in April, has increasingly amplified content from QAnon, the online conspiracy theory.

He has shared posts from QAnon accounts more than 130 times. QAnon believers promote a vast and complex conspiracy that centers on Mr. Trump as a leader battling a cabal of Democratic Party pedophiles. Echoes of such views reverberated through Republican election campaigns across the country during this year’s primaries.

Ms. Jankowicz, the disinformation expert, said the nation’s social and political divisions had churned the waves of disinformation.

The controversies over how best to respond to the Covid-19 pandemic deepened distrust of government and medical experts, especially among conservatives. Mr. Trump’s refusal to accept the outcome of the 2020 election led to, but did not end with, the Capitol Hill violence.

“They should have brought us together,” Ms. Jankowicz said, referring to the pandemic and the riots. “I thought perhaps they could be kind of this convening power, but they were not.”


How a Spreader of Voter Fraud Conspiracy Theories Became a Star

In 2011, Catherine Engelbrecht appeared at a Tea Party Patriots convention in Phoenix to deliver a dire warning.

While volunteering at her local polls in the Houston area two years earlier, she claimed, she witnessed voter fraud so rampant that it made her heart stop. People cast ballots without proof of registration or eligibility, she said. Corrupt election judges marked votes for their preferred candidates on the ballots of unwitting citizens, she added.

Local authorities found no evidence of the election tampering she described, but Ms. Engelbrecht was undeterred. “Once you see something like that, you can’t forget it,” the suburban Texas mom turned election-fraud warrior told the audience of 2,000. “You certainly can’t abide by it.”

Since then, Ms. Engelbrecht has been planting seeds of doubt about the electoral process, becoming one of the earliest and most enthusiastic spreaders of ballot conspiracy theories.

Now, amid a wave of election denialism fueled by Mr. Trump, she has seized the moment. She has become a sought-after speaker at Republican organizations, regularly appears on right-wing media and was the star of the recent film “2,000 Mules,” which claimed mass voter fraud in the 2020 election and has been debunked.

She has also been active in the far-right’s battle for November’s midterm elections, rallying election officials, law enforcement and lawmakers to tighten voter restrictions and investigate the 2020 results.

Ms. Engelbrecht said in an interview last month with GraceTimeTV, a conservative show posted on the video-sharing site Rumble: “There have been no substantive improvements to change anything that happened in 2020 to prevent it from happening in 2022.”

Her supporters have set up stakeouts to prevent illegal stuffing of ballot boxes. Officials overseeing elections are ramping up security at polling places.

Voting rights groups said they were increasingly concerned by Ms. Engelbrecht.

She has “taken the power of rhetoric to a new place,” said Sean Morales-Doyle, the acting director of voting rights at the Brennan Center, a nonpartisan think tank. “It’s having a real impact on the way lawmakers and states are governing elections and on the concerns we have on what may happen in the upcoming elections.”

Some of Ms. Engelbrecht’s former allies have cut ties with her. Rick Wilson, a Republican operative and Trump critic, ran public relations for Ms. Engelbrecht in 2014 but quit after a few months. He said she had declined to turn over data to back her voting fraud claims.

“She never had the juice in terms of evidence,” Mr. Wilson said. “But now that doesn’t matter. She’s having her uplift moment.”

They did not elaborate on why, according to a video of the donor meeting obtained by The New York Times.

Ms. Engelbrecht has also teamed up with Mr. Mack to announce a partnership to scrutinize voting during the midterms.

“The most important right the American people have is to choose our own public officials,” said Mr. Mack, a former sheriff of Graham County, Ariz. “Anybody trying to steal that right needs to be prosecuted and arrested.”

Steve Bannon, then chief executive of the right-wing media outlet Breitbart News, and Andrew Breitbart, the publication’s founder, spoke at her conferences.

True the Vote’s volunteers scrutinized registration rolls, watched polling stations and wrote highly speculative reports. In 2010, a volunteer in San Diego reported seeing a bus offloading people at a polling station “who did not appear to be from this country.”

Civil rights groups described the activities as voter suppression. In 2010, Ms. Engelbrecht told supporters that Houston Votes, a nonprofit that registered voters in diverse communities of Harris County, Texas, was connected to the “New Black Panthers.” She showed a video of an unrelated New Black Panther member in Philadelphia who called for the extermination of white people. Houston Votes was subsequently investigated by state officials, and law enforcement raided its office.

“It was a lie and racist to the core,” said Fred Lewis, head of Houston Votes, who sued True the Vote for defamation. He said he had dropped the suit after reaching “an understanding” that True the Vote would stop making accusations. Ms. Engelbrecht said she didn’t recall such an agreement.

The man, who made the accusations in April 2021, did not respond to requests for comment. Ms. Engelbrecht has denied his claims.

The idea for “2,000 Mules” was hatched in mid-2021, after Ms. Engelbrecht and Mr. Phillips met with Dinesh D’Souza, the conservative provocateur and filmmaker. They told him that they could detect cases of ballot box stuffing based on two terabytes of cellphone geolocation data that they had bought and matched with video surveillance footage of ballot drop boxes.

Salem Media Group, the conservative media conglomerate, and Mr. D’Souza agreed to create and fund a film. The “2,000 Mules” title was meant to evoke the image of cartels that pay people to carry illegal drugs into the United States.

Some Republicans said after seeing the film that it raised “significant questions” about the 2020 election results; 17 state legislators in Michigan also called for an investigation into election results there based on the film’s accusations.

In Arizona, the attorney general’s office asked True the Vote between April and June for data about some of the claims in “2,000 Mules.” The contentions related to Maricopa and Yuma Counties, where Ms. Engelbrecht said people had illegally submitted ballots and had used “stash houses” to store fraudulent ballots.

According to emails obtained through a Freedom of Information Act request, a True the Vote official said Mr. Phillips had turned over a hard drive with the data. The attorney general’s office said early this month that it hadn’t received it.

Last month, Ms. Engelbrecht and Mr. Phillips hosted an invitation-only gathering of about 150 supporters in Queen Creek, Ariz., which was streamed online. For weeks beforehand, they promised to reveal the addresses of ballot “stash houses” and footage of voter fraud.

Ms. Engelbrecht did not divulge the data at the event. Instead, she implored the audience to look to the midterm elections, which she warned were the next great threat to voter integrity.

“The past is prologue,” she said.

Alexandra Berzon contributed reporting.


Robert Malone Spreads Falsehoods About Vaccines. He Also Says He Invented Some.

“And almost without exception, these influencers feel that they have been wronged by mainstream society in some way,” Mr. Brooking added.

Dr. Malone earned a medical degree from Northwestern University in 1991, and for the next decade taught pathology at the University of California, Davis, and the University of Maryland. He then turned to biotech start-ups and consulting. His résumé says he was “instrumental” in securing early-stage approval for research on the Ebola vaccine by the pharmaceutical company Merck in the mid-2010s. He also worked on repurposing drugs to treat Zika.

In extended interviews at his home over two days, Dr. Malone said he was repeatedly not recognized for his contributions over the course of his career, his voice low and grave as he recounted perceived slights by the institutions he had worked for. His wife, Dr. Jill Glasspool Malone, paced the room and pulled up articles on her laptop that she said supported his complaints.

The example he points to most frequently is from his time at the Salk Institute for Biological Studies in San Diego. While there, he performed experiments that showed how human cells could absorb an mRNA cocktail and produce proteins from it. Those experiments, he says, make him the inventor of mRNA vaccine technology.

“I was there,” Dr. Malone said. “I wrote all the invention.”

What the mainstream media did instead, he said, was give credit for the mRNA vaccines to the scientists Katalin Kariko and Drew Weissman, because there “is a concerted campaign to get them the Nobel Prize” by Pfizer and BioNTech, where Dr. Kariko is a senior vice president, as well as the University of Pennsylvania, where Dr. Weissman leads a laboratory researching vaccines and infectious diseases.

But at the time he was conducting those experiments, it was not known how to protect the fragile RNA from the immune system’s attack, scientists say. Former colleagues said they had watched in astonishment as Dr. Malone began posting on social media about why he deserved to win the Nobel Prize.

The idea that he is the inventor of mRNA vaccines is “a totally false claim,” said Dr. Gyula Acsadi, a pediatrician in Connecticut who along with Dr. Malone and five others wrote a widely cited paper in 1990 showing that injecting RNA into muscle could produce proteins. (The Pfizer and Moderna vaccines work by injecting RNA into arm muscles that produce copies of the “spike protein” found on the outside of the coronavirus. The human immune system identifies that protein, attacks it and then remembers how to defeat it.)


Trump’s Truth Social Is Poised to Join a Crowded Field

For months, former President Donald J. Trump has promoted Truth Social, the soon-to-be-released flagship app of his fledgling social media company, as a platform where free speech can thrive without the constraints imposed by Big Tech.

At least seven other social media companies have promised to do the same.

Gettr, a right-wing alternative to Twitter founded last year by a former adviser to Mr. Trump, bills itself as a haven from censorship. That’s similar to Parler — essentially another Twitter clone backed by Rebekah Mercer, a big donor to the Republican Party. MeWe and CloutHub are similar to Facebook, but with the pitch that they promote speech without restraint.

Truth Social was supposed to go live on Presidents’ Day, but the start date was recently pushed to March, though a limited test version was unveiled recently. A full rollout could be hampered by a regulatory investigation into a proposed merger of its parent company, the Trump Media & Technology Group, with a publicly traded blank-check company.

If and when it does open its doors, Mr. Trump’s app will be the newest — and most conspicuous — entrant in the tightly packed universe of social media companies that have cropped up in recent years, promising to build a parallel internet after Twitter, Facebook, Google and other mainstream platforms began to crack down on hate speech.

None of the alternative platforms comes close to the 211 million daily active users on Twitter who see ads.

Many people who claim to crave a social network that caters to their political cause often aren’t ready to abandon Twitter or Facebook, said Weiai Xu, an assistant professor of communications at the University of Massachusetts-Amherst. So the big platforms remain important vehicles for “partisan users” to get their messages out, Mr. Xu said.

Gettr, Parler and Rumble have relied on Twitter to announce the signing of a new right-wing personality or influencer. Parler, for instance, used Twitter to post a link to an announcement that Melania Trump, the former first lady, was making its platform her “social media home.”

Alternative social media companies mainly thrive off politics, said Mark Weinstein, the founder of MeWe, a platform with 20 million registered users that has positioned itself as an alternative to Facebook.

Mr. Weinstein said MeWe made money through certain subscription services. His start-up has raised $24 million from 100 investors.

But since political causes drive the most engagement for alternative social media, most other platforms are quick to embrace such opportunities. This month, CloutHub, which has just four million registered users, said its platform could be used to raise money for the protesting truckers of Ottawa.

Mr. Trump wasn’t far behind. “Facebook and Big Tech are seeking to destroy the Freedom Convoy of Truckers,” he said in a statement. (Meta, the parent company of Facebook, said it removed several groups associated with the convoy for violating their rules.)

Trump Media, Mr. Trump added, would let the truckers “communicate freely on Truth Social when we launch — coming very soon!”

Of all the alt-tech sites, Mr. Trump’s venture may have the best chance of success if it launches, not just because of the former president’s star power but also because of its financial heft. In September, Trump Media agreed to merge with Digital World Acquisition, a blank-check or special purpose acquisition company that raised $300 million. The two entities have raised $1 billion from 36 investors in a private placement.

But none of that money can be tapped until regulators wrap up their inquiry into whether Digital World flouted securities regulations in planning its merger with Trump Media. In the meantime, Trump Media, currently valued at more than $10 billion based on Digital World’s stock price, is trying to hire people to build its platform.

Rumble’s investors include Peter Thiel, the venture capitalist and Trump supporter, and the venture fund of Mr. Thiel’s protégé J.D. Vance, who is running for a Senate seat from Ohio.

Rumble is also planning to go public through a merger with a special-purpose acquisition company. SPACs are shell companies created solely for the purpose of merging with an operating entity. The deal, arranged by the Wall Street firm Cantor Fitzgerald, will give Rumble $400 million in cash and a $2.1 billion valuation.

The site said in January that it had 39 million monthly active users, up from two million two years ago. It has struck various content deals, including one to provide video and streaming services to Truth Social. Representatives for Rumble did not respond to requests for comment.

Parler’s growth stalled after Apple and Google removed it from their app stores and Amazon cut off web services after the riot, according to Sensor Tower, a digital analytics company.

Parler also removed John Matze, one of its founders, from his position as chief executive. Mr. Matze has said he was dismissed after a dispute with Ms. Mercer, the daughter of a wealthy hedge fund executive and Parler’s main backer, over how to deal with extreme content posted on the platform.

Christina Cravens, a spokeswoman for Parler, said the company had always “prohibited violent and inciting content” and had invested in “content moderation best practices.”

Moderating content will also be a challenge for Truth Social, whose main star, Mr. Trump, has not been able to post messages since early 2021, when Twitter and Facebook kicked him off their platforms for inciting violence tied to the outcome of the 2020 presidential election.

With Mr. Trump as its main poster, it was unclear if Truth Social would grow past subscribers who sign up simply to read the former president’s missives, Mr. Matze said.

“Trump is building a community that will fight for something or whatever he stands for that day,” he said. “This is not social media for friends and family to share pictures.”


In India, Facebook Struggles to Combat Misinformation and Hate Speech

On Feb. 4, 2019, a Facebook researcher created a new user account to see what it was like to experience the social media site as a person living in Kerala, India.

For the next three weeks, the account operated by a simple rule: Follow all the recommendations generated by Facebook’s algorithms to join groups, watch videos and explore new pages on the site.

The result was an inundation of hate speech, misinformation and celebrations of violence, which were documented in an internal Facebook report published later that month.

Other internal documents show how bots and fake accounts tied to the country’s ruling party and opposition figures were wreaking havoc on national elections. They also detail how a plan championed by Mark Zuckerberg, Facebook’s chief executive, to focus on “meaningful social interactions,” or exchanges between friends and family, was leading to more misinformation in India, particularly during the pandemic.

In Myanmar, the platform carried inflammatory content ahead of a violent coup in the country. Facebook said that after the coup, it implemented a special policy to remove praise and support of violence in the country, and later banned the Myanmar military from Facebook and Instagram.

In Sri Lanka, people were able to automatically add hundreds of thousands of users to Facebook groups, exposing them to violence-inducing and hateful content. In Ethiopia, a nationalist youth militia group successfully coordinated calls for violence on Facebook and posted other inflammatory content.

Facebook has invested significantly in technology to find hate speech in various languages, including Hindi and Bengali, two of the most widely used languages in India, said Andy Stone, a company spokesman. He added that Facebook had reduced the amount of hate speech that people see globally by half this year.

A suicide bombing in the disputed border region of Kashmir set off a round of violence and a spike in accusations, misinformation and conspiracy theories between Indian and Pakistani nationals.

After the attack, anti-Pakistan content began to circulate in the Facebook-recommended groups that the researcher had joined. Many of the groups, she noted, had tens of thousands of users. A different report by Facebook, published in December 2019, found Indian Facebook users tended to join large groups, with the country’s median group size at 140,000 members.

Graphic posts, including a meme showing the beheading of a Pakistani national and dead bodies wrapped in white sheets on the ground, circulated in the groups she joined.

After the researcher shared her case study with co-workers, her colleagues commented on the posted report that they were concerned about misinformation about the upcoming elections in India.

Two months later, after India’s national elections had begun, Facebook put in place a series of steps to stem the flow of misinformation and hate speech in the country, according to an internal document called Indian Election Case Study.

The case study painted an optimistic picture of Facebook’s efforts, including adding more fact-checking partners — the third-party network of outlets with which Facebook works to outsource fact-checking — and increasing the amount of misinformation it removed. It also noted how Facebook had created a “political white list to limit P.R. risk,” essentially a list of politicians who received a special exemption from fact-checking.

The study did not note the immense problem the company faced with bots in India, nor issues like voter suppression. During the election, Facebook saw a spike in bots — or fake accounts — linked to various political groups, as well as efforts to spread misinformation that could have affected people’s understanding of the voting process.

In a separate report produced after the elections, Facebook found that over 40 percent of top views, or impressions, in the Indian state of West Bengal were “fake/inauthentic.” One inauthentic account had amassed more than 30 million impressions.

A report published in March 2021 showed that many of the problems cited during the 2019 elections persisted.

In the internal document, called Adversarial Harmful Networks: India Case Study, Facebook researchers wrote that there were groups and pages “replete with inflammatory and misleading anti-Muslim content” on Facebook.

The report said there were a number of dehumanizing posts comparing Muslims to “pigs” and “dogs,” and misinformation claiming that the Quran, the holy book of Islam, calls for men to rape their female family members.

Much of the material circulated around Facebook groups promoting Rashtriya Swayamsevak Sangh, an Indian right-wing and nationalist group with close ties to India’s ruling Bharatiya Janata Party, or B.J.P. The groups took issue with an expanding Muslim minority population in West Bengal and near the Pakistani border, and published posts on Facebook calling for the ouster of Muslim populations from India and promoting a Muslim population control law.

Facebook knew that such harmful posts proliferated on its platform, the report indicated, and it needed to improve its “classifiers,” which are automated systems that can detect and remove posts containing violent and inciting language. Facebook also hesitated to designate R.S.S. as a dangerous organization because of “political sensitivities” that could affect the social network’s operation in the country.

Of India’s 22 officially recognized languages, Facebook said it has trained its A.I. systems on five. (It said it had human reviewers for some others.) But in Hindi and Bengali, it still did not have enough data to adequately police the content, and much of the content targeting Muslims “is never flagged or actioned,” the Facebook report said.

Five months ago, Facebook was still struggling to efficiently remove hate speech against Muslims. Another company report detailed efforts by Bajrang Dal, an extremist group linked with the B.J.P., to publish posts containing anti-Muslim narratives on the platform.

Facebook is considering designating the group as a dangerous organization because it is “inciting religious violence” on the platform, the document showed. But it has not yet done so.

“Join the group and help to run the group; increase the number of members of the group, friends,” said one post seeking recruits on Facebook to spread Bajrang Dal’s messages. “Fight for truth and justice until the unjust are destroyed.”

Ryan Mac, Cecilia Kang and Mike Isaac contributed reporting.


What Happened When Facebook Employees Warned About Election Misinformation

WHAT HAPPENED

1. From Wednesday through Saturday there was a lot of content circulating which implied fraud in the election, at around 10% of all civic content and 1-2% of all US VPVs. There was also a fringe of incitement to violence.

2. There were dozens of employees monitoring this, and FB launched ~15 measures prior to the election, and another ~15 in the days afterwards. Most of the measures made existings processes more aggressive: e.g. by lowering thresholds, by making penalties more severe, or expanding eligibility for existing measures. Some measures were qualitative: reclassifying certain types of content as violating, which had not been before.

3. I would guess these measures reduced prevalence of violating content by at least 2X. However they had collateral damage (removing and demoting non-violating content), and the episode caused noticeable resentment by Republican Facebook users who feel they are being unfairly targeted.


Mob Violence Against Palestinians in Israel Is Fueled by Groups on WhatsApp

Last Wednesday, a message appeared in a new WhatsApp channel called “Death to the Arabs.” The message urged Israelis to join a mass street brawl against Palestinian citizens of Israel.

Within hours, dozens of other new WhatsApp groups popped up with variations of the same name and message. The groups soon organized a 6 p.m. start time for a clash in Bat Yam, a town on Israel’s coast.

“Together we organize and together we act,” read a message in one of the WhatsApp groups. “Tell your friends to join the group, because here we know how to defend Jewish honor.”

That evening, live scenes aired of black-clad Israelis smashing car windows and roaming the streets of Bat Yam. The mob pulled one man they presumed to be Arab from his car and beat him unconscious. He was hospitalized in serious condition.

As violence between Israelis and Palestinians escalated last week, at least 100 new WhatsApp groups were formed for the express purpose of committing violence against Palestinians, according to an analysis by The New York Times and FakeReporter, an Israeli watchdog group that studies misinformation.

The WhatsApp groups, with names like “The Jewish Guard” and “The Revenge Troops,” have added hundreds of new members a day over the past week, according to The Times’s analysis. The groups, which are in Hebrew, have also been featured on email lists and online message boards used by far-right extremists in Israel.

While social media and messaging apps have been used in the past to spread hate speech and inspire violence, these WhatsApp groups go further, researchers said. That’s because the groups are explicitly planning and executing violent acts against Palestinian citizens of Israel, who make up roughly 20 percent of the population and live largely integrated lives with Jewish neighbors.

That is far more specific than past WhatsApp-fueled mob attacks in India, where calls for violence were vague and generally not targeted at individuals or businesses, the researchers said. Even the Stop the Steal groups in the United States that organized the Jan. 6 protests in Washington did not openly direct attacks using social media or messaging apps, they said.

The proliferation of these WhatsApp groups has alarmed Israeli security officials and disinformation researchers. In the groups, attacks have been carefully documented, with members often gloating about taking part in the violence, according to The Times’s review. Some said they were taking revenge for rockets being fired onto Israel from militants in the Gaza Strip, while others cited different grievances. Many solicited names of Arab-owned businesses they could target next.

Typically, researchers said, extremists do not plan attacks openly on such services for fear of being discovered.

A WhatsApp spokeswoman said the messaging service was concerned by the activity from Israeli extremists. She said the company had removed some accounts of people who participated in the groups. WhatsApp cannot read the encrypted messages on its service, she added, but it has acted when accounts were reported to it for violating its terms of service.

“We take action to ban accounts we believe may be involved in causing imminent harm,” she said.

In Israel, WhatsApp has long been used to form groups so people can communicate and share interests or plan school activities. As violence soared between Israel’s military and Palestinian militants in Gaza over the past week, WhatsApp was also one of the platforms where false information about the conflict has spread.

Tensions in the area ran so high that new groups calling for revenge against Palestinians began emerging on WhatsApp and on other messaging services like Telegram. The first WhatsApp groups appeared last Tuesday, said Achiya Schatz, the director of FakeReporter. By last Wednesday, his organization had found dozens of the groups.

People can join the groups through a link, many of which are shared within existing WhatsApp groups. Once they have joined one group, other groups are advertised to them.

The groups have since grown steadily in size, Mr. Schatz said. Some have become so big that they have branched off into local chapters that are dedicated to certain cities and towns. To evade detection by WhatsApp, organizers of the groups are urging people to vet new members, he said.

On Telegram, Israelis have formed roughly 20 channels to commit and plan violence against Palestinians, according to FakeReporter. Much of the content and messaging in those groups imitates what is in the WhatsApp channels.

On one new WhatsApp group that The Times reviewed, “The Revenge Troops,” people recently shared instructions for how to build Molotov cocktails and makeshift explosives. The group asked its 400 members to also provide addresses of Arab-owned businesses that could be targeted.

In another group with just under 100 members, people shared photos of guns, knives and other weapons as they discussed engaging in street combat in mixed Jewish-Arab cities. Another new WhatsApp group was named “The unapologetic right-wing group.”

After participating in attacks, members of the groups posted photos of their exploits and encouraged others to mimic them.

“We destroyed them, we left them in pieces,” said one person in “The Revenge Troops” WhatsApp group, alongside a photo showing smashed car windows. In a different group, a video was uploaded of black-clad Jewish youths stopping cars on an unnamed street and asking drivers if they were Jewish or Arab.

We beat “the enemy car-by-car,” said a comment posted underneath the video, using an expletive.

Over the weekend, Prime Minister Benjamin Netanyahu of Israel visited Lod, a mixed Jewish-Arab city in central Israel that has been the scene of recent clashes.

“There is no greater threat now than these riots, and it is essential to bring back law and order,” said Mr. Netanyahu.

Within some of the WhatsApp groups, Mr. Netanyahu’s calls for peace were ridiculed.

“Our government is too weak to do what is necessary, so we take it into our own hands,” wrote one person in a WhatsApp group dedicated to the city of Ramle in central Israel. “Now that we have organized, they can’t stop us.”

Ben Decker contributed research.


Extremists Find a Financial Lifeline on Twitch

Terpsichore Maras-Lindeman, a podcaster who fought to overturn the 2020 presidential election, recently railed against mask mandates to her 4,000 fans in a live broadcast and encouraged them to enter stores maskless. On another day, she grew emotional while thanking them for sending her $84,000.

Millie Weaver, a former correspondent for the conspiracy theory website Infowars, speculated on her channel that coronavirus vaccines could be used to surveil people. Later, she plugged her merchandise store, where she sells $30 “Drain the Swamp” T-shirts and hats promoting conspiracies.

And a podcaster who goes by Zak Paine or Redpill78, who pushes the baseless QAnon conspiracy theory, urged his viewers to donate to the congressional campaign of an Ohio man who has said he attended the “Stop the Steal” rally in Washington on Jan. 6.

Facebook, YouTube and other social media platforms clamped down on misinformation and hate speech ahead of the 2020 election. Since then, far-right influencers have scattered to Twitch and to apps like Google Podcasts as their options for spreading falsehoods have dwindled.

Twitch became a multibillion-dollar business thanks to video gamers broadcasting their play of games like Fortnite and Call of Duty. Fans, many of whom are young men, pay the gamers by subscribing to their channels or donating money. Streamers earn even more by sending their fans to outside sites to either buy merchandise or donate money.

Now Twitch has also become a place where right-wing personalities spread election and vaccine conspiracy theories, often without playing any video games. It is part of a shift at the platform, where streamers have branched out from games into fitness, cooking, fishing and other lifestyle topics in recent years.

But unlike fringe livestreaming sites like Dlive and Trovo, which have also offered far-right personalities moneymaking opportunities, Twitch attracts far larger audiences. On average, 30 million people visit the site each day, the platform said.

Twitch has stricter rules than many other social media platforms for the kinds of views that users can express. It temporarily suspended Mr. Trump’s account for “hateful conduct” last summer, months before Facebook and Twitter made similar moves. Its community guidelines prohibit hateful conduct and harassment. Ms. Clemens said Twitch was developing a misinformation policy.

This month, Twitch announced a policy that would allow it to suspend the accounts of people who committed crimes or severe offenses in real life or on other social media platforms, including violent extremism or membership in a known hate group. Twitch said it did not consider QAnon to be a hate group.

Despite all this, a Twitch channel belonging to Enrique Tarrio, the leader of the Proud Boys, a white nationalist organization, remained online until the middle of this month after The New York Times inquired about it. And the white nationalist Anthime Joseph Gionet, known as Baked Alaska, had a Twitch channel for months, even though he was arrested in January by the F.B.I. and accused of illegally storming the U.S. Capitol on Jan. 6. Twitch initially said his activities had not violated the platform’s policies, then barred him this month for hateful conduct.

On his streams, Mr. Paine continues to promote content that critics have said is dangerous. Last week, he referred to a QAnon belief that people are killing children to “harvest” a chemical compound from them, then talked about a “criminal cabal” controlling the government, saying people do not understand “what plane of existence they come from.”

Mr. Paine, who is barred from Twitter and YouTube, has also asked his Twitch audience to donate to the House campaign of J.R. Majewski, an Air Force veteran in Toledo, Ohio, who attracted attention last year for painting his lawn to look like a Trump campaign banner. Mr. Majewski has used QAnon hashtags but distanced himself from the movement in an interview with his local newspaper, The Toledo Blade.

Mr. Majewski has appeared on Mr. Paine’s streams, where they vape, chat about Mr. Majewski’s campaign goals and take calls from listeners.

“He is exactly the type of person that we need to get in Washington, D.C., so that we can supplant these evil cabal criminal actors and actually run our own country,” Mr. Paine said on one stream.

Neither Mr. Paine nor Mr. Majewski responded to a request for comment.

Joan Donovan, a Harvard University researcher who studies disinformation and online extremism, said streamers who rely on their audience’s generosity to fund themselves felt pressured to continue raising the stakes.

“The incentive to lie, cheat, steal, hoax and scam is very high when the cash is easy to acquire,” she said.


Is an Activist’s Pricey House News? Facebook Alone Decides.

The Post’s editorial board wrote that Facebook and other social media companies “claim to be ‘neutral’ and that they aren’t making editorial decisions in a cynical bid to stave off regulation or legal accountability that threatens their profits. But they do act as publishers — just very bad ones.”

Of course, it takes one to know one. The Post, always a mix of strong local news, great gossip and spun-up conservative politics, is making a bid for the title of worst newspaper in America right now. It has run a string of scary stories about Covid vaccines, the highlight of which was a headline linking vaccines to herpes, part of a broader attempt to extend its digital reach. Great stuff, if you’re mining for traffic in anti-vax Telegram groups. The piece on the Black Lives Matter activist that Facebook blocked was pretty weak, too. It insinuated, without evidence, that her wealth was ill-gotten, and mostly just sneered at how “the self-described Marxist last month purchased a $1.4 million home.”

But then, you’ve probably hate-read a story about a person you disliked buying an expensive house. When Lachlan Murdoch, the co-chairman of The Post’s parent company, bought the most expensive house in Los Angeles, for instance, it received wide and occasionally sneering coverage. Maybe Mr. Murdoch didn’t know he could get the stories deleted by Facebook.

Facebook doesn’t keep a central register of news articles it expunges on these grounds, though the service did block a Daily Mail article about the Black Lives Matter activist’s real estate as well. And it does not keep track of how many news articles it has blocked, though it regularly deletes offending posts by individuals, including photos of the home of the Fox News star Tucker Carlson, a Facebook employee said.

What Facebook’s clash with The Post really revealed — and what surprised me — is that the platform does not defer, at all, to news organizations on questions of news judgment. A decision by The Post, or The New York Times, that someone’s personal wealth is newsworthy carries no weight in the company’s opaque enforcement mechanisms. Nor, Facebook’s lawyer said, does a more nebulous and reasonable human judgment that the country has felt on edge for the last year and that a Black activist’s concern for her own safety was justified. (The activist didn’t respond to my inquiry but, in an Instagram post, called the reporting on her personal finances “doxxing” and a “tactic of terror.”)

The point of Facebook’s bureaucracy is to replace human judgment with a kind of strict corporate law. “The policy in this case prioritizes safety and privacy, and this enforcement shows how difficult these trade-offs can be,” the company’s vice president for communications, Tucker Bounds, said. “To help us understand if our policies are in the right place, we are referring the policy to the Oversight Board.”

The board is a promising kind of supercourt that has yet to set much meaningful policy. So this rule could eventually change. (Get your stories deleted while you can!)


For Political Cartoonists, the Irony Was That Facebook Didn’t Recognize Irony

SAN FRANCISCO — Since 2013, Matt Bors has made a living as a left-leaning cartoonist on the internet. His site, The Nib, runs cartoons from him and other contributors that regularly skewer right-wing movements and conservatives with political commentary steeped in irony.

One cartoon in December took aim at the Proud Boys, a far-right extremist group. With tongue planted firmly in cheek, Mr. Bors titled it “Boys Will Be Boys” and depicted a recruitment session in which new Proud Boys were trained to be “stabby guys” and to “yell slurs at teenagers” while playing video games.

Days later, Facebook sent Mr. Bors a message saying that it had removed “Boys Will Be Boys” from his Facebook page for “advocating violence” and that he was on probation for violating its content policies.

It wasn’t the first time that Facebook had dinged him. Last year, the company briefly took down another Nib cartoon — an ironic critique of former President Donald J. Trump’s pandemic response, the substance of which supported wearing masks in public — for “spreading misinformation” about the coronavirus. Instagram, which Facebook owns, removed one of his sardonic antiviolence cartoons in 2019 because, the photo-sharing app said, it promoted violence.

Facebook barred Mr. Trump from posting on its site altogether after he incited a crowd that stormed the U.S. Capitol.

At the same time, misinformation researchers said, Facebook has had trouble identifying the slipperiest and subtlest of political content: satire. While satire and irony are common in everyday speech, the company’s artificial intelligence systems — and even its human moderators — can have difficulty distinguishing them. That’s because such discourse relies on nuance, implication, exaggeration and parody to make a point.

That means Facebook has sometimes misunderstood the intent of political cartoons, leading to takedowns. The company has acknowledged that some of the cartoons it expunged, including those from Mr. Bors, were removed by mistake, and it later reinstated them.

“If social media companies are going to take on the responsibility of finally regulating incitement, conspiracies and hate speech, then they are going to have to develop some literacy around satire,” Mr. Bors, 37, said in an interview.

Conservatives, meanwhile, have accused Facebook and other internet platforms of singling out right-wing views for suppression.

In a statement, Facebook did not address whether it has trouble spotting satire. Instead, the company said it made room for satirical content — but only up to a point. Posts about hate groups and extremist content, it said, are allowed only if the posts clearly condemn or neutrally discuss them, because the risk for real-world harm is otherwise too great.

Facebook’s struggles to moderate content across its core social network, Instagram, Messenger and WhatsApp have been well documented. After Russians manipulated the platform before the 2016 presidential election by spreading inflammatory posts, the company recruited thousands of third-party moderators to prevent a recurrence. It also developed sophisticated algorithms to sift through content.

Facebook also created a process so that only verified buyers could purchase political ads, and instituted policies against hate speech to limit posts that contained anti-Semitic or white supremacist content.

Last year, Facebook said it had stopped more than 2.2 million political ad submissions that had not yet been verified and that targeted U.S. users. It also cracked down on the conspiracy group QAnon and the Proud Boys, removed vaccine misinformation, and displayed warnings on more than 150 million pieces of content viewed in the United States that third-party fact checkers debunked.

But satire kept popping up as a blind spot. In 2019 and 2020, Facebook often dealt with far-right misinformation sites that used “satire” claims to protect their presence on the platform, Mr. Brooking said. For example, The Babylon Bee, a right-leaning site, frequently trafficked in misinformation under the guise of satire.

Facebook’s blind spot for satire has also tripped up Mr. Hall, a cartoonist whose independent work regularly appears in North American and European newspapers.

When Prime Minister Benjamin Netanyahu said in 2019 that he would bar two congresswomen — critics of Israel’s treatment of Palestinians — from visiting the country, Mr. Hall drew a cartoon showing a sign affixed to barbed wire that read, in German, “Jews are not welcome here.” He added a line of text addressing Mr. Netanyahu: “Hey Bibi, did you forget something?”

Mr. Hall said his intent was to draw an analogy between how Mr. Netanyahu was treating the U.S. representatives and Nazi Germany. Facebook took the cartoon down shortly after it was posted, saying it violated its standards on hate speech.

“If algorithms are making these decisions based solely upon words that pop up on a feed, then that is not a catalyst for fair or measured decisions when it comes to free speech,” Mr. Hall said.

Adam Zyglis, a nationally syndicated political cartoonist for The Buffalo News, was also caught in Facebook’s cross hairs.

While Mr. Bors earns money from paid memberships to The Nib and book sales on his personal site, he gets most of his traffic and new readership through Facebook and Instagram.

The takedowns, which have resulted in “strikes” against his Facebook page, could upend that. If he accumulates more strikes, his page could be erased, something that Mr. Bors said would cut 60 percent of his readership.

“Removing someone from social media can end their career these days, so you need a process that distinguishes incitement of violence from a satire of these very groups doing the incitement,” he said.

Mr. Bors said he had also heard from the Proud Boys. A group of them recently organized on the messaging chat app Telegram to mass-report his critical cartoons to Facebook for violating the site’s community standards, he said.

“You just wake up and find you’re in danger of being shut down because white nationalists were triggered by your comic,” he said.

Facebook has sometimes recognized its errors and corrected them after he has made appeals, Mr. Bors said. But the back-and-forth and the potential for expulsion from the site have been frustrating and made him question his work, he said.

“Sometimes I do think about if a joke is worth it, or if it’s going to get us banned,” he said. “The problem with that is, where is the line on that kind of thinking? How will it affect my work in the long run?”

Cade Metz contributed reporting.
