Meta, which barred Mr. Trump from its platforms after the riot at the U.S. Capitol on Jan. 6, 2021, has worked over the years to limit political falsehoods on its sites. Tom Reynolds, a Meta spokesman, said the company had “taken a comprehensive approach to how elections play out on our platforms since before the U.S. 2020 elections and through the dozens of global elections since then.”

In Brazil, which holds a presidential election in October, President Jair Bolsonaro has recently raised doubts about the country’s electoral process. Latvia, Bosnia and Slovenia are also holding elections in October.

“People in the U.S. are almost certainly getting the Rolls-Royce treatment when it comes to any integrity on any platform, especially for U.S. elections,” said Sahar Massachi, the executive director of the think tank Integrity Institute and a former Facebook employee. “And so however bad it is here, think about how much worse it is everywhere else.”

Facebook’s role in potentially distorting elections became evident after 2016, when Russian operatives used the site to spread inflammatory content and divide American voters in the U.S. presidential election. In 2018, Mr. Zuckerberg testified before Congress that election security was his top priority.

The company expanded its election integrity policies in the run-up to the 2020 vote, banning QAnon conspiracy theory posts and groups in October 2020.

Around the same time, Mr. Zuckerberg and his wife, Priscilla Chan, donated $400 million to local governments to fund poll workers, pay for rental fees for polling places, provide personal protective equipment and cover other administrative costs.

The week before the November 2020 election, Meta also froze all political advertising to limit the spread of falsehoods.

But while there were successes — the company kept foreign election interference off the platform — it struggled with how to handle Mr. Trump, who used his Facebook account to amplify false claims of voter fraud. After the Jan. 6 riot, Facebook barred Mr. Trump from posting. He is eligible for reinstatement in January.

Frances Haugen, a Facebook employee turned whistle-blower, filed complaints with the Securities and Exchange Commission accusing the company of removing election safety features too soon after the 2020 election. Facebook made growth and engagement its priorities over security, she said.

Mr. Zuckerberg has since shifted his focus to the metaverse, a fully realized digital world that exists beyond the one in which we live. The term was coined by Neal Stephenson in his 1992 novel “Snow Crash,” and the concept was further explored by Ernest Cline in his novel “Ready Player One.”

Mr. Zuckerberg no longer meets weekly with those focused on election security, four employees said, though he receives their reports. Instead, they meet with Nick Clegg, Meta’s president of global affairs.

Several civil rights groups said they had noticed Meta’s shift in priorities. Mr. Zuckerberg isn’t involved in discussions with them as he once was, nor are other top Meta executives, they said.

“I’m concerned,” said Derrick Johnson, president of the National Association for the Advancement of Colored People, who talked with Mr. Zuckerberg and Sheryl Sandberg, Meta’s chief operating officer, ahead of the 2020 election. “It appears to be out of sight, out of mind.” (Ms. Sandberg has announced that she will leave Meta this fall.)

Several of the groups wrote a letter to Mr. Zuckerberg and the chief executives of YouTube, Twitter, Snap and other platforms, calling on them to take down posts spreading the lie that Mr. Trump won the 2020 election and to slow the spread of election misinformation before the midterms.

Yosef Getachew, a director at Common Cause, a nonprofit public advocacy organization that studied 2020 election misinformation on social media, said the companies had not responded.

“The Big Lie is front and center in the midterms with so many candidates using it to pre-emptively declare that the 2022 election will be stolen,” he said, pointing to recent tweets from politicians in Michigan and Arizona who falsely said dead people cast votes for Democrats. “Now is not the time to stop enforcing against the Big Lie.”


The Lies Putin Tells to Justify Russia’s War on Ukraine

In the tense weeks before Russia invaded Ukraine on Feb. 24, Russian officials denied that Moscow planned anything of the sort, denouncing the United States and its NATO allies for stoking panic and anti-Russian hatred. When Russia did invade, the officials denied it was at war.

Since then, the Kremlin has cycled through a torrent of lies to explain why it had to wage a “special military operation” against a sovereign neighbor. Drug-addled neo-Nazis. Genocide. American biological weapons factories. Birds and reptiles trained to carry pathogens into Russia. Ukrainian forces bombing their own cities, including theaters sheltering children.

Disinformation in wartime is as old as war itself, but today war unfolds in the age of social media and digital diplomacy. That has given Russia — and its allies in China and elsewhere — powerful means to prop up the claim that the invasion is justified, exploiting disinformation to rally its citizens at home and to discredit its enemies abroad. Truth has simply become another front in Russia’s war.

Using a barrage of increasingly outlandish falsehoods, President Vladimir V. Putin has created an alternative reality, one in which Russia is at war not with Ukraine but with a larger, more pernicious enemy in the West. Since the war began, the lies have grown more and more bizarre, transforming from prewar claims that “true sovereignty” for Ukraine was possible only under Russia into tales of migratory birds carrying bioweapons.

Social media has given those falsehoods a speed and scale the Kremlin’s propagandists never had before, reaching audiences that were once far harder to reach.

“Previously, if you were sitting in Moscow and you wanted to reach audiences sitting in, say, Idaho, you would have to work really hard doing that,” said Elise Thomas, a researcher in Australia for the Institute of Strategic Dialogue, referring to disinformation campaigns dating to the Soviet Union. “It would take you time to set up the systems, whereas now you can do it with the press of a button.”

The power of Russia’s claim that the invasion is justified comes not from the veracity of any individual falsehood meant to support it but from the broader argument. Russia advances individual lies about bioweapons labs or crisis actors as swiftly as they are debunked, with little consistency or logic among them. But supporters stubbornly cling to the overarching belief that something is wrong in Ukraine and Russia will fix it. That conviction proves harder to shake, even as new evidence is introduced.

That mythology, and its resilience in the face of fact-checking and criticism, reflects “the ability of autocrats and malign actors to completely brainwash us to the point where we don’t see what’s in front of us,” said Laura Thornton, the director and senior fellow at the German Marshall Fund’s Alliance for Securing Democracy.

The Kremlin’s narratives today feed on pre-existing views of the war’s root causes, which Mr. Putin has nurtured for years — and restated in increasingly strident language last week.

Ukraine has fought back with an information campaign of its own, fronted by President Volodymyr Zelensky himself, whose video messages to Ukrainians and the world have combined bravery with the stage presence of the television performer he once was.

Russia, though, has more tools and reach, and it has the upper hand with weaponry. The strategy has been to overwhelm the information space, especially at home, which “is really where their focus is,” said Peter Pomerantsev, a scholar at the Stavros Niarchos Foundation Agora Institute at Johns Hopkins University who has written extensively about Russian propaganda.

Russia’s propaganda machine plays into suspicion of the West and NATO, which have been vilified on state television for years, deeply embedding distrust in Russian society. State media has also more recently echoed beliefs advanced by the QAnon movement, which ascribes the world’s problems largely to global elites and sex traffickers.

Those beliefs make people feel “scared and uncertain and alienated,” said Sophia Moskalenko, a social psychologist at Georgia State University. “As a result of manipulating their emotions, they will be more likely to embrace conspiracy theories.”

Mr. Putin’s public remarks, which dominate state media, have become increasingly strident. He has warned that nationalist sentiment in Ukraine is a threat to Russia itself, as is NATO expansion.

At home, the Kremlin has moved swiftly to silence dissenting points of view that could cut through the fog of war and discourage the Russian population.

For now, the campaign appears to have rallied public opinion behind Mr. Putin, according to most surveys in Russia, though support is not as high as might be expected for a country at war.

“My impression is that many people in Russia are buying the government’s narrative,” said Alexander Gabuev, a senior fellow at the Carnegie Moscow Center. “They have doctored images on state-controlled media. Private media don’t cover the war, fearing 15 years in prison. Same goes for people on the social media. Russia has lost information warfare globally, but the regime is quite successful at home.”

Few cracks have appeared in the information fortress the Kremlin is building.

A week after the invasion began, when it was already clear the war was going badly for Russian troops, Mr. Putin rushed to enact a law that punishes “fake news” with up to 15 years in prison. Media regulators warned broadcasters not to refer to the war as a war. They also forced off the air two flagships of independent media — Ekho Moskvy, a liberal radio station, and Dozhd, a television station — that gave voice to the Kremlin’s opponents.

Access to Facebook, Twitter, TikTok and, most recently, Instagram has also been severed inside Russia — all platforms that the country’s diplomats have continued to use abroad to spread disinformation. Once spread, disinformation can be tenacious, even in places with a free press and open debate, like the United States, where polls suggest that more than 40 percent of the population believes the 2020 election was stolen from former President Donald J. Trump.

“Why are people so surprised that this kind of widespread disinformation can be so effective in Russia when it was so effective here?” Ms. Thornton of the German Marshall Fund said.

As the war in Ukraine drags on, however, casualties are mounting, confronting families in Russia with the loss of fathers and sons. That could test how persuasive the Kremlin’s information campaign truly is.

The Soviet Union sought to keep a similar veil of silence around its decade-long quagmire in Afghanistan in the 1980s, but the truth seeped into public consciousness anyway, eroding the foundation of the entire system. Two years after the last troops pulled out in 1989, the Soviet Union itself collapsed.

Claire Fu contributed research.


Peter Thiel, the Right’s Would-Be Kingmaker

Mr. Thiel has attracted the most attention for two $10 million donations to the Senate candidates Blake Masters in Arizona and J.D. Vance in Ohio. Like Mr. Thiel, the men are tech investors with pedigrees from elite universities who cast themselves as antagonists to the establishment. They have also worked for the billionaire and been financially dependent on him. Mr. Masters, the chief operating officer of Thiel Capital, the investor’s family office, has promised to leave that job before Arizona’s August primary.

Mr. Thiel, who declined to comment for this article, announced last week that he would leave the board of Meta, the parent company of Facebook, which conservatives have accused of censorship. One reason for the change: He plans to focus more on politics.

Born in West Germany and raised in South Africa and the San Francisco Bay Area, Mr. Thiel showed his provocative side at Stanford in the late 1980s. Classmates recalled Mr. Thiel, who studied philosophy and law, describing South Africa’s apartheid as a sound economic system. (A spokesman for Mr. Thiel has denied that he supported apartheid.)

Mr. Thiel also helped found The Stanford Review, a conservative campus paper that sought to provide “alternative views” to what he deemed left-wing orthodoxy.

In 1995, he co-wrote a book, “The Diversity Myth,” arguing that “the extreme focus on racism” had caused greater societal tension and acrimony. Rape, he and his co-author, David Sacks, wrote, sometimes included “seductions that are later regretted.” (Mr. Thiel has apologized for the book.)

In 1998, Mr. Thiel helped create what would become the digital payments company PayPal. He became Facebook’s first outside investor in 2004 and established the venture capital firm Founders Fund a year later. Forbes puts his fortune at $2.6 billion.

In one 2009 piece, Mr. Thiel, who called himself a libertarian, wrote that he had come to “no longer believe that freedom and democracy are compatible,” arguing that American politics would always be hostile to free-market ideals, and that politics was about interfering with other people’s lives without their consent. Since then, he has hosted and attended events with white nationalists and alt-right figures.

His political giving evolved with those views. He donated lavishly to Ron Paul’s 2008 and 2012 presidential campaigns before turning to candidates who were more extreme than the Republican establishment.

In 2013, Curtis Yarvin, an entrepreneur who has voiced racist beliefs and said democracy was a destructive system of government, emailed Mr. Thiel. Mr. Yarvin wrote that Ted Cruz, then a newly elected senator from Texas, “needs to purge every single traitor” from the Republican Party. In the email, which The Times obtained, Mr. Yarvin argued that it didn’t matter if those candidates lost general elections or cost the party control in Congress.

Mr. Thiel, who had donated to Mr. Cruz’s 2012 campaign, replied, “It’s relatively safe to support Cruz (for me) because he threatens the Republican establishment.”

Mr. Thiel used his money to fund other causes. In 2016, he was revealed as the secret funder of a lawsuit that targeted Gawker Media, which had reported he was gay. Gawker declared bankruptcy, partly from the costs of fighting the lawsuit.

At the Republican National Convention in 2016, Mr. Thiel declared that he was proud to be a gay Republican supporting Mr. Trump. He later donated $1.25 million to the candidate.

After Mr. Trump won, Mr. Thiel was named to the president-elect’s executive transition team. At a meeting with tech leaders at Trump Tower in Manhattan in December 2016, Mr. Trump told Mr. Thiel, “You’re a very special guy.”

A month later, Mr. Thiel, a naturalized American, was revealed to have also obtained citizenship in New Zealand. That prompted a furor, especially after Mr. Trump had urged people to pledge “total allegiance to the United States.”

During Mr. Trump’s presidency, Mr. Thiel became frustrated with the administration. “There are all these ways that things have fallen short,” he told The Times in 2018.

In 2020, he stayed on the sidelines. His only notable federal election donation was to Kris Kobach, a Trump ally and former secretary of state of Kansas known for his hard-line views on immigration. (Mr. Kobach lost his primary bid for the Senate.)

Mr. Thiel’s personal priorities also changed. In 2016, he announced that he was moving from San Francisco to Los Angeles. The next year, he married a longtime boyfriend, Matt Danzeisen; they have two children.

Mr. Thiel reduced his business commitments and started pondering leaving Meta’s board, which he had joined in 2005, two of the people with knowledge of his thinking said. At an October event held by a conservative tech group in Miami, he alluded to his frustration with Facebook, which was increasingly removing certain kinds of speech and had barred Mr. Trump.

He bought a $13 million mansion in Washington from Wilbur Ross, Mr. Trump’s commerce secretary. In October, he spoke at an event for the Federalist Society at Stanford and at the National Conservatism Conference.

He also rebuilt his relationship with Mr. Trump. Since the 2020 election, they have met at least three times in New York and at Mar-a-Lago, sometimes with Mr. Masters or Mr. Vance. And Mr. Thiel invested in a company started by John McEntee, a former Trump White House aide, which is building a dating app for conservatives called The Right Stuff.

Mr. McEntee declined to answer questions about his app and said Mr. Thiel was “a great guy.” Mr. Trump’s representatives did not respond to requests for comment.

Mr. Thiel’s political giving ramped up last spring with his $10 million checks to PACs supporting Mr. Vance and Mr. Masters. The sums were his biggest political donations yet, and the largest one-time contributions ever made to a PAC backing a single candidate, according to OpenSecrets.

Like Mr. Trump in 2016, Mr. Vance and Mr. Masters lack experience in politics. Mr. Vance, the venture capitalist who wrote the best-selling memoir “Hillbilly Elegy,” met Mr. Thiel a decade ago when the billionaire delivered a lecture at Yale Law School, where Mr. Vance was a student.

Mr. Masters co-wrote Mr. Thiel’s 2014 book, “Zero to One.” In 2020, Mr. Masters reported more than $1.1 million in salary from Thiel Capital and book royalties.

Mr. Vance, Mr. Masters and their campaigns did not respond to requests for comment.

Both candidates have repeated the Trumpian lie of election fraud, with Mr. Masters stating in a November campaign ad, “I think Trump won in 2020.” They have also made Mr. Thiel a selling point in their campaigns.

In November, Mr. Vance wrote on Twitter that anyone who donated $10,800 to his campaign could attend a small group dinner with him and Mr. Thiel. Mr. Masters offered the same opportunity for a meal with Mr. Thiel and raised $550,000 by selling nonfungible tokens, or NFTs, of “Zero to One” digital art that would give holders “access to parties with me and Peter.”

In a 20-minute speech at the National Conservatism Conference in October, Mr. Thiel said nationalism was “a corrective” to the “brain-dead, one-world state” of globalism. He also blasted the Biden administration.

“We have the zombie retreads just busy rearranging the deck chairs,” he said. “We need dissident voices more than ever.”

Cade Metz contributed reporting. Rachel Shorey and Kitty Bennett contributed research.


Facebook Debates What to Do With Its Like and Share Buttons

SAN FRANCISCO — In 2019, Facebook researchers began a new study of one of the social network’s foundational features: the Like button.

They examined what people would do if Facebook removed the distinct thumbs-up icon and other emoji reactions from posts on its photo-sharing app Instagram, according to company documents. The buttons had sometimes caused Instagram’s youngest users “stress and anxiety,” the researchers found, especially if posts didn’t get enough Likes from friends.

But the researchers discovered that when the Like button was hidden, users interacted less with posts and ads. At the same time, hiding the button did not alleviate teenagers’ social anxiety, and young users did not share more photos as the company had thought they might, making for a mixed bag of results.

Mark Zuckerberg, Facebook’s chief executive, and other managers discussed hiding the Like button for more Instagram users, according to the documents. In the end, the test was expanded, but only in a limited way, in part to “build a positive press narrative” around Instagram.

As Facebook has come under scrutiny over misinformation, privacy and hate speech, a central issue has been whether the basic way that the platform works has been at fault — essentially, the features that have made Facebook Facebook.

Apart from the Like button, Facebook has scrutinized its share button, which lets users instantly spread content posted by other people; its groups feature, which is used to form digital communities; and other tools that define how more than 3.5 billion people behave and interact online. The research, laid out in thousands of pages of internal documents, underlines how the company has repeatedly grappled with what it has created.

What researchers found was often far from positive. Time and again, they determined that people misused key features or that those features amplified toxic content, among other effects. In an August 2019 internal memo, several researchers said it was Facebook’s “core product mechanics” — meaning the basics of how the product functioned — that had let misinformation and hate speech flourish on the site.

“The mechanics of our platform are not neutral,” they concluded.

Facebook has made some changes over the years, including letting people hide posts they do not want to see and turning off political group recommendations to reduce the spread of misinformation.

But the core way that Facebook operates — a network where information can spread rapidly and where people can accumulate friends and followers and Likes — ultimately remains largely unchanged.

Many significant modifications to the social network were blocked in the service of growth and keeping users engaged, some current and former executives said. Facebook is valued at more than $900 billion.

“There’s a gap between the fact that you can have pretty open conversations inside of Facebook as an employee,” said Brian Boland, a Facebook vice president who left last year. “Actually getting change done can be much harder.”

The company documents are part of the Facebook Papers, a cache provided to the Securities and Exchange Commission and to Congress by a lawyer representing Frances Haugen, a former Facebook employee who has become a whistle-blower. Ms. Haugen earlier gave the documents to The Wall Street Journal. This month, a congressional staff member supplied the redacted disclosures to more than a dozen other news organizations, including The New York Times.

In a statement, Andy Stone, a Facebook spokesman, criticized articles based on the documents, saying that they were built on a “false premise.”

“Yes, we’re a business and we make profit, but the idea that we do so at the expense of people’s safety or well-being misunderstands where our own commercial interests lie,” he said. He said Facebook had invested $13 billion and hired more than 40,000 people to keep people safe, adding that the company has called “for updated regulations where democratic governments set industry standards to which we can all adhere.”

In a post this month, Mr. Zuckerberg said it was “deeply illogical” that the company would give priority to harmful content because Facebook’s advertisers don’t want to buy ads on a platform that spreads hate and misinformation.

“At the most basic level, I think most of us just don’t recognize the false picture of the company that is being painted,” he wrote.

When Mr. Zuckerberg founded Facebook 17 years ago in his Harvard University dorm room, the site’s mission was to connect people on college campuses and bring them into digital groups with common interests and locations.

Growth exploded in 2006 when Facebook introduced the News Feed, a central stream of photos, videos and status updates posted by people’s friends. Over time, the company added more features to keep people interested in spending time on the platform.

In 2009, Facebook introduced the Like button. The tiny thumbs-up symbol, a simple indicator of people’s preferences, became one of the social network’s most important features. The company allowed other websites to adopt the Like button so users could share their interests back to their Facebook profiles.

That gave Facebook insight into people’s activities and sentiments outside of its own site, so it could better target them with advertising. Likes also signified what users wanted to see more of in their News Feeds so people would spend more time on Facebook.

Facebook also added the groups feature, where people join private communication channels to talk about specific interests, and pages, which allowed businesses and celebrities to amass large fan bases and broadcast messages to those followers.

Adam Mosseri, the head of Instagram, has said that research on users’ well-being led to investments in anti-bullying measures on Instagram.

Yet Facebook cannot simply tweak itself into a healthier social network when so many problems trace back to core features, said Jane Lytvynenko, a senior fellow at the Shorenstein Center at the Harvard Kennedy School, who studies social networks and misinformation.

“When we talk about the Like button, the share button, the News Feed and their power, we’re essentially talking about the infrastructure that the network is built on top of,” she said. “The crux of the problem here is the infrastructure itself.”

As Facebook’s researchers dug into how its products worked, the worrisome results piled up.

In a July 2019 study of groups, researchers traced how members in those communities could be targeted with misinformation. The starting point, the researchers said, was a group of people known as “invite whales,” who sent out invitations to others to join a private group.

These people were effective at getting thousands to join new groups so that the communities ballooned almost overnight, the study said. Then the invite whales could spam the groups with posts promoting ethnic violence or other harmful content, according to the study.

Another 2019 report looked at how some people accrued large followings on their Facebook pages, often using posts about cute animals and other innocuous topics. But once a page had grown to tens of thousands of followers, the founders sold it. The buyers then used the pages to show followers misinformation or politically divisive content, according to the study.

As researchers studied the Like button, executives considered hiding the feature on Facebook as well, according to the documents. In September 2019, the company hid Like counts on users’ Facebook posts in a small experiment in Australia.

The company wanted to see if the change would reduce pressure and social comparison among users. That, in turn, might encourage people to post more frequently to the network.

But people did not share more posts after the Like button was removed. Facebook chose not to roll the test out more broadly, noting, “Like counts are extremely low on the long list of problems we need to solve.”

Last year, company researchers also evaluated the share button. In a September 2020 study, a researcher wrote that the button and so-called reshare aggregation units in the News Feed, which are automatically generated clusters of posts that have already been shared by people’s friends, were “designed to attract attention and encourage engagement.”

But left unchecked, the features could “serve to amplify bad content and sources,” such as bullying and borderline nudity posts, the researcher said.

That’s because the features made people less hesitant to share posts, videos and messages with one another. In fact, users were three times more likely to share any kind of content from the reshare aggregation units, the researcher said.

One post that spread widely this way was an undated message from an account called “The Angry Patriot.” The post notified users that people protesting police brutality were “targeting a police station” in Portland, Ore. After it was shared through reshare aggregation units, hundreds of hate-filled comments flooded in. It was an example of “hate bait,” the researcher said.

A common thread in the documents was how Facebook employees argued for changes in how the social network worked and often blamed executives for standing in the way.

In an August 2020 internal post, a Facebook researcher criticized the recommendation system that suggests pages and groups for people to follow and said it can “very quickly lead users down the path to conspiracy theories and groups.”

“Out of fears over potential public and policy stakeholder responses, we are knowingly exposing users to risks of integrity harms,” the researcher wrote. “During the time that we’ve hesitated, I’ve seen folks from my hometown go further and further down the rabbit hole” of conspiracy theory movements like QAnon and anti-vaccination and Covid-19 conspiracies.

The researcher added, “It has been painful to observe.”

Reporting was contributed by Davey Alba, Sheera Frenkel, Cecilia Kang and Ryan Mac.


What Happened When Facebook Employees Warned About Election Misinformation

WHAT HAPPENED

1. From Wednesday through Saturday there was a lot of content circulating which implied fraud in the election, at around 10% of all civic content and 1-2% of all US VPVs. There was also a fringe of incitement to violence.

2. There were dozens of employees monitoring this, and FB launched ~15 measures prior to the election, and another ~15 in the days afterwards. Most of the measures made existing processes more aggressive: e.g. by lowering thresholds, by making penalties more severe, or expanding eligibility for existing measures. Some measures were qualitative: reclassifying certain types of content as violating, which had not been before.

3. I would guess these measures reduced prevalence of violating content by at least 2X. However they had collateral damage (removing and demoting non-violating content), and the episode caused noticeable resentment by Republican Facebook users who feel they are being unfairly targeted.


Facebook, Fearing Public Outcry, Shelved Earlier Report on Popular Posts

When Facebook this week released its first quarterly report about the most viewed posts in the United States, Guy Rosen, its vice president of integrity, said the social network had undertaken “a long journey” to be “by far the most transparent platform on the internet.” The list showed that the posts with the most reach tended to be innocuous content like recipes and cute animals.

Facebook had prepared a similar report for the first three months of the year, but executives never shared it with the public because of concerns that it would look bad for the company, according to internal emails sent by executives and shared with The New York Times.

In that report, a copy of which was provided to The Times, the most-viewed link was a news article with a headline suggesting that the coronavirus vaccine was at fault for the death of a Florida doctor. The report also showed that a Facebook page for The Epoch Times, an anti-China newspaper that spreads right-wing conspiracy theories, was the 19th-most-popular page on the platform for the first three months of 2021.

The report was nearing public release when some executives, including Alex Schultz, Facebook’s vice president of analytics and chief marketing officer, debated whether it would cause a public relations problem, according to the internal emails. The company decided to shelve it.

Facebook has faced growing calls to share more information about false and misleading content on the site, and to do a better job of stopping its spread. Last month, President Biden accused the company of “killing people” by allowing false information to circulate widely, a statement the White House later softened. Other federal agencies have accused Facebook of withholding key data.

Facebook has pushed back, publicly accusing the White House of scapegoating the company for the administration’s failure to reach its vaccination goals. Executives at Facebook, including Mark Zuckerberg, its chief executive, have said the platform has been aggressively removing Covid-19 misinformation since the start of the pandemic. The company said it had removed over 18 million pieces of misinformation in that period.

But Brian Boland, a former vice president of product marketing at Facebook, said there was plenty of reason to be skeptical about data collected and released by a company that has had a history of protecting its own interests.

The Epoch Times, which runs a popular page called Trending World, has been barred from advertising on Facebook because of its repeated violations of the platform’s political advertising policy.

Trending World, according to the report, was viewed by 81.4 million accounts, slightly fewer than the 18th-most-popular page, Fox News, which had 81.7 million content viewers for the first three months of 2021.

Facebook’s transparency report released on Wednesday also showed that an Epoch Times subscription link was among the most viewed in the United States. With some 44.2 million accounts seeing the link in April, May and June, it was about half as popular as Trending World in the shelved report.

Sheera Frenkel and Mike Isaac contributed reporting. Jacob Silver and Ben Decker contributed research.


Germany Looks Into Covid Deniers’ Links With Far Right

The German domestic intelligence agency is keeping close tabs on a group of coronavirus deniers, who, in their protests against restrictions and tendency to believe in conspiracy theories, have found common cause with far-right extremists.

Government officials said the movement’s close ties to extremist organizations, such as the “Reich citizens” — or “Reichsbürger,” as they are known in German, referring to a group that refuses to accept the legitimacy of the modern German state — were troubling. Many of the coronavirus deniers say they also believe in QAnon conspiracy theories, and protesters are frequently seen holding signs with anti-Semitic tropes. A number of journalists have been attacked while covering the demonstrations.

A spokesman for the Interior Ministry said in a statement, “Our basic democratic order, as well as state institutions such as parliaments and governments, have faced multiple attacks since the beginning of the measures to contain the Covid-19 pandemic.” Several regional intelligence agencies have already been observing participants in the movement, he added.

The group of deniers, which started as a fringe movement last spring, has grown into a coordinated effort that organizes mass demonstrations across Germany. The rallies occasionally turn aggressive, and many have ended in scuffles with law enforcement officers.

Members of the AfD, a German right-wing populist party, have allied themselves with the protesters. The national intelligence agency’s formal observation of the deniers’ group is the first step in a procedure that could lead to it being declared anti-constitutional and ultimately banned.

A week ago, about 8,000 people in Berlin protested the passing of a law that gives the federal government power to implement tougher restrictions. Germany has seen persistently high numbers of new infections recently, averaging about 18,000 a day, according to a New York Times database, up from about 8,000 a day two months ago.


Extremists Find a Financial Lifeline on Twitch

Terpsichore Maras-Lindeman, a podcaster who fought to overturn the 2020 presidential election, recently railed against mask mandates to her 4,000 fans in a live broadcast and encouraged them to enter stores maskless. On another day, she grew emotional while thanking them for sending her $84,000.

Millie Weaver, a former correspondent for the conspiracy theory website Infowars, speculated on her channel that coronavirus vaccines could be used to surveil people. Later, she plugged her merchandise store, where she sells $30 “Drain the Swamp” T-shirts and hats promoting conspiracies.

And a podcaster who goes by Zak Paine or Redpill78, who pushes the baseless QAnon conspiracy theory, urged his viewers to donate to the congressional campaign of an Ohio man who has said he attended the “Stop the Steal” rally in Washington on Jan. 6.

Facebook, YouTube and other social media platforms clamped down on misinformation and hate speech ahead of the 2020 election.

Since then, far-right influencers have scattered to other services, including the livestreaming site Twitch and apps like Google Podcasts, as their options for spreading falsehoods have dwindled.

Twitch became a multibillion-dollar business thanks to video gamers broadcasting their play of games like Fortnite and Call of Duty. Fans, many of whom are young men, pay the gamers by subscribing to their channels or donating money. Streamers earn even more by sending their fans to outside sites to either buy merchandise or donate money.

Now Twitch has also become a place where right-wing personalities spread election and vaccine conspiracy theories, often without playing any video games. It is part of a shift at the platform, where streamers have branched out from games into fitness, cooking, fishing and other lifestyle topics in recent years.

But unlike fringe livestreaming sites like Dlive and Trovo, which have also offered far-right personalities moneymaking opportunities, Twitch attracts far larger audiences. On average, 30 million people visit the site each day, the platform said.

Twitch has stricter rules than other social media platforms for the kinds of views that users can express. It temporarily suspended Mr. Trump’s account for “hateful conduct” last summer, months before Facebook and Twitter made similar moves. Its community guidelines prohibit hateful conduct and harassment. Ms. Clemens, a Twitch spokeswoman, said the company was developing a misinformation policy.

This month, Twitch announced a policy that would allow it to suspend the accounts of people who committed crimes or severe offenses in real life or on other social media platforms, including violent extremism or membership in a known hate group. Twitch said it did not consider QAnon to be a hate group.

Despite all this, a Twitch channel belonging to Enrique Tarrio, the leader of the Proud Boys, a white nationalist organization, remained online until the middle of this month after The New York Times inquired about it. And the white nationalist Anthime Joseph Gionet, known as Baked Alaska, had a Twitch channel for months, even though he was arrested in January by the F.B.I. and accused of illegally storming the U.S. Capitol on Jan. 6. Twitch initially said his activities had not violated the platform’s policies, then barred him this month for hateful conduct.

Mr. Paine has continued to use Twitch to promote QAnon, which the F.B.I. has said is dangerous. Last week, he referred to a QAnon belief that people are killing children to “harvest” a chemical compound from them, then talked about a “criminal cabal” controlling the government, saying people do not understand “what plane of existence they come from.”

Mr. Paine, who is barred from Twitter and YouTube, has also asked his Twitch audience to donate to the House campaign of J.R. Majewski, an Air Force veteran in Toledo, Ohio, who attracted attention last year for painting his lawn to look like a Trump campaign banner. Mr. Majewski has used QAnon hashtags but distanced himself from the movement in an interview with his local newspaper, The Toledo Blade.

Mr. Majewski has appeared on Mr. Paine’s streams, where they vape, chat about Mr. Majewski’s campaign goals and take calls from listeners.

“He is exactly the type of person that we need to get in Washington, D.C., so that we can supplant these evil cabal criminal actors and actually run our own country,” Mr. Paine said on one stream.

Neither Mr. Paine nor Mr. Majewski responded to a request for comment.

Joan Donovan, a Harvard University researcher who studies disinformation and online extremism, said streamers who rely on their audience’s generosity to fund themselves felt pressured to continue raising the stakes.

“The incentive to lie, cheat, steal, hoax and scam is very high when the cash is easy to acquire,” she said.


Jack Dorsey Says Twitter Played a Role in U.S. Capitol Riot

“I don’t think anyone wants a world where you can only say things that private companies judge to be true.”

“Our mission is to organize the world’s information, and make it universally accessible and useful.”

“We believe in free debate and conversation to find the truth. At the same time, we must balance that with our desire for our service not to be used to sow confusion, division or destruction.”

“There are two faces to each of your platforms. Facebook has family and friends, neighborhood, but it is right next to the one where there is a white nationalist rally every day. YouTube is a place where people share quirky videos, but down the street, anti-vaxxers, Covid deniers, QAnon supporters and Flat Earthers are sharing videos.”

“You’ve failed to meaningfully change after your platform has played a role in fomenting insurrection, and abetting the spread of the virus and trampling American civil liberties. And while it may be true that some bad actors will shout ‘fire’ in the crowded theater by promoting harmful content, your platforms are handing them a megaphone to be heard in every theater across the country and the world. Your business model itself has become the problem.”

“How is it possible for you not to at least admit that Facebook played a central role or a leading role in facilitating the recruitment, planning and execution of the attack on the Capitol?”

“Chairman, my point is that I think that the responsibility here lies with the people who took the actions to break the law, and take and do the insurrection and secondarily, also the people who spread that content, including the president, but others as well.”

“Your platform bears some responsibility for disseminating disinformation related to the election and the ‘Stop the Steal’ movement that led to the attack on the Capitol. Just a yes or no answer.”

“Congressman, it’s a complex question. We —”

“OK, we’ll move on. Mr. Dorsey.”

“Yes, but you also have to take into consideration a broader ecosystem. It’s not just the technology platforms we use.”

“We’re all aware of big tech’s ever-increasing censorship of conservative voices and their commitment to serve the radical progressive agenda by influencing a generation of children — removing, shutting down or canceling any news, books and even now, toys, that aren’t considered woke.”

“First of all, do you recognize that there is a real concern, that there’s an anti-conservative bias on Twitter’s behalf? And would you recognize that this has to stop if this is going to be, Twitter is going to be viewed by both sides as a place where everybody is going to get a fair treatment?”

“We don’t write policy according to any particular political leaning. If we find any of it, we route it out.”


Lawmakers Grill Tech C.E.O.s on Capitol Riot, Getting Few Direct Answers

WASHINGTON — Lawmakers grilled the leaders of Facebook, Google and Twitter on Thursday about the connection between online disinformation and the Jan. 6 riot at the Capitol, causing Twitter’s chief executive to publicly admit for the first time that his product had played a role in the events that left five people dead.

When a Democratic lawmaker asked the executives to answer with a “yes” or a “no” whether the platforms bore some responsibility for the misinformation that had contributed to the riot, Jack Dorsey of Twitter said “yes.” Neither Mark Zuckerberg of Facebook nor Sundar Pichai of Google would answer the question directly.

The roughly five-hour hearing before a House committee marked the first time lawmakers directly questioned the chief executives regarding social media’s role in the January riot. The tech bosses were also peppered with questions about how their companies helped spread falsehoods around Covid-19 vaccines, enable racism and hurt children’s mental health.

It was also the first time the executives had testified since President Biden’s inauguration. Tough questioning from lawmakers signaled that scrutiny of Silicon Valley’s business practices would not let up, and could even intensify, with Democrats in the White House and leading both chambers of Congress.

In the middle of the hearing, Mr. Dorsey tweeted a single question mark with a poll that had two options: “Yes” or “No.” When asked about his tweet by a lawmaker, he said “yes” was winning.

The January riot at the Capitol has made the issue of disinformation deeply personal for lawmakers. The riot was fueled by false claims from President Donald J. Trump and others that the election had been stolen, which were rampant on social media.

Some of the participants had connections to QAnon and other online conspiracy theories. And prosecutors have said that groups involved in the riot, including the Oath Keepers and the Proud Boys, coordinated some of their actions on social media.

Republicans on the committee criticized the companies’ decisions to ban Mr. Trump and some of his associates after the Jan. 6 riot. The bans hardened views among conservatives that the companies are left-leaning and inclined to squelch conservative voices.

“We’re all aware of Big Tech’s ever-increasing censorship of conservative voices and their commitment to serve the radical progressive agenda,” said Representative Bob Latta of Ohio, the ranking Republican on the panel’s technology subcommittee.

The company leaders defended their businesses, saying they had invested heavily in hiring content moderators and in technology like artificial intelligence, used to identify and fight disinformation.

Mr. Zuckerberg argued against the notion that his company had a financial incentive to juice its users’ attention by driving them toward more extreme content. He said Facebook didn’t design “algorithms in order to just kind of try to tweak and optimize and get people to spend every last minute on our service.”

He added later in the hearing that election disinformation also spread in messaging apps, where amplification and algorithms do not aid the spread of false content. He also blamed television and other traditional media for spreading election lies.

The companies showed fissures in their views on regulation. Facebook has vocally supported internet regulations in a major advertising blitz on television and in newspapers. In the hearing, Mr. Zuckerberg suggested specific reforms to a key legal shield, known as Section 230 of the Communications Decency Act, that has helped Facebook and other Silicon Valley internet giants thrive.

The legal shield protects companies that host and moderate third-party content, and says companies like Google and Twitter are simply intermediaries of their user-generated content. Democrats have argued that with that protection, companies aren’t motivated to remove disinformation. Republicans accuse the companies of using the shield to moderate too much and to take down content that doesn’t represent their political viewpoints.

“I believe that Section 230 would benefit from thoughtful changes to make it work better for people,” Mr. Zuckerberg said in his written testimony.

He proposed that liability protection for companies be conditional on their ability to fight the spread of certain types of unlawful content. He said platforms should be required to demonstrate that they have systems in place for identifying unlawful content and removing it. Reforms, he said, should be different for smaller social networks, which wouldn’t have the same resources as Facebook to meet new requirements.

Mr. Pichai and Mr. Dorsey said they supported requirements for transparency in content moderation but stopped short of endorsing Mr. Zuckerberg’s other ideas. Mr. Dorsey said that it would be very difficult to distinguish a large platform from a smaller one.

Lawmakers did not appear to be won over.

“There’s a lot of smugness among you,” said Representative Bill Johnson, Republican of Ohio. “There’s this air of untouchable-ness in your responses to many of the tough questions that you’re being asked.”

Kate Conger and Daisuke Wakabayashi contributed reporting.
