The European Union has approved landmark legislation called the Digital Services Act, which requires social media platforms like Twitter to more aggressively police their services for hate speech, misinformation and illicit content.

The new law will require Twitter and other social media companies with more than 45 million users in the European Union to conduct annual risk assessments about the spread of harmful content on their platforms and outline plans to combat the problem. If they are not seen as doing enough, the companies can be fined up to 6 percent of their global revenue, or even be banned from the European Union for repeat offenses.

Inside Twitter, frustrations have mounted over Mr. Musk’s moderation plans, and some employees have wondered whether he would really halt their work at such a critical moment, as they prepare to moderate tweets about national elections in Brazil and the United States.

Adam Satariano contributed reporting.

What Happened When Facebook Employees Warned About Election Misinformation

WHAT HAPPENED

1. From Wednesday through Saturday there was a lot of content circulating that implied fraud in the election, at around 10% of all civic content and 1-2% of all US VPVs. There was also a fringe of incitement to violence.

2. There were dozens of employees monitoring this, and FB launched ~15 measures prior to the election, and another ~15 in the days afterwards. Most of the measures made existing processes more aggressive: e.g., by lowering thresholds, by making penalties more severe, or by expanding eligibility for existing measures. Some measures were qualitative: reclassifying certain types of content as violating, which it had not been before.

3. I would guess these measures reduced the prevalence of violating content by at least 2X. However, they had collateral damage (removing and demoting non-violating content), and the episode caused noticeable resentment among Republican Facebook users, who feel they are being unfairly targeted.

Facebook Said to Consider Forming an Election Commission

Facebook has approached academics and policy experts about forming a commission to advise it on global election-related matters, said five people with knowledge of the discussions, a move that would allow the social network to shift some of its political decision-making to an advisory body.

The proposed commission could decide on matters such as the viability of political ads and what to do about election-related misinformation, said the people, who spoke on the condition of anonymity because the discussions were confidential. Facebook is expected to announce the commission this fall in preparation for the 2022 midterm elections, they said, though the effort is preliminary and could still fall apart.

Outsourcing election matters to a panel of experts could help Facebook sidestep criticism of bias by political groups, two of the people said. The company has been blasted in recent years by conservatives, who have accused Facebook of suppressing their voices, as well as by civil rights groups and Democrats for allowing political misinformation to fester and spread online. Mark Zuckerberg, Facebook’s chief executive, does not want to be seen as the sole decision maker on political content, two of the people said.

The commission would be modeled in part on Facebook’s Oversight Board, a collection of journalism, legal and policy experts who adjudicate whether the company was correct to remove certain posts from its platforms. Facebook has pushed some content decisions to the Oversight Board for review, allowing it to show that it does not make determinations on its own.

Facebook selected the board’s initial members and pays them through a trust.

The Oversight Board’s highest-profile decision was reviewing Facebook’s suspension of former President Donald J. Trump after the Jan. 6 storming of the U.S. Capitol. At the time, Facebook opted to ban Mr. Trump’s account indefinitely, a penalty that the Oversight Board later deemed “not appropriate” because the time frame was not based on any of the company’s rules. The board asked Facebook to try again.

In June, Facebook responded by saying that it would bar Mr. Trump from the platform for at least two years. The Oversight Board has separately weighed in on more than a dozen other content cases that it calls “highly emblematic” of broader themes that Facebook grapples with regularly, including whether certain Covid-related posts should remain up on the network and hate speech issues in Myanmar.

A spokesman for the Oversight Board declined to comment.

Facebook has had a spotty track record on election-related issues, going back to Russian manipulation of the platform’s advertising and posts in the 2016 presidential election.

Ahead of the 2020 election, Facebook said it would bar the purchase of new political ads the week before the election, then later decided to temporarily ban all U.S. political advertising after the polls closed on Election Day, causing an uproar among candidates and ad-buying firms.

The company has struggled with how to handle lies and hate speech around elections. During his last year in office, Mr. Trump used Facebook to suggest he would use state violence against protesters in Minneapolis ahead of the 2020 election, while casting doubt on the electoral process as votes were tallied in November. Facebook initially said that what political leaders posted was newsworthy and should not be touched, before reversing course.

The social network has also faced difficulties in elections elsewhere, including the proliferation of targeted disinformation across its WhatsApp messaging service during the Brazilian presidential election in 2018. In 2019, Facebook removed hundreds of misleading pages and accounts associated with political parties in India ahead of the country’s national elections.

Facebook has tried various methods to stem the criticism. It established a political ads library to increase transparency around the buyers of those promotions, and it has set up war rooms to monitor elections for disinformation and prevent interference.

There are several elections in the coming year in countries such as Hungary, Germany, Brazil and the Philippines where Facebook’s actions will be closely scrutinized. Voter fraud misinformation has already begun spreading ahead of German elections in September. In the Philippines, Facebook has removed networks of fake accounts that support President Rodrigo Duterte, who used the social network to gain power in 2016.

“There is already this perception that Facebook, an American social media company, is going in and tilting elections of other countries through its platform,” said Nathaniel Persily, a law professor at Stanford University. “Whatever decisions Facebook makes have global implications.”

Internal conversations about an election commission date back at least a few months, said three people with knowledge of the matter.

An election commission would differ from the Oversight Board in one key way, the people said. While the Oversight Board waits for Facebook to remove a post or an account and then reviews that action, the election commission would proactively provide guidance without the company having made an earlier call, they said.

Tatenda Musapatike, who previously worked on elections at Facebook and now runs a nonprofit voter registration organization, said that many people had lost faith in the company’s ability to work with political campaigns. But the election commission proposal was “a good step,” she said, because “they’re doing something and they’re not saying we alone can handle it.”

Facebook, Fearing Public Outcry, Shelved Earlier Report on Popular Posts

When Facebook this week released its first quarterly report about the most viewed posts in the United States, Guy Rosen, its vice president of integrity, said the social network had undertaken “a long journey” to be “by far the most transparent platform on the internet.” The list showed that the posts with the most reach tended to be innocuous content like recipes and cute animals.

Facebook had prepared a similar report for the first three months of the year, but executives never shared it with the public because of concerns that it would look bad for the company, according to internal emails sent by executives and shared with The New York Times.

In that report, a copy of which was provided to The Times, the most-viewed link was a news article with a headline suggesting that the coronavirus vaccine was at fault for the death of a Florida doctor. The report also showed that a Facebook page for The Epoch Times, an anti-China newspaper that spreads right-wing conspiracy theories, was the 19th-most-popular page on the platform for the first three months of 2021.

The report was nearing public release when some executives, including Alex Schultz, Facebook’s vice president of analytics and chief marketing officer, debated whether it would cause a public relations problem, according to the internal emails. The company decided to shelve it.

Lawmakers and the White House have called on the company to share more information about false and misleading information on the site, and to do a better job of stopping its spread. Last month, President Biden accused the company of “killing people” by allowing false information to circulate widely, a statement the White House later softened. Other federal agencies have accused Facebook of withholding key data.

Facebook has pushed back, publicly accusing the White House of scapegoating the company for the administration’s failure to reach its vaccination goals. Executives at Facebook, including Mark Zuckerberg, its chief executive, have said the platform has been aggressively removing Covid-19 misinformation since the start of the pandemic. The company said it had removed over 18 million pieces of misinformation in that period.

But Brian Boland, a former vice president of product marketing at Facebook, said there was plenty of reason to be skeptical of data collected and released by a company with a history of protecting its own interests.

That page, Trending World, is run by The Epoch Times, which has been barred from advertising on Facebook because of its repeated violations of the platform’s political advertising policy.

Trending World, according to the report, was viewed by 81.4 million accounts, slightly fewer than the 18th-most-popular page, Fox News, which had 81.7 million content viewers for the first three months of 2021.

Facebook’s transparency report released on Wednesday also showed that an Epoch Times subscription link was among the most viewed in the United States. With some 44.2 million accounts seeing the link in April, May and June, it was about half as popular as Trending World in the shelved report.

Sheera Frenkel and Mike Isaac contributed reporting. Jacob Silver and Ben Decker contributed research.
