An election commission would differ from the Oversight Board in one key way, the people said. While the Oversight Board waits for Facebook to remove a post or an account and then reviews that action, the election commission would proactively provide guidance without the company having made an earlier call, they said.

Tatenda Musapatike, who previously worked on elections at Facebook and now runs a nonprofit voter registration organization, said that many have lost faith in the company’s ability to work with political campaigns. But the election commission proposal was “a good step,” she said, because “they’re doing something and they’re not saying we alone can handle it.”

Here’s a Look Inside Facebook’s Data Wars

“Reach leaderboard isn’t a total win from a comms point of view,” Mr. Silverman wrote.

Mr. Schultz, Facebook’s chief marketing officer, had the dimmest view of CrowdTangle. He wrote that he thought “the only way to avoid stories like this” would be for Facebook to publish its own reports about the most popular content on its platform, rather than releasing data through CrowdTangle.

“If we go down the route of just offering more self-service data you will get different, exciting, negative stories in my opinion,” he wrote.

Mr. Osborne, the Facebook spokesman, said Mr. Schultz and the other executives were discussing how to correct misrepresentations of CrowdTangle data, not strategizing about killing off the tool.

A few days after the election in November, Mr. Schultz wrote a post for the company blog, called “What Do People Actually See on Facebook in the U.S.?” He explained that if you ranked Facebook posts based on which got the most reach, rather than the most engagement — his preferred method of slicing the data — you’d end up with a more mainstream, less sharply partisan list of sources.

“We believe this paints a more complete picture than the CrowdTangle data alone,” he wrote.

That may be true, but there’s a problem with reach data: Most of it is inaccessible and can’t be vetted or fact-checked by outsiders. We simply have to trust that Facebook’s own, private data tells a story that’s very different from the data it shares with the public.

Mr. Zuckerberg is right about one thing: Facebook is not a giant right-wing echo chamber.

But it does contain a giant right-wing echo chamber — a kind of AM talk radio built into the heart of Facebook’s news ecosystem, with a hyper-engaged audience of loyal partisans who love liking, sharing and clicking on posts from right-wing pages, many of which have gotten good at serving up Facebook-optimized outrage bait at a consistent clip.

CrowdTangle’s data made this echo chamber easier for outsiders to see and quantify. But it didn’t create it, or give it the tools it needed to grow — Facebook did — and blaming a data tool for these revelations makes no more sense than blaming a thermometer for bad weather.

Facebook’s Trump Ban Will Last at Least 2 Years

In an emailed statement, Mr. Trump said Facebook’s ruling was “an insult to the record-setting 75M people, plus many others, who voted for us in the 2020 Rigged Presidential Election.” He added that Facebook should not be allowed to get away with “censoring and silencing” him and others on the platform.

Facebook’s broader shift to no longer automatically exempt speech by politicians from its rules is a stark reversal from a free-speech position that Mark Zuckerberg, the company’s chief executive, had championed. In a 2019 address at Georgetown University, Mr. Zuckerberg said, “People having the power to express themselves at scale is a new kind of force in the world — a Fifth Estate alongside the other power structures of society.”

But that stance drew criticism from lawmakers, activists and Facebook’s own employees, who said the company allowed misinformation and other harmful speech from politicians to flow unhindered.

While many academics and activists welcomed Facebook’s changes on Friday as a step in the right direction, they said the implementation of the new rules would be tricky. The company would likely enter into a complicated dance with global leaders who had grown accustomed to receiving special treatment from the platform, they said.

“This change will result in speech by world leaders being subject to more scrutiny,” said David Kaye, a law professor and former United Nations monitor for freedom of expression. “It will be painful for leaders who aren’t used to the scrutiny, and it will also lead to tensions.”

Countries including India, Turkey and Egypt have threatened to take action against Facebook if it acts against the interests of the ruling parties, Mr. Kaye said. The countries have said they might punish Facebook’s local staff or ban access to the service, he said.

“This decision by Facebook imposes new political calculations for both these global leaders, and for Facebook,” Mr. Kaye said.

This is a developing story. Check back for updates.

Maggie Haberman contributed reporting.

India and Israel Inflame Facebook’s Fights With Its Own Employees

SAN FRANCISCO — When India’s government ordered Facebook and other tech companies to take down posts critical of its handling of the coronavirus pandemic in April, the social network complied on some posts.

But once it did, its employees flocked to online chat rooms to ask why Facebook had helped Prime Minister Narendra Modi of India stifle dissent. In one internal post, which was reviewed by The New York Times, an employee with family in India accused Facebook of “being afraid” that Mr. Modi would ban the company from doing business in the country. “We can’t act or make decisions out of fear,” he wrote.

Weeks later, when clashes broke out in Israel between Israelis and Palestinians, Facebook removed posts from prominent Palestinian activists and briefly banned hashtags related to the violence. Facebook employees again took to the message boards to ask why their company now appeared to be censoring pro-Palestinian content.

“It just feels like, once again, we are erring on the side of a populist government and making decisions due to politics, not policies,” one worker wrote in an internal message that was reviewed by The Times.

Employee unrest at Facebook had long centered on inflammatory posts from former President Donald J. Trump. But since Mr. Trump left office in January, attention has shifted to Facebook’s global policies and what employees said was the company’s acquiescence to governments so that it could continue profiting in those countries.

“There’s a feeling among people at Facebook that this is a systematic approach, one which favors strong government leaders over the principles of doing what is right and correct,” said Ashraf Zeitoon, Facebook’s former head of policy for the Middle East and North Africa region, who left in 2017.

Facebook is increasingly caught in a vise. In India, Russia and elsewhere, governments are pressuring it to remove content as they try to corral the platform’s power over online speech. But when Facebook complies with takedown orders, it upsets its own employees, who say the social network has helped authoritarian leaders and repressive regimes quash activists and silence marginalized communities.

BuzzFeed News and the Financial Times earlier reported on some of the employee dissatisfaction at Facebook over Israeli and Palestinian content.

A divide between Facebook’s employees and the global policy team, which is composed of roughly 1,000 employees, has existed for years, current and former workers said. The policy team reports to Sheryl Sandberg, the chief operating officer.

Facebook has navigated many tricky international situations over the years, including in Russia, Vietnam and Myanmar, where it has had to consider whether it would be shut down if it did not work with governments. That has led to employee dissent, which has begun spilling into public view.

That became evident with India. In April, as Covid-19 cases soared in the country, Mr. Modi’s government called for roughly 100 social media posts on Facebook, Instagram and Twitter to be pulled down. Many of the posts included critiques of the government from opposition politicians and calls for Mr. Modi’s resignation.

Facebook removed some of the posts and briefly blocked a hashtag, #ResignModi. The company later said the hashtag had been banned by mistake and was not part of a government request.

But internally, the damage was done. In online chat rooms dedicated to human rights issues and global policy, employees described how disappointed they were with Facebook’s actions. Some shared stories of family members in India who were worried they were being censored.

Last month, when violence broke out between Israelis and Palestinians, reports surfaced that Facebook had erased content from Palestinian activists. Facebook’s Instagram app also briefly banned the #AlAqsa hashtag, a reference to Al Aqsa Mosque, one of Islam’s holiest sites. Facebook later explained that it had confused the #AlAqsa hashtag with a Palestinian militant group called Al Aqsa Martyrs Brigade.

Employees bristled. “We are responding to people’s protests about censoring with more censoring?” one wrote in an internal message, which was reviewed by The Times.

At a companywide meeting, an employee asked Nick Clegg, who leads public affairs, to explain the company’s role in removing content tied to the Israeli-Palestinian conflict, according to attendees. The employee called the situation in Israel “fraught” and asked how Facebook was going “to get it right” with content moderation.

Mr. Clegg ran through a list of policy rules and plans going forward, and assured staff that moderation would be treated with fairness and responsibility, two people familiar with the meeting said. The discussion was cordial, one of the people said, and comments in the chat box beside Mr. Clegg’s response were largely positive.

But some employees were dissatisfied, the people said. As Mr. Clegg spoke, they broke off into private chats and workplace groups, known as Tribes, to discuss what to do.

Dozens of employees later formed a group to flag to internal content moderation teams the Palestinian content that they said had been suppressed, two employees said. The goal was to have the posts reinstated online, they said.

Members of Facebook’s policy team have tried to calm the tensions. In an internal memo in mid-May, which was reviewed by The Times, two policy team members wrote to other employees that they hoped “that Facebook’s internal community will resist succumbing to the division and demonization of the other side that is so brutally playing itself out offline and online.”

One of them was Muslim, and the other was Jewish, they said.

“We don’t always agree,” they wrote. “However, we do some of our best work when we assume good intent and recognize that we are on the same side trying to serve our community in the best possible way.”

Dozens of state prosecutors tell Facebook to stop its plans for a children’s version of Instagram.

Attorneys general for 44 states and jurisdictions called on Facebook to halt plans to create a version of Instagram for young children, citing concerns over mental and emotional well-being, exposure to online predators and cyberbullying.

In a letter on Monday to Facebook’s chief executive, Mark Zuckerberg, the prosecutors warned that social media can be harmful to children and that the company had a poor record of protecting children online. Facebook, which bought the photo-sharing app Instagram in 2012, currently has a minimum age requirement of 13 to use its products. According to federal children’s privacy rules, companies must ask parents for permission to collect data on users younger than 13.

The law enforcement officials pointed to research showing how the use of social media, including Instagram, has led to an increase in mental distress, body image concerns and even suicidal thoughts. A children’s version of Instagram doesn’t fill a need beyond the company’s commercial ambitions, the officials said in the letter.

“Without a doubt, this is a dangerous idea that risks the safety of our children and puts them directly in harm’s way,” Letitia James, New York’s attorney general, said in a statement. “There are too many concerns to let Facebook move forward with this ill-conceived idea, which is why we are calling on the company to abandon its launch of Instagram Kids.”

Facebook defended its plans, saying its development of a children’s version of Instagram would have safety and privacy in mind. It wouldn’t show ads on the app, the company vowed.

“As every parent knows, kids are already online,” Andy Stone, a Facebook spokesman, said in a statement. “We want to improve this situation by delivering experiences that give parents visibility and control over what their kids are doing.”

Facebook Oversight Board Tells Zuckerberg He’s the Decider on Trump

When Mr. Zuckerberg first pitched the idea of a “Facebook Supreme Court” several years ago, he promoted it as a way to make the company’s governance more democratic, by forming an independent body of subject matter experts and giving them the power to hear appeals from users.

“I think in any kind of good-functioning democratic system, there needs to be a way to appeal,” Mr. Zuckerberg told Ezra Klein in a 2018 Vox podcast.

The oversight board also served another purpose. For years, Mr. Zuckerberg had been called in as Facebook’s policy judge of last resort. (In 2018, for example, he got personally involved in the decision to bar Alex Jones, the Infowars conspiracy theorist.) But high-profile moderation decisions were often unpopular, and the blowback was often fierce. If it worked, the oversight board would take responsibility for making the platform’s most contentious content decisions, while shielding Mr. Zuckerberg and his policy team from criticism.

It’s hard to imagine a dispute Mr. Zuckerberg would be more eager to avoid than the one about Mr. Trump. The former president rode Facebook to the White House in 2016, then tormented the company by repeatedly skirting its rules and daring executives to punish him for it. When they finally did, Republicans raged at Mr. Zuckerberg and his lieutenants, accusing them of politically motivated censorship.

Facebook faced plenty of pressure in the other direction, too — both from Democrats and civil rights groups and from employees, many of whom saw Mr. Trump’s presence on Facebook as fundamentally incompatible with their goal of reducing harmful misinformation and hate speech. No matter what Mr. Zuckerberg and his team decided, they were sure to inflame the online speech wars and make more enemies.

Before the decision on Wednesday, Mr. Zuckerberg and other Facebook executives did everything they could to convince a skeptical public that the oversight board would have real teeth. They funded the group through a legally independent trust, filled it with hyper-credentialed experts and pledged to abide by its rulings.

But for all its claims of legitimacy, the oversight board has always had a Potemkin quality to it. Its leaders were selected by Facebook, and its members are (handsomely) paid out of the company’s pockets. Its mandate is limited, and none of its rulings are binding, in any meaningful sense of that word. If Mr. Zuckerberg decided tomorrow to ignore the board’s advice and reinstate Mr. Trump’s accounts, nothing — no act of Congress, no judicial writ, no angry letter from Facebook shareholders — could stop him.
