An employee asked Nick Clegg, who leads public affairs, to explain the company’s role in removing content tied to the Israeli-Palestinian conflict, according to attendees. The employee called the situation in Israel “fraught” and asked how Facebook was going “to get it right” with content moderation.

Mr. Clegg ran through a list of policy rules and plans going forward, and assured staff that moderation decisions would be handled fairly and responsibly, two people familiar with the meeting said. The discussion was cordial, one of the people said, and comments in the chat box beside Mr. Clegg’s response were largely positive.

But some employees were dissatisfied, the people said. As Mr. Clegg spoke, they broke off into private chats and workplace groups, known as Tribes, to discuss what to do.

Dozens of employees later formed a group to gather the Palestinian content they said had been suppressed and flag it to internal content moderation teams, two employees said. The goal was to have the posts reinstated, they said.

Members of Facebook’s policy team have tried calming the tensions. In an internal memo in mid-May, which was reviewed by The Times, two policy team members wrote to other employees that they hoped “that Facebook’s internal community will resist succumbing to the division and demonization of the other side that is so brutally playing itself out offline and online.”

One of them was Muslim, and the other was Jewish, they said.

“We don’t always agree,” they wrote. “However, we do some of our best work when we assume good intent and recognize that we are on the same side trying to serve our community in the best possible way.”

Dozens of state prosecutors tell Facebook to stop its plans for a children’s version of Instagram.

Attorneys general for 44 states and jurisdictions called on Facebook to halt plans to create a version of Instagram for young children, citing concerns over mental and emotional well-being, exposure to online predators and cyberbullying.

In a letter on Monday to Facebook’s chief executive, Mark Zuckerberg, the prosecutors warned that social media can be harmful to children and that the company had a poor record of protecting children online. Facebook, which bought the photo-sharing app Instagram in 2012, currently has a minimum age requirement of 13 to use its products. According to federal children’s privacy rules, companies must ask parents for permission to collect data on users younger than 13.

The law enforcement officials pointed to research showing how the use of social media, including Instagram, has led to an increase in mental distress, body image concerns and even suicidal thoughts. A children’s version of Instagram doesn’t fill a need beyond the company’s commercial ambitions, the officials said in the letter.

“Without a doubt, this is a dangerous idea that risks the safety of our children and puts them directly in harm’s way,” Letitia James, New York’s attorney general, said in a statement. “There are too many concerns to let Facebook move forward with this ill-conceived idea, which is why we are calling on the company to abandon its launch of Instagram Kids.”

Facebook defended its plans, saying its development of a children’s version of Instagram would have safety and privacy in mind. It wouldn’t show ads on the app, the company vowed.

“As every parent knows, kids are already online,” Andy Stone, a Facebook spokesman, said in a statement. “We want to improve this situation by delivering experiences that give parents visibility and control over what their kids are doing.”

Facebook Oversight Board Tells Zuckerberg He’s the Decider on Trump

When Mr. Zuckerberg first pitched the idea of a “Facebook Supreme Court” several years ago, he promoted it as a way to make the company’s governance more democratic, by forming an independent body of subject matter experts and giving them the power to hear appeals from users.

“I think in any kind of good-functioning democratic system, there needs to be a way to appeal,” Mr. Zuckerberg told Ezra Klein in a 2018 Vox podcast.

The oversight board also served another purpose. For years, Mr. Zuckerberg had been called in as Facebook’s policy judge of last resort. (In 2018, for example, he got personally involved in the decision to bar Alex Jones, the Infowars conspiracy theorist.) But high-profile moderation decisions were often unpopular, and the blowback was often fierce. If it worked, the oversight board would take responsibility for making the platform’s most contentious content decisions, while shielding Mr. Zuckerberg and his policy team from criticism.

It’s hard to imagine a dispute Mr. Zuckerberg would be more eager to avoid than the one about Mr. Trump. The former president rode Facebook to the White House in 2016, then tormented the company by repeatedly skirting its rules and daring executives to punish him for it. When they finally did, Republicans raged at Mr. Zuckerberg and his lieutenants, accusing them of politically motivated censorship.

Facebook faced plenty of pressure in the other direction, too — both from Democrats and civil rights groups and from employees, many of whom saw Mr. Trump’s presence on Facebook as fundamentally incompatible with their goal of reducing harmful misinformation and hate speech. No matter what Mr. Zuckerberg and his team decided, they were sure to inflame the online speech wars and make more enemies.

Before the decision on Wednesday, Mr. Zuckerberg and other Facebook executives did everything they could to convince a skeptical public that the oversight board would have real teeth. They funded the group through a legally independent trust, filled it with hyper-credentialed experts and pledged to abide by its rulings.

But for all its claims of legitimacy, the oversight board has always had a Potemkin quality to it. Its leaders were selected by Facebook, and its members are (handsomely) paid out of the company’s pockets. Its mandate is limited, and none of its rulings are binding, in any meaningful sense of that word. If Mr. Zuckerberg decided tomorrow to ignore the board’s advice and reinstate Mr. Trump’s accounts, nothing — no act of Congress, no judicial writ, no angry letter from Facebook shareholders — could stop him.

Nick Clegg Steers Facebook’s Trump Decision

Facebook wanted Mr. Clegg to help repair its relationships with regulators, political leaders and the media after the Cambridge Analytica scandal, when data improperly pulled from Facebook was used to create voter profiles. Mr. Clegg’s international experience and comfort in five languages — English, Spanish, French, German and Dutch — appealed to the American-centric company.

Friends said Mr. Clegg had initially been reluctant to join Facebook, one of the world’s most polarizing corporations. But he wanted to be back at the center of important political and policy debates. In a memo outlining how he envisioned the role, he argued that it was unsustainable for a private company like Facebook, rather than democratically elected governments, to have so much power, especially on speech-related issues.

“My advice was strongly to go for it,” said Tony Blair, the former British prime minister, whom Mr. Clegg spoke with before taking the job, “because you’re going to be part of one of the most powerful companies in the world at a moment of enormous change in the world, and when technology is at the heart of that change.”

Inside Facebook, where Mr. Zuckerberg leans on a group of friends and early employees for counsel, Mr. Clegg earned the trust of his new boss. At the company’s headquarters, where proximity to Mr. Zuckerberg is power, Mr. Clegg’s desk was placed nearby. He orchestrated a trip through Europe with Mr. Zuckerberg, meeting with European Union leaders in Brussels and President Emmanuel Macron of France in Paris.

Since Mr. Clegg’s arrival, Facebook has shifted some of its policy positions. It now appears more accepting of regulation and higher taxes. He overcame reluctance from Mr. Zuckerberg and others in the company to ban political ads in the weeks before Election Day last year. And he was the main internal supporter for recently announced product changes that give users more control over what posts they see in their Facebook feeds.

“He has a track record of knowing what it’s like to work inside a cabinet that needs to make decisions quickly and move at the speed of a country, or in this case a platform,” said Chris Cox, Facebook’s chief product officer, who worked with Mr. Clegg on the user-control changes.

Facebook’s Ban of Trump Upheld by Oversight Board

SAN FRANCISCO — A Facebook-appointed panel of journalists, activists and lawyers ruled on Wednesday to uphold the social network’s ban of former President Donald J. Trump, ending any immediate return by Mr. Trump to mainstream social media and renewing a debate about tech power over online speech.

Facebook’s Oversight Board, which acts as a quasi-court to deliberate the company’s content decisions, said the social network was right to bar Mr. Trump after he used the site to foment an insurrection in Washington in January. The panel said the ongoing risk of violence “justified” the suspension.

But the board also said that Facebook’s penalty of an indefinite suspension was “not appropriate,” and that the company should apply a “defined penalty.” The board gave Facebook six months to make its final decision on Mr. Trump’s account status.

“Our sole job is to hold this extremely powerful organization, Facebook, accountable,” Michael McConnell, co-chair of the Oversight Board, said on a call with reporters. Facebook’s penalty of an indefinite suspension, he said, “did not meet these standards.”

Twitter and YouTube had also cut off Mr. Trump in January after the insurrection at the Capitol building, saying the risk of harm and the potential for violence that he created was too great.

But while Mr. Trump’s Facebook account remains suspended for now, he may yet be able to return to the social network once the company reviews its action. On Tuesday, Mr. Trump unveiled a new site, “From the desk of Donald J. Trump,” to communicate with his supporters. It looked much like a Twitter feed, complete with posts written by Mr. Trump that could be shared on Facebook, Twitter and YouTube.

Mr. Trump’s continuing suspension from Facebook gave conservatives, who have long accused the social media companies of suppressing right-wing voices, new fuel against the platforms. Mark Zuckerberg, Facebook’s chief executive, has testified in Congress several times in recent years about whether the social network has shown bias against conservative political views. He has denied it.

In a tweet, the Republican members of the House Judiciary Committee said of the board’s decision, “Pathetic.”

While Mr. Zuckerberg has said that he does not wish his company to be “the arbiter of truth” in social discourse, Facebook has become increasingly active about the kinds of content it allows. To prevent the spread of misinformation, the company has cracked down on QAnon conspiracy theory groups, election falsehoods and anti-vaccination content in recent months, culminating in the blocking of Mr. Trump in January.

“This case has dramatic implications for the future of speech online because the public and other platforms are looking at how the oversight board will handle what is a difficult controversy that will arise again around the world,” said Nate Persily, a professor at Stanford University’s law school.

He added, “President Trump has pushed the envelope about what is permissible speech on these platforms and he has set the outer limits such that if you are unwilling to go after him, you are allowing a large amount of incitement and hate speech and disinformation online that others are going to propagate.”

In a statement, Facebook said it was “pleased” that the board recognized that its barring of Mr. Trump in January was justified. The company added that it would consider the ruling and “determine an action that is clear and proportionate.”

Mr. Trump’s case is the most prominent that the Facebook Oversight Board, which was conceived in 2018, has handled. The board, which is made up of 20 journalists, activists and former politicians, reviews and adjudicates the company’s most contested content moderation decisions. Mr. Zuckerberg has repeatedly referred to it as the “Facebook Supreme Court.”

But while the panel is positioned as independent, it was founded and funded by Facebook and has no legal or enforcement authority. Critics have been skeptical of the board’s autonomy and have said it gives Facebook the ability to punt on difficult decisions.

revoke Section 230, a legal shield that protects companies like Facebook from liability for what users post.

privately with Mr. Trump.

The politeness ended on Jan. 6. Hours before his supporters stormed the Capitol, Mr. Trump used Facebook and other social media to try to cast doubt on the results of the presidential election, which he had lost to Joseph R. Biden Jr. Mr. Trump wrote on Facebook, “Our Country has had enough, they won’t take it anymore!”

Less than 24 hours later, Mr. Trump was barred from the platform indefinitely. While his Facebook page has remained up, it has been dormant. His last Facebook post, on Jan. 6, read, “I am asking for everyone at the U.S. Capitol to remain peaceful. No violence!”

Cecilia Kang contributed reporting from Washington.

What Is the Facebook Oversight Board?

An independent panel called the Facebook Oversight Board on Wednesday upheld Facebook’s ban on former President Donald J. Trump, but said the company must review its decision to impose an indefinite suspension.

The company suspended Mr. Trump’s account on Jan. 7, after he used social media accounts to incite a mob of supporters to attack the Capitol a day earlier. The board gave Facebook six months to determine its final decision on Mr. Trump’s account status.

Here are key facts to know about the Facebook Oversight Board and its decision:

The board is a panel of about 20 former political leaders, human rights activists and journalists picked by Facebook to deliberate the company’s content decisions. It began a year ago and is based in London.

In 2018, Facebook’s chief executive, Mark Zuckerberg, conceived the idea of an independent body that would act like a Supreme Court. The idea was for the public to have a way to appeal decisions by Facebook to remove content that violates its policies against harmful and hateful posts. Mr. Zuckerberg said neither he nor the company wanted to have the final decision on speech.

The company and paid members of the panel stress that the board is independent. But Facebook funds the board with a $130 million trust and top executives played a big role in its formation.

So far the board has issued a handful of decisions on minor takedowns by Facebook. The majority of the rulings have overturned Facebook’s decisions.

Two weeks after Facebook decided to temporarily lock the account of Mr. Trump, the company said it would refer the case to the Oversight Board, effectively punting to outsiders a final decision on the former president.

In a blog post, the company explained that executives had blocked Mr. Trump’s account because he had violated the company’s policies against the incitement of violence and that the deadly storming of the Capitol defied the company’s belief in a peaceful transition of government and the democratic process.

“We look forward to receiving the board’s decision — and we hope, given the clear justification for our actions on Jan. 7, that it will uphold the choices we made,” Nick Clegg, Facebook’s vice president for global affairs, said in the post.

The board will tell Facebook to remove the ban, or to keep it. The ruling may also come with more nuance. The board could say that the ban was appropriate at the time but is no longer necessary, or that the ban was the wrong decision from the start.

The company then has seven days to put the board’s ruling into effect.

The board takes cases that are referred by Facebook or the public. The panel then selects five members to deliberate on each case first, at least one of whom is from the region involved in the case.

The members meet to discuss the case and vet public comments. More than 9,000 comments were submitted on Mr. Trump’s account. The board extended its 90-day deadline on decisions for the Trump case because of the high volume of public comments. The board will base its decision on two main criteria: whether Facebook’s ban on Mr. Trump followed the company’s community standards and whether it adhered to human rights laws. When the smaller panel of board members reaches a majority, the decision is taken to the full board for a vote.

In the Trump case, Facebook also asked the board to give policy recommendations on how to handle the accounts of political leaders. The company doesn’t have to adopt the recommendations.

If Facebook follows its own rules, the board’s ruling is binding. The company has said that all decisions from the oversight board are binding, and that even Mr. Zuckerberg couldn’t overturn the rulings. (Mr. Trump was also barred, permanently, from Twitter, where he had some 88 million followers.)

But there is no body that enforces this agreement between Facebook and the board. Facebook has rejected one recommendation by the board that dealt with the takedown of a Covid-19 post. The company says recommendations are different from rulings and are not binding.

Facebook panel will reveal on Wednesday whether Trump will regain his megaphone.

Facebook’s Oversight Board, an independent and international panel that was created and funded by the social network, plans to announce on Wednesday whether former President Donald J. Trump will be able to return to the platform that has been a critical megaphone for him and his tens of millions of followers.

The decision will be closely watched as a template for how private companies that run social networks handle political speech, including the misinformation spread by political leaders.

Mr. Trump was indefinitely locked out of Facebook on Jan. 7 after he used his social media accounts to incite a mob of his supporters to storm the Capitol a day earlier. Mr. Trump had declined to accept his election defeat, saying the election had been stolen from him.

At the time that Facebook barred Mr. Trump, the company’s chief executive, Mark Zuckerberg, wrote in a post: “We believe the risks of allowing the president to continue to use our service during this period are simply too great.”

Later in January, the company referred the case of Mr. Trump to Facebook’s Oversight Board for a final decision on whether the ban should be permanent. Facebook and the board’s members have said the panel’s decisions are binding, but critics are skeptical of the board’s independence. The panel, critics said, is a first-of-its-kind Supreme Court-like entity on online speech, funded by a private company with a poor track record of enforcing its own rules.

Facebook’s approach to political speech has been inconsistent. In October 2019, Mr. Zuckerberg declared that the company would not fact-check political speech, saying that even lies by politicians deserved a place on the social network because it was in the public’s interest to hear all ideas from political leaders. But Mr. Trump’s comments on Jan. 6 were different, the company has said, because they incited violence and threatened the peaceful transition of power.

On Monday, Mr. Trump continued to deny the election results.

“The Fraudulent Presidential Election of 2020 will be, from this day forth, known as THE BIG LIE!,” he said in an emailed statement.

Facebook nearly doubles its profit and revenue rises 48 percent, as tech booms.

Facebook said on Wednesday that revenue rose 48 percent to $26.2 billion in the first three months of the year, while profits nearly doubled to $9.5 billion, underlining how the social network has continued to benefit during the pandemic.

Advertising revenue, which makes up the bulk of Facebook’s income, rose 46 percent to $25.4 billion. Nearly 3.5 billion people now use one of Facebook’s apps every month, up 15 percent from a year earlier.

The results followed a blockbuster financial performance in 2020, as the pandemic pushed people indoors toward their computers and other devices — and onto the social network and its associated apps like Instagram, WhatsApp and Messenger — in ever-increasing numbers. Facebook recorded highs in users and revenues and its services were in such demand that engineers at times struggled to “keep the lights on.”

Yet Wall Street is now expected to scrutinize Facebook’s advertising business closely. On Monday, Apple rolled out an update to its mobile software with a new feature that asks people whether they want to allow apps like Facebook to track their activity across other apps. If people choose not to be tracked, that could hurt Facebook’s business, which relies on user data to target advertising.

Facebook cut off Mr. Trump from the platform after the riot, though a final decision about whether to keep him off the site indefinitely has not been made.
