John Tye, the founder of Whistleblower Aid, a legal nonprofit that represents people seeking to expose potential lawbreaking, was contacted this spring through a mutual connection by a woman who claimed to have worked at Facebook.
The woman told Mr. Tye and his team something intriguing: She had access to tens of thousands of pages of internal documents from the world’s largest social network. In a series of calls, she asked for legal protection and a path to releasing the confidential information. Mr. Tye, who said he understood the gravity of what the woman brought “within a few minutes,” agreed to represent her and call her by the alias “Sean.”
She “is a very courageous person and is taking a personal risk to hold a trillion-dollar company accountable,” he said.
On Sunday, Frances Haugen revealed herself to be “Sean,” the whistle-blower against Facebook. A product manager who worked for nearly two years on the civic misinformation team at the social network before leaving in May, Ms. Haugen has used the documents she amassed to expose how much Facebook knew about the harms that it was causing and provided the evidence to lawmakers, regulators and the news media.
The documents showed, among other things, that Facebook knew Instagram was worsening body image issues among teenagers and that it maintained a two-tier justice system — revelations that have spurred criticism from lawmakers, regulators and the public.
Ms. Haugen has also filed a whistle-blower complaint with the Securities and Exchange Commission, accusing Facebook of misleading investors with public statements that did not match its internal actions. And she has talked with lawmakers such as Senator Richard Blumenthal, Democrat of Connecticut, and Senator Marsha Blackburn, Republican of Tennessee, and shared subsets of the documents with them.
The spotlight on Ms. Haugen is set to grow brighter. On Tuesday, she is scheduled to testify in Congress about Facebook’s impact on young users.
Her disclosures follow several years of leaks from Facebook insiders troubled by the company’s handling of issues like misinformation and hate speech.
In 2018, Christopher Wylie, a disgruntled former employee of the consulting firm Cambridge Analytica, set the stage for those leaks. Mr. Wylie spoke with The New York Times, The Observer of London and The Guardian to reveal that Cambridge Analytica had improperly harvested Facebook data to build voter profiles without users’ consent.
In the aftermath, more of Facebook’s own employees started speaking up. Later that same year, Facebook workers provided executive memos and planning documents to news outlets including The Times and BuzzFeed News. In mid-2020, employees who disagreed with Facebook’s decision to leave up a controversial post from President Donald J. Trump staged a virtual walkout and sent more internal information to news outlets.
“I think over the last year, there’ve been more leaks than I think all of us would have wanted,” Mark Zuckerberg, Facebook’s chief executive, said in a meeting with employees in June 2020.
Facebook tried to preemptively push back against Ms. Haugen. On Friday, Nick Clegg, Facebook’s vice president for policy and global affairs, sent employees a 1,500-word memo laying out what the whistle-blower was likely to say on “60 Minutes” and calling the accusations “misleading.” On Sunday, Mr. Clegg appeared on CNN to defend the company, saying the platform reflected “the good, the bad and ugly of humanity” and that it was trying to “mitigate the bad, reduce it and amplify the good.”
Ms. Haugen’s identity was also revealed on a personal website, where she was described as “an advocate for public oversight of social media.”
A native of Iowa City, Iowa, Ms. Haugen studied electrical and computer engineering at Olin College and got an M.B.A. from Harvard, the website said. She then worked on algorithms at Google, Pinterest and Yelp. In June 2019, she joined Facebook. There, she handled democracy and misinformation issues, as well as working on counterespionage, according to the website.
Her disclosures arrive as regulators step up pressure on the company; the Federal Trade Commission has filed an antitrust suit against Facebook. In a video posted by Whistleblower Aid on Sunday, Ms. Haugen said she did not believe breaking up Facebook would solve the problems inherent at the company.
“The path forward is about transparency and governance,” she said in the video. “It’s not about breaking up Facebook.”
Ms. Haugen has also spoken to lawmakers in France and Britain, as well as a member of the European Parliament. This month, she is scheduled to appear before a British parliamentary committee. That will be followed by stops at Web Summit, a technology conference in Lisbon, and in Brussels to meet with European policymakers in November, Mr. Tye said.
On Sunday, a GoFundMe page that Whistleblower Aid created for Ms. Haugen also went live. Noting that Facebook had “limitless resources and an army of lawyers,” the group set a goal of raising $10,000. Within 30 minutes, 18 donors had given $1,195. Shortly afterward, the fund-raising goal was increased to $50,000.
The effort to reshape Facebook’s image has involved executives from its marketing, communications, policy and integrity teams. Alex Schultz, a 14-year company veteran who was named chief marketing officer last year, has been influential in the effort, said five people who worked with him. But at least one of the decisions was driven by Mr. Zuckerberg, and all were approved by him, three of the people said.
Joe Osborne, a Facebook spokesman, denied that the company had changed its approach.
“People deserve to know the steps we’re taking to address the different issues facing our company — and we’re going to share those steps widely,” he said in a statement.
For years, Facebook executives have chafed at how their company appeared to receive more scrutiny than Google and Twitter, current and former employees said. They attributed that attention to Facebook’s habit of apologizing for missteps and of granting access to internal data, which they said left the company more exposed.
So in January, executives held a virtual meeting and broached the idea of a more aggressive defense, one attendee said. The group discussed using the News Feed to promote positive news about the company, as well as running ads that linked to favorable articles about Facebook. They also debated how to define a pro-Facebook story, two participants said.
That same month, the communications team discussed ways for executives to be less conciliatory when responding to crises and decided there would be less apologizing, said two people with knowledge of the plan.
Mr. Zuckerberg, who had become intertwined with policy issues including the 2020 election, also wanted to recast himself as an innovator, the people said. In January, the communications team circulated a document with a strategy for distancing Mr. Zuckerberg from scandals, partly by focusing his Facebook posts and media appearances on new products, they said.
The Information, a tech news site, previously reported on the document.
The impact was immediate. On Jan. 11, Sheryl Sandberg, Facebook’s chief operating officer — and not Mr. Zuckerberg — told Reuters that the storming of the U.S. Capitol a week earlier had little to do with Facebook. In July, when President Biden said the social network was “killing people” by spreading Covid-19 misinformation, Guy Rosen, Facebook’s vice president for integrity, disputed the characterization in a blog post and pointed out that the White House had missed its coronavirus vaccination goals.
Facebook has approached academics and policy experts about forming a commission to advise it on global election-related matters, said five people with knowledge of the discussions, a move that would allow the social network to shift some of its political decision-making to an advisory body.
The proposed commission could decide on matters such as the viability of political ads and what to do about election-related misinformation, said the people, who spoke on the condition of anonymity because the discussions were confidential. Facebook is expected to announce the commission this fall in preparation for the 2022 midterm elections, they said, though the effort is preliminary and could still fall apart.
Outsourcing election matters to a panel of experts could help Facebook sidestep criticism of bias by political groups, two of the people said. The company has been blasted in recent years by conservatives, who have accused Facebook of suppressing their voices, as well as by civil rights groups and Democrats for allowing political misinformation to fester and spread online. Mark Zuckerberg, Facebook’s chief executive, does not want to be seen as the sole decision maker on political content, two of the people said.
Facebook has tried a version of this before with its Oversight Board, a collection of journalism, legal and policy experts who adjudicate whether the company was correct to remove certain posts from its platforms. Facebook has pushed some content decisions to the Oversight Board for review, allowing it to show that it does not make determinations on its own.
Facebook selected the board’s initial members and pays them through a trust.
The Oversight Board’s highest-profile decision was reviewing Facebook’s suspension of former President Donald J. Trump after the Jan. 6 storming of the U.S. Capitol. At the time, Facebook opted to ban Mr. Trump’s account indefinitely, a penalty that the Oversight Board later deemed “not appropriate” because the time frame was not based on any of the company’s rules. The board asked Facebook to try again.
In June, Facebook responded by saying that it would bar Mr. Trump from the platform for at least two years. The Oversight Board has separately weighed in on more than a dozen other content cases that it calls “highly emblematic” of broader themes that Facebook grapples with regularly, including whether certain Covid-related posts should remain up on the network and hate speech issues in Myanmar.
A spokesman for the Oversight Board declined to comment.
Facebook has had a spotty track record on election-related issues, going back to Russian manipulation of the platform’s advertising and posts in the 2016 presidential election.
Ahead of the 2020 vote, Facebook said it would bar the purchase of new political ads the week before the election, then later decided to temporarily ban all U.S. political advertising after the polls closed on Election Day, causing an uproar among candidates and ad-buying firms.
The company has struggled with how to handle lies and hate speech around elections. During his last year in office, Mr. Trump used Facebook to suggest he would use state violence against protesters in Minneapolis ahead of the 2020 election, while casting doubt on the electoral process as votes were tallied in November. Facebook initially said that what political leaders posted was newsworthy and should not be touched, before later reversing course.
The social network has also faced difficulties in elections elsewhere, including the proliferation of targeted disinformation across its WhatsApp messaging service during the Brazilian presidential election in 2018. In 2019, Facebook removed hundreds of misleading pages and accounts associated with political parties in India ahead of the country’s national elections.
Facebook has tried various methods to stem the criticisms. It established a political ads library to increase transparency around buyers of those promotions. It also has set up war rooms to monitor elections for disinformation to prevent interference.
There are several elections in the coming year in countries such as Hungary, Germany, Brazil and the Philippines where Facebook’s actions will be closely scrutinized. Voter fraud misinformation has already begun spreading ahead of German elections in September. In the Philippines, Facebook has removed networks of fake accounts that support President Rodrigo Duterte, who used the social network to gain power in 2016.
“There is already this perception that Facebook, an American social media company, is going in and tilting elections of other countries through its platform,” said Nathaniel Persily, a law professor at Stanford University. “Whatever decisions Facebook makes have global implications.”
Internal conversations around an election commission date back to at least a few months ago, said three people with knowledge of the matter.
An election commission would differ from the Oversight Board in one key way, the people said. While the Oversight Board waits for Facebook to remove a post or an account and then reviews that action, the election commission would proactively provide guidance without the company having made an earlier call, they said.
Tatenda Musapatike, who previously worked on elections at Facebook and now runs a nonprofit voter registration organization, said that many people had lost faith in the company’s ability to work with political campaigns. But the election commission proposal was “a good step,” she said, because “they’re doing something and they’re not saying we alone can handle it.”
When Facebook this week released its first quarterly report about the most viewed posts in the United States, Guy Rosen, its vice president of integrity, said the social network had undertaken “a long journey” to be “by far the most transparent platform on the internet.” The list showed that the posts with the most reach tended to be innocuous content like recipes and cute animals.
Facebook had prepared a similar report for the first three months of the year, but executives never shared it with the public because of concerns that it would look bad for the company, according to internal emails sent by executives and shared with The New York Times.
In that report, a copy of which was provided to The Times, the most-viewed link was a news article with a headline suggesting that the coronavirus vaccine was at fault for the death of a Florida doctor. The report also showed that a Facebook page for The Epoch Times, an anti-China newspaper that spreads right-wing conspiracy theories, was the 19th-most-popular page on the platform for the first three months of 2021.
The report was nearing public release when some executives, including Alex Schultz, Facebook’s vice president of analytics and chief marketing officer, debated whether it would cause a public relations problem, according to the internal emails. The company decided to shelve it.
Lawmakers and health officials have called on the company to share more information about false and misleading information on the site, and to do a better job of stopping its spread. Last month, President Biden accused the company of “killing people” by allowing false information to circulate widely, a statement the White House later softened. Other federal agencies have accused Facebook of withholding key data.
Facebook has pushed back, publicly accusing the White House of scapegoating the company for the administration’s failure to reach its vaccination goals. Executives at Facebook, including Mark Zuckerberg, its chief executive, have said the platform has been aggressively removing Covid-19 misinformation since the start of the pandemic. The company said it had removed over 18 million pieces of misinformation in that period.
But Brian Boland, a former vice president of product marketing at Facebook, said there was plenty of reason to be skeptical about data collected and released by a company that has had a history of protecting its own interests.
The Epoch Times has been barred from advertising on Facebook because of its repeated violations of the platform’s political advertising policy.
Trending World, the Epoch Times-affiliated page, was viewed by 81.4 million accounts, according to the report — slightly fewer than the 18th-most-popular page, Fox News, which had 81.7 million content viewers for the first three months of 2021.
Facebook’s transparency report released on Wednesday also showed that an Epoch Times subscription link was among the most viewed in the United States. With some 44.2 million accounts seeing the link in April, May and June, it was about half as popular as Trending World in the shelved report.
Sheera Frenkel and Mike Isaac contributed reporting. Jacob Silver and Ben Decker contributed research.
In an emailed statement, Mr. Trump said Facebook’s ruling was “an insult to the record-setting 75M people, plus many others, who voted for us in the 2020 Rigged Presidential Election.” He added that Facebook should not be allowed to get away with “censoring and silencing” him and others on the platform.
Facebook’s broader shift to no longer automatically exempt speech by politicians from its rules is a stark reversal from a free-speech position that Mark Zuckerberg, the company’s chief executive, had championed. In a 2019 address at Georgetown University, Mr. Zuckerberg said, “People having the power to express themselves at scale is a new kind of force in the world — a Fifth Estate alongside the other power structures of society.”
But that stance drew criticism from lawmakers, activists and Facebook’s own employees, who said the company allowed misinformation and other harmful speech from politicians to flow unhindered.
While many academics and activists welcomed Facebook’s changes on Friday as a step in the right direction, they said the implementation of the new rules would be tricky. The company would likely enter into a complicated dance with global leaders who had grown accustomed to receiving special treatment by the platform, they said.
“This change will result in speech by world leaders being subject to more scrutiny,” said David Kaye, a law professor and former United Nations monitor for freedom of expression. “It will be painful for leaders who aren’t used to the scrutiny, and it will also lead to tensions.”
Countries including India, Turkey and Egypt have threatened to take action against Facebook if it acts against the interests of the ruling parties, Mr. Kaye said. The countries have said they might punish Facebook’s local staff or ban access to the service, he said.
“This decision by Facebook imposes new political calculations for both these global leaders, and for Facebook,” Mr. Kaye said.
SAN FRANCISCO — When India’s government ordered Facebook and other tech companies to take down posts critical of its handling of the coronavirus pandemic in April, the social network complied on some posts.
But once it did, its employees flocked to online chat rooms to ask why Facebook had helped Prime Minister Narendra Modi of India stifle dissent. In one internal post, which was reviewed by The New York Times, an employee with family in India accused Facebook of “being afraid” that Mr. Modi would ban the company from doing business in the country. “We can’t act or make decisions out of fear,” he wrote.
Weeks later, when clashes broke out in Israel between Israelis and Palestinians, Facebook removed posts from prominent Palestinian activists and briefly banned hashtags related to the violence. Facebook employees again took to the message boards to ask why their company now appeared to be censoring pro-Palestinian content.
“It just feels like, once again, we are erring on the side of a populist government and making decisions due to politics, not policies,” one worker wrote in an internal message that was reviewed by The Times.
For years, the scrutiny of Facebook’s speech decisions centered on the United States, including its handling of inflammatory posts from former President Donald J. Trump. But since Mr. Trump left office in January, attention has shifted to Facebook’s global policies and what employees said was the company’s acquiescence to governments so that it could continue profiting in those countries.
“There’s a feeling among people at Facebook that this is a systematic approach, one which favors strong government leaders over the principles of doing what is right and correct,” said Ashraf Zeitoon, Facebook’s former head of policy for the Middle East and North Africa region, who left in 2017.
Facebook is increasingly caught in a vise. In India, Russia and elsewhere, governments are pressuring it to remove content as they try to corral the platform’s power over online speech. But when Facebook complies with the takedown orders, it has upset its own employees, who say the social network has helped authoritarian leaders and repressive regimes quash activists and silence marginalized communities.
BuzzFeed News and the Financial Times earlier reported on some of the employee dissatisfaction at Facebook over Israeli and Palestinian content.
A divide between Facebook’s employees and the global policy team, which is composed of roughly 1,000 employees, has existed for years, current and former workers said. The policy team reports to Sheryl Sandberg, the chief operating officer.
The policy team has navigated many tricky international situations over the years, including in Russia, Vietnam and Myanmar, where Facebook has had to consider whether it would be shut down if it did not work with governments. That has led to the employee dissent, which has begun spilling into public view.
That became evident with India. In April, as Covid-19 cases soared in the country, Mr. Modi’s government called for roughly 100 social media posts on Facebook, Instagram and Twitter to be pulled down. Many of the posts included critiques of the government from opposition politicians and calls for Mr. Modi’s resignation.
Facebook removed some of the posts and briefly blocked a hashtag, #ResignModi. The company later said the hashtag had been banned by mistake and was not part of a government request.
But internally, the damage was done. In online chat rooms dedicated to human rights issues and global policy, employees described how disappointed they were with Facebook’s actions. Some shared stories of family members in India who were worried they were being censored.
Last month, when violence broke out between Israelis and Palestinians, reports surfaced that Facebook had erased content from Palestinian activists. Facebook’s Instagram app also briefly banned the #AlAqsa hashtag, a reference to Al Aqsa Mosque, one of Islam’s holiest sites. Facebook later explained that it had confused the #AlAqsa hashtag with a Palestinian militant group called Al Aqsa Martyrs Brigade.
Employees bristled. “We are responding to people’s protests about censoring with more censoring?” one wrote in an internal message, which was reviewed by The Times.
At a recent companywide meeting, an employee asked Nick Clegg, who leads public affairs, to explain the company’s role in removing content tied to the Israeli-Palestinian conflict, according to attendees. The employee called the situation in Israel “fraught” and asked how Facebook was going “to get it right” with content moderation.
Mr. Clegg ran through a list of policy rules and plans going forward, and assured staff that moderation would be treated with fairness and responsibility, two people familiar with the meeting said. The discussion was cordial, one of the people said, and comments in the chat box beside Mr. Clegg’s response were largely positive.
But some employees were dissatisfied, the people said. As Mr. Clegg spoke, they broke off into private chats and workplace groups, known as Tribes, to discuss what to do.
Dozens of employees later formed a group to flag the Palestinian content that they said had been suppressed to internal content moderation teams, said two employees. The goal was to have the posts reinstated online, they said.
Members of Facebook’s policy team have tried calming the tensions. In an internal memo in mid-May, which was reviewed by The Times, two policy team members wrote to other employees that they hoped “that Facebook’s internal community will resist succumbing to the division and demonization of the other side that is so brutally playing itself out offline and online.”
One of them was Muslim, and the other was Jewish, they said.
“We don’t always agree,” they wrote. “However, we do some of our best work when we assume good intent and recognize that we are on the same side trying to serve our community in the best possible way.”
Attorneys general for 44 states and jurisdictions called on Facebook to halt plans to create a version of Instagram for young children, citing concerns over mental and emotional well-being, exposure to online predators and cyberbullying.
In a letter on Monday to Facebook’s chief executive, Mark Zuckerberg, the prosecutors warned that social media can be harmful to children and that the company had a poor record of protecting children online. Facebook, which bought the photo-sharing app Instagram in 2012, currently has a minimum age requirement of 13 to use its products. According to federal children’s privacy rules, companies must ask parents for permission to collect data on users younger than 13.
The law enforcement officials pointed to research showing how the use of social media, including Instagram, has led to an increase in mental distress, body image concerns and even suicidal thoughts. A children’s version of Instagram doesn’t fill a need beyond the company’s commercial ambitions, the officials said in the letter.
“Without a doubt, this is a dangerous idea that risks the safety of our children and puts them directly in harm’s way,” Letitia James, New York’s attorney general, said in a statement. “There are too many concerns to let Facebook move forward with this ill-conceived idea, which is why we are calling on the company to abandon its launch of Instagram Kids.”
Facebook defended its plans, saying its development of a children’s version of Instagram would have safety and privacy in mind. It wouldn’t show ads on the app, the company vowed.
“As every parent knows, kids are already online,” Andy Stone, a Facebook spokesman, said in a statement. “We want to improve this situation by delivering experiences that give parents visibility and control over what their kids are doing.”
Mary Beth Meehan is an independent photographer and writer. Fred Turner is a professor of communication at Stanford University.
The workers of Silicon Valley rarely look like the men idealized in its lore. They are sometimes heavier, sometimes older, often female, often darker skinned. Many migrated from elsewhere. And most earn far less than Mark Zuckerberg or Tim Cook.
This is a place of divides.
As the valley’s tech companies have driven the American economy since the Great Recession, the region has remained one of the most unequal in the United States.
During the depths of the pandemic, four in 10 families in the area with children could not be sure that they would have enough to eat on any given day, according to an analysis by the Silicon Valley Institute for Regional Studies. Just months later, Elon Musk, the chief executive of Tesla, who recently added “Technoking” to his title, briefly became the world’s richest man. The median home price in Santa Clara County — home to Apple and Alphabet — is now $1.4 million, according to the California Association of Realtors.
For those who have not been fortunate enough to make billionaire lists, for midlevel engineers and food truck workers and longtime residents, the valley has become increasingly inhospitable, testing their resilience and resolve.
Their photographs and interviews are collected in the book “Seeing Silicon Valley,” from which this photo essay is excerpted.
Ravi and Gouthami
In 2019, Facebook announced it would give $1 billion in loans, grants and land toward creating more affordable housing in the area. Of that pledge, $25 million would go toward building housing for educators: 120 apartments, including for Konstance and the other teachers in the original pilot, as long as they were working in nearby schools.
At the time of the announcement, Facebook said the money would be used over the next decade. Construction on the teacher housing has yet to be completed.
One day Geraldine received a phone call from a friend: “They’re taking our churches!” her friend said. It was 2015, when Facebook was expanding in the Menlo Park neighborhood where she lived. Her father-in-law had established a tiny church here 55 years before, and Geraldine, a church leader, couldn’t let it be torn down. The City Council was holding a meeting for the community that night. “So I went to the meeting,” she said. “You had to write your name on a paper to be heard, so I did that. They called my name and I went up there bravely, and I talked.”
Geraldine doesn’t remember exactly what she said, but she stood up and prayed — and, ultimately, the congregation was able to keep the church. “God really did it,” she said. “I didn’t have nothing to do with that. It was God.”
In 2016, Gee and Virginia bought a five-bedroom house in Los Gatos, a pricey town nestled beside coastal foothills. Houses on their street cost just under $2 million at the time, and theirs was big enough for each of their two children to have a bedroom and for their parents to visit them from Taiwan.
Together, the couple earn about $350,000 a year — more than six times the national household average. Virginia works in the finance department of Hewlett-Packard in Palo Alto, and Gee was an early employee of a start-up that developed an online auctioning app.
They have wanted to buy nice furniture for the house, but between their mortgage and child care expenses, they don’t think they can afford to buy it all at once. Some of their rooms now sit empty. Gee said that Silicon Valley salaries like theirs sounded like real wealth to the rest of the country, but that here it didn’t always feel that way.
Jon lives in East Palo Alto, a traditionally lower-income area separated from the rest of Silicon Valley by Highway 101.
By the time Jon was in the eighth grade he knew he wanted to go to college, and he was accepted by a rigorous private high school for low-income children. He discovered an aptitude for computers, and excelled in school and professional internships. Yet as he advanced in his career, he realized that wherever he went there were very few people who looked like him.
“I got really troubled,” he said. “I didn’t know who to talk to, and I saw that it wasn’t a problem for them. I was just like ‘I need to do something about this.’”
Jon, now in his 30s, has come back to East Palo Alto, where he has developed maker spaces and brought tech-related education projects to members of the community.
“It is amazing living here,” said Erfan, who moved to Mountain View when her husband got a job as an engineer at Google. “But it’s not a place I want to spend my whole life. There are lots of opportunities for work, but it’s all about the technology, the speed for new technology, new ideas, new everything.” The couple had previously lived in Canada after emigrating from Iran.
“We never had these opportunities back home, in Iran. I know that — I don’t want to complain,” she added. “When I tell people I’m living in the Bay Area, they say: ‘You’re so lucky — it must be like heaven! You must be so rich.’”
But the emotional toll can be weighty. “We are sometimes happy, but also very anxious, very stressed. You have to be worried if you lose your job, because the cost of living is very high, and it’s very competitive. It’s not that easy — come here, live in California, become a millionaire. It’s not that simple.”
Elizabeth studied at Stanford and works as a security guard for a major tech firm in the area. She is also homeless.
Sitting on a panel about the issue at San Jose State University in 2017, she said, “Please remember that many of the homeless — and there are many more of us than are captured in the census — work in the same companies that you do.” (She declined to disclose which company she worked for out of fear of reprisal.)
While homeless co-workers sometimes serve food in cafeterias or clean buildings, she added, many are white-collar professionals.
“Sometimes it takes only one mistake, one financial mistake, sometimes it takes just one medical catastrophe. Sometimes it takes one tiny little lapse in insurance — it can be a number of things. But the fact is that there’s lots of middle-class people that fell into poverty very recently,” she said. “Their homelessness that was just supposed to be a month or two months until they recovered, or three months, turns out to stretch into years. Please remember, there are a lot of us.”
Facebook’s suspension of Donald Trump will continue for now, the company announced yesterday. But it still has not resolved the central problem that Trump has created for social media platforms and, by extension, American democracy.
The problem is that Trump lies almost constantly. Unlike many other politicians — including other recent presidents, from both parties — he continues to make false statements even after other people have documented their falseness. This behavior undermines the healthy functioning of American democracy, particularly because Trump has such a large following.
His lies about the 2020 election are the clearest example. They have led tens of millions of people to believe a made-up story about how Joe Biden won. They have become a loyalty test within the Republican Party.
In Congress, Republicans are moving to oust Liz Cheney as one of their leaders after she said that people who repeated Trump’s “big lie” were “turning their back on the rule of law, and poisoning our democratic system.” In several states, Republican legislators are using Trump’s made-up story to justify new laws that make voting more difficult, especially in heavily Democratic areas. There is a direct connection between Trump’s lies about the election and the weakening of voting rights.
Facebook justified its suspension of Trump in January not based on his lies but instead on his incitement of violence, before and during the Jan. 6 attack on the Capitol by his supporters. Facebook continues to allow politicians to spread many falsehoods, saying it does not want to police truth. Distinguishing among truth, opinion and falsehood can indeed be tricky — but Trump’s claims about electoral theft are not a nuanced case.
The issue here isn’t the enduring philosophical question of what constitutes truth; it’s whether Facebook is willing to tolerate obvious and influential lies. So far, the company has decided that it is. It has drawn a line somewhere between blatant untruths and incitement to violence.
“Facebook’s approach to Trump’s attempts to undermine confidence in the integrity of the election was weak and ineffective,” Richard Hasen, a law professor at the University of California, Irvine, told me. When Trump last year falsely described mail-in voting as corrupt, for example, Facebook left up the post and instead added a link to a website where people could find general election information, as Hasen describes in his forthcoming book, “Cheap Speech.” Twitter, he notes, has taken a more aggressive position.
Yesterday’s decision officially came from a Facebook-appointed panel of speech experts that the company calls its Oversight Board. The board has no actual power to regulate the company, but it may have some influence on Facebook executives. In their statement, board members criticized Facebook for levying an indefinite suspension on Trump and said it should choose in the next six months between a permanent ban and a time-limited one: “In applying a vague, standardless penalty and then referring this case to the Board to resolve, Facebook seeks to avoid its responsibilities,” the board wrote.
The board wrote:
… context matters when assessing issues of causality and the probability and imminence of harm. What is important is the degree of influence that a user has over other users. When posts by influential users pose a high probability of imminent harm, as assessed under international human rights standards, Facebook should take action to enforce its rules quickly.
That passage highlights the crux of the issue. Facebook has evidently decided that undermining the credibility of democratic elections does not violate international human rights standards. If it maintains that position, Trump may be back on Facebook six months from now.
What is the Facebook Oversight Board? Cecilia Kang has written an explainer, and Ben Smith has written a column.
The board’s message to Facebook’s C.E.O., Mark Zuckerberg: “This problem is yours,” Kevin Roose writes.
Nick Clegg, a former deputy prime minister in Britain, is steering the company’s response. Read a profile of him.
How the suspension has mattered: Politico’s Michael Kruse traveled to Ohio recently and was struck by how little many Republican voters knew about Trump’s recent comments.
THE LATEST NEWS
The Biden administration supported waiving patents for Covid vaccines to boost supply in lower-income countries.
Support from the White House is not a guarantee that a waiver will be adopted. It needs support from all members of the World Trade Organization.
The E.U. is considering whether to follow the Biden administration’s decision.
Supporters cast the move as a moral imperative that would get shots to India and other countries.
Pharmaceutical companies reacted angrily, saying it would hamper future vaccine development and do little to increase short-term supply.
Jim Geraghty and the political scientists Frances Lee and James Curry in The Atlantic.
Biden’s economic plans address one of the New Deal’s glaring omissions: women, Binyamin Appelbaum writes in The Times.
Lost and Found: His ship vanished 176 years ago. DNA offered his descendants a clue.
A Times Classic: What happened to Bob Ross’s paintings? We found them.
Lives Lived: Tamara Press was a dominant Soviet shot-putter and discus thrower in the 1960s. But amid questions about her physique, she pulled out of a major event that required sex testing. She died at 83.
ARTS AND IDEAS
Amanda Hess writes in The Times: The former “Jeopardy!” champion Ken Jennings infused his episodes with a Trebek-like intellect, while the Packers quarterback Aaron Rodgers brought an outsider’s earnestness. Others, like Dr. Mehmet Oz, have fared less well, trying to outshine the show with stories and jokes.
A petition calling for LeVar Burton, the former star of “Reading Rainbow,” to be the next host received more than 250,000 signatures — and helped get him a guest spot beginning July 26.
One strategic wrinkle: Contestants seem to be struggling to adapt to the variation in the hosts’ speaking styles and aren’t sure exactly when to buzz in, Claire McNear notes in The Ringer. That has created a randomness that has prevented any long winning streaks.
Regardless of who gets the permanent job, Hess argues that the clues, “which are precisely written and briskly dealt,” are the show’s real draw.
PLAY, WATCH, EAT
What to Cook
Here’s today’s Mini Crossword, and a clue: Vuitton of fashion (five letters).
If you’re in the mood to play more, find all our games here.
Thanks for spending part of your morning with The Times. See you tomorrow. — David
P.S. Tuesday night’s episode of “Jeopardy!” featured an answer about The Times. (Scroll to the bottom for the solution.)
Lawmakers lashed out at the Facebook Oversight Board’s ruling on Wednesday to uphold the social network’s ban on former President Donald J. Trump, at least for now.
Driving the discontent was that the Oversight Board, a quasi-court that confers over some of Facebook’s content decisions, did not make a black-and-white decision about the case. Mr. Trump had been blocked from the social network in January after his comments online and elsewhere incited the storming of the Capitol building.
While the Oversight Board said on Wednesday that Facebook was justified in suspending Mr. Trump at the time because of the risk of further violence, it also said the company needed to revisit its action. The board said Facebook’s move was “a vague, standardless penalty” without defined limits, which needed to be reviewed again for a final decision on Mr. Trump’s account in six months.
That angered both Republicans and Democrats. Republican lawmakers have pointed to Mr. Trump’s ouster by Facebook, Twitter and others as evidence of an alleged anti-conservative campaign by tech companies, calling the decisions a dangerous precedent for censorship of political figures.
Senator Ted Cruz, Republican of Texas, tweeted that the board’s decision on Wednesday was “disgraceful” and warned it could have dangerous ripple effects.
“For every liberal celebrating Trump’s social media ban, if the Big Tech oligarchs can muzzle the former President, what’s to stop them from silencing you?” Mr. Cruz said in his tweet.
Senator Marsha Blackburn, Republican of Tennessee, said in a statement that the move showed that “it’s clear that Mark Zuckerberg views himself as the arbiter of free speech.” Republican members of the House judiciary committee tweeted that the decision was “pathetic,” and Jim Jordan of Ohio, the ranking member, tweeted about Facebook: “Break them up.”
Democrats, also dissatisfied with the murky decision, took aim at how Facebook can be used to spread lies. Frank Pallone, the chairman of the House energy and commerce committee, tweeted: “Donald Trump has played a big role in helping Facebook spread disinformation, but whether he’s on the platform or not, Facebook and other social media platforms with the same business model will find ways to highlight divisive content to drive advertising revenues.”
Representative Ken Buck, Republican of Colorado and the ranking member of the House antitrust subcommittee, accused the Oversight Board of political bias.
“Facebook made an arbitrary decision based on its political preferences, and the Oversight Board, organized and funded by Facebook, reaffirmed its decision,” he said.
But scholars who support free speech welcomed the decision. They have warned that as social media companies become more active in deciding what stays online and what doesn’t, tech giants could gain too much sway over digital speech.
“The Facebook Oversight Board has said what many critics noted — the ban of former President Trump, while perhaps justified, was worrisome in its open-endedness and lack of process,” said Gautam Hans, a law professor at Vanderbilt University. “To the degree that the decision draws attention to how ad hoc, manipulable, and arbitrary Facebook’s own content policies get enforced, I welcome it.”