In 2010, Accenture scored an accounting contract with Facebook. By 2012, that had expanded to include a deal for moderating content, particularly outside the United States.
That year, Facebook sent employees to Manila and Warsaw to train Accenture workers to sort through posts, two former Facebook employees involved with the trip said. Accenture’s workers were taught to use a Facebook software system and the platform’s guidelines for leaving content up, taking it down or escalating it for review.
What started as a few dozen Accenture moderators grew rapidly.
By 2015, Accenture’s office in the San Francisco Bay Area had set up a team, code-named Honey Badger, just for Facebook’s needs, former employees said. Accenture went from providing about 300 workers in 2015 to about 3,000 in 2016. They are a mix of full-time employees and contractors, depending on the location and task.
The firm soon parlayed its work with Facebook into moderation contracts with YouTube, Twitter, Pinterest and others, executives said. (The digital content moderation industry is projected to reach $8.8 billion next year, according to Everest Group, roughly double the 2020 total.) Facebook also gave Accenture contracts in areas like checking for fake or duplicate user accounts and monitoring celebrity and brand accounts to ensure they were not flooded with abuse.
After federal authorities discovered in 2016 that Russian operatives had used Facebook to spread divisive posts to American voters for the presidential election, the company ramped up the number of moderators. It said it would hire more than 3,000 people — on top of the 4,500 it already had — to police the platform.
“If we’re going to build a safe community, we need to respond quickly,” Mr. Zuckerberg said in a 2017 post.
The next year, Facebook hired Arun Chandra, a former Hewlett Packard Enterprise executive, as vice president of scaled operations to help oversee the relationship with Accenture and others. His division is overseen by Ms. Sandberg.
Facebook has approached academics and policy experts about forming a commission to advise it on global election-related matters, said five people with knowledge of the discussions, a move that would allow the social network to shift some of its political decision-making to an advisory body.
The proposed commission could decide on matters such as the viability of political ads and what to do about election-related misinformation, said the people, who spoke on the condition of anonymity because the discussions were confidential. Facebook is expected to announce the commission this fall in preparation for the 2022 midterm elections, they said, though the effort is preliminary and could still fall apart.
Outsourcing election matters to a panel of experts could help Facebook sidestep criticism of bias by political groups, two of the people said. The company has been blasted in recent years by conservatives, who have accused Facebook of suppressing their voices, as well as by civil rights groups and Democrats for allowing political misinformation to fester and spread online. Mark Zuckerberg, Facebook’s chief executive, does not want to be seen as the sole decision maker on political content, two of the people said.
Such a commission would follow the model of Facebook's Oversight Board, a collection of journalism, legal and policy experts who adjudicate whether the company was correct to remove certain posts from its platforms. Facebook has pushed some content decisions to the Oversight Board for review, allowing it to show that it does not make determinations on its own.
The board's members operate independently of Facebook, which pays them through a trust.
The Oversight Board’s highest-profile decision was reviewing Facebook’s suspension of former President Donald J. Trump after the Jan. 6 storming of the U.S. Capitol. At the time, Facebook opted to ban Mr. Trump’s account indefinitely, a penalty that the Oversight Board later deemed “not appropriate” because the time frame was not based on any of the company’s rules. The board asked Facebook to try again.
In June, Facebook responded by saying that it would bar Mr. Trump from the platform for at least two years. The Oversight Board has separately weighed in on more than a dozen other content cases that it calls “highly emblematic” of broader themes that Facebook grapples with regularly, including whether certain Covid-related posts should remain up on the network and hate speech issues in Myanmar.
A spokesman for the Oversight Board declined to comment.
Facebook has had a spotty track record on election-related issues, going back to Russian manipulation of the platform’s advertising and posts in the 2016 presidential election.
Ahead of the 2020 vote, the company said it would bar the purchase of new political ads the week before the election, then later decided to temporarily ban all U.S. political advertising after the polls closed on Election Day, causing an uproar among candidates and ad-buying firms.
The company has struggled with how to handle lies and hate speech around elections. During his last year in office, Mr. Trump used Facebook to suggest he would use state violence against protesters in Minneapolis ahead of the 2020 election, while casting doubt on the electoral process as votes were tallied in November. Facebook initially said that what political leaders posted was newsworthy and should not be touched, before later reversing course.
The social network has also faced difficulties in elections elsewhere, including the proliferation of targeted disinformation across its WhatsApp messaging service during the Brazilian presidential election in 2018. In 2019, Facebook removed hundreds of misleading pages and accounts associated with political parties in India ahead of the country’s national elections.
Facebook has tried various methods to stem the criticisms. It established a political ads library to increase transparency around buyers of those promotions. It also has set up war rooms to monitor elections for disinformation to prevent interference.
There are several elections in the coming year in countries such as Hungary, Germany, Brazil and the Philippines where Facebook’s actions will be closely scrutinized. Voter fraud misinformation has already begun spreading ahead of German elections in September. In the Philippines, Facebook has removed networks of fake accounts that support President Rodrigo Duterte, who used the social network to gain power in 2016.
“There is already this perception that Facebook, an American social media company, is going in and tilting elections of other countries through its platform,” said Nathaniel Persily, a law professor at Stanford University. “Whatever decisions Facebook makes have global implications.”
Internal conversations around an election commission date back to at least a few months ago, said three people with knowledge of the matter.
An election commission would differ from the Oversight Board in one key way, the people said. While the Oversight Board waits for Facebook to remove a post or an account and then reviews that action, the election commission would proactively provide guidance without the company having made an earlier call, they said.
Tatenda Musapatike, who previously worked on elections at Facebook and now runs a nonprofit voter registration organization, said that many people have lost faith in the company's ability to work with political campaigns. But the election commission proposal was “a good step,” she said, because “they’re doing something and they’re not saying we alone can handle it.”
When Facebook this week released its first quarterly report about the most viewed posts in the United States, Guy Rosen, its vice president of integrity, said the social network had undertaken “a long journey” to be “by far the most transparent platform on the internet.” The list showed that the posts with the most reach tended to be innocuous content like recipes and cute animals.
Facebook had prepared a similar report for the first three months of the year, but executives never shared it with the public because of concerns that it would look bad for the company, according to internal emails sent by executives and shared with The New York Times.
In that report, a copy of which was provided to The Times, the most-viewed link was a news article with a headline suggesting that the coronavirus vaccine was at fault for the death of a Florida doctor. The report also showed that a Facebook page for The Epoch Times, an anti-China newspaper that spreads right-wing conspiracy theories, was the 19th-most-popular page on the platform for the first three months of 2021.
The report was nearing public release when some executives, including Alex Schultz, Facebook’s vice president of analytics and chief marketing officer, debated whether it would cause a public relations problem, according to the internal emails. The company decided to shelve it.
The White House and lawmakers have called on the company to share more information about false and misleading information on the site, and to do a better job of stopping its spread. Last month, President Biden accused the company of “killing people” by allowing false information to circulate widely, a statement the White House later softened. Other federal agencies have accused Facebook of withholding key data.
Facebook has pushed back, publicly accusing the White House of scapegoating the company for the administration’s failure to reach its vaccination goals. Executives at Facebook, including Mark Zuckerberg, its chief executive, have said the platform has been aggressively removing Covid-19 misinformation since the start of the pandemic. The company said it had removed over 18 million pieces of misinformation in that period.
But Brian Boland, a former vice president of product marketing at Facebook, said there was plenty of reason to be skeptical about data collected and released by a company that has had a history of protecting its own interests.
The Epoch Times has been barred from advertising on Facebook because of its repeated violations of the platform’s political advertising policy.
Trending World, according to the report, was viewed by 81.4 million accounts, slightly fewer than the 18th-most-popular page, Fox News, which had 81.7 million content viewers for the first three months of 2021.
Facebook’s transparency report released on Wednesday also showed that an Epoch Times subscription link was among the most viewed in the United States. With some 44.2 million accounts seeing the link in April, May and June, it was about half as popular as Trending World in the shelved report.
Sheera Frenkel and Mike Isaac contributed reporting. Jacob Silver and Ben Decker contributed research.
In the past few weeks, the vast majority of the most highly engaged social media posts containing coronavirus misinformation were from people who had risen to prominence by questioning the vaccines in the past year.
In July, the right-wing commentator Candace Owens jumped on the misstatement from Britain’s scientific adviser. “This is shocking!” she wrote. “60% of people being admitted to the hospital with #COVID19 in England have had two doses of a coronavirus vaccine, according to the government’s chief scientific adviser.”
After the scientific adviser, Patrick Vallance, corrected himself, Ms. Owens added the correct information at the bottom of her Facebook post. But the post was liked or shared over 62,000 times — two-thirds of its total interactions — in the three hours before her update, a New York Times analysis found. In all, the rumor collected 142,000 likes and shares on Facebook, most of them coming from Ms. Owens’s post, according to a report by the Virality Project, a consortium of misinformation researchers from outfits like the Stanford Internet Observatory and Graphika.
When reached for comment, Ms. Owens said in an email: “Unfortunately, I’m not interested in The New York Times. The people that follow me don’t take your hit pieces seriously.”
Also in July, Thomas Renz, a lawyer, appeared in a video claiming that 45,000 people had died from coronavirus vaccines. The claim, since debunked, relies on unverified information from the Vaccine Adverse Event Reporting System, a government database. The baseless claim had been included in a lawsuit that Mr. Renz filed on behalf of an anonymous “whistle-blower,” in coordination with America’s Frontline Doctors — a right-wing group that spread misinformation about the pandemic in the past.
Mr. Renz’s video got more than 19,000 views on Bitchute. The unfounded claim was repeated by the top Spanish-language Telegram channels, Facebook groups and the conspiracy website Infowars, collecting over 120,000 views across the platforms, according to the Virality Project.
In an email, Mr. Renz said his practice had “performed the due diligence necessary” to believe in the accuracy of the allegations in the lawsuit he had filed. “We actually do not believe that the Biden administration is responsible for this, rather we believe that President Biden, like President Trump before him, was misled by the same group of conflicted bureaucrats,” Mr. Renz said.
In March, the White House also orchestrated an Instagram Live chat between Dr. Fauci and Eugenio Derbez, a Mexican actor with over 16.6 million Instagram followers who had been openly doubtful of the vaccines. During their 37-minute discussion, Mr. Derbez was upfront about his concerns.
“What if I get the vaccine, but it doesn’t protect me against the new variant?” he asked. Dr. Fauci acknowledged that the vaccines might not completely shield people from variants, but said, “It’s very, very good at protecting you from getting seriously ill.”
Rob Flaherty, the White House’s director of digital strategy, said the whole point of the campaign was to be “a positive information effort.”
State and local governments have taken the same approach, though on a smaller scale and sometimes with financial incentives.
In February, Colorado awarded a contract worth up to $16.4 million to the Denver-based Idea Marketing, which includes a program to pay creators in the state $400 to $1,000 a month to promote the vaccines.
Jessica Bralish, the communications director at Colorado’s public health department, said influencers were being paid because “all too often, diverse communities are asked to reach out to their communities for free. And to be equitable, we know we must compensate people for their work.”
As part of the effort, influencers have showed off where on their arms they were injected, using emojis and selfies to punctuate the achievement. “I joined the Pfizer club,” Ashley Cummins, a fashion and style influencer in Boulder, Colo., recently announced in a smiling selfie while holding her vaccine card. She added a mask emoji and an applause emoji.
“Woohoo! This is so exciting!” one fan commented.
Posts by creators in the campaign carry a disclosure that reads “paid partnership with Colorado Dept. of Public Health and Environment.”
Over the last decade, Dr. Mercola has built a vast operation to push natural health cures, disseminate anti-vaccination content and profit from all of it, said researchers who have studied his network. In 2017, he filed an affidavit claiming his net worth was “in excess of $100 million.”
And rather than directly stating online that vaccines don’t work, Dr. Mercola often asks pointed questions in his posts about their safety and discusses studies that other doctors have refuted. Facebook and Twitter have allowed some of his posts to remain up with caution labels, and the companies have struggled to write rules that cover such nuanced posts.
“He has been given new life by social media, which he exploits skillfully and ruthlessly to bring people into his thrall,” said Imran Ahmed, director of the Center for Countering Digital Hate, which studies misinformation and hate speech. Its “Disinformation Dozen” report has been cited in congressional hearings and by the White House.
In an email, Dr. Mercola said it was “quite peculiar to me that I am named as the #1 superspreader of misinformation.” Some of his Facebook posts were only liked by hundreds of people, he said, so he didn’t understand “how the relatively small number of shares could possibly cause such calamity to Biden’s multibillion dollar vaccination campaign.”
The efforts against him are political, Dr. Mercola added, and he accused the White House of “illegal censorship by colluding with social media companies.”
He did not address whether his coronavirus claims were factual. “I am the lead author of a peer reviewed publication regarding vitamin D and the risk of Covid-19 and I have every right to inform the public by sharing my medical research,” he said. He did not identify the publication, and The Times was unable to verify his claim.
A native of Chicago, Dr. Mercola started a small private practice in 1985 in Schaumburg, Ill. In the 1990s, he began shifting to natural health medicine and opened his main website, Mercola.com, to share his treatments, cures and advice. The site urges people to “take control of your health.”
In an emailed statement, Mr. Trump said Facebook’s ruling was “an insult to the record-setting 75M people, plus many others, who voted for us in the 2020 Rigged Presidential Election.” He added that Facebook should not be allowed to get away with “censoring and silencing” him and others on the platform.
Facebook’s broader shift to no longer automatically exempt speech by politicians from its rules is a stark reversal from a free-speech position that Mark Zuckerberg, the company’s chief executive, had championed. In a 2019 address at Georgetown University, Mr. Zuckerberg said, “People having the power to express themselves at scale is a new kind of force in the world — a Fifth Estate alongside the other power structures of society.”
But that stance drew criticism from lawmakers, activists and Facebook’s own employees, who said the company allowed misinformation and other harmful speech from politicians to flow unhindered.
While many academics and activists welcomed Facebook’s changes on Friday as a step in the right direction, they said the implementation of the new rules would be tricky. The company would likely enter into a complicated dance with global leaders who had grown accustomed to receiving special treatment by the platform, they said.
“This change will result in speech by world leaders being subject to more scrutiny,” said David Kaye, a law professor and former United Nations monitor for freedom of expression. “It will be painful for leaders who aren’t used to the scrutiny, and it will also lead to tensions.”
Countries including India, Turkey and Egypt have threatened to take action against Facebook if it acts against the interests of the ruling parties, Mr. Kaye said. The countries have said they might punish Facebook’s local staff or ban access to the service, he said.
“This decision by Facebook imposes new political calculations for both these global leaders, and for Facebook,” Mr. Kaye said.
The officers from India’s elite antiterrorism police unit descended after dusk on the New Delhi offices of Twitter, with television news cameras in tow. Their mission: Start an argument over fake news.
The offices sat empty, closed amid India’s devastating coronavirus outbreak. And the police acknowledged that they were there to deliver nothing more legally binding than a notice disputing a warning label that Twitter had assigned to some tweets.
But symbolically, the visit by the police on Monday night sent a clear message that India’s powerful ruling party is becoming increasingly upset with Twitter because of the perception that the company has sided with critics of the government. As anger has risen across the country over India’s stumbling response to the pandemic, the government of Prime Minister Narendra Modi and his Bharatiya Janata Party have struggled to control the narrative.
As a result, top Indian political leaders have applied increasing pressure on Twitter, Facebook and other platforms that people are using to air their complaints. In doing so, they are following the path of some other countries trying to control how and where messages can spread on social media. In March, for example, the Russian government said it would slow access to Twitter, one of the few places where Russians openly criticize the government.
Earlier this year, at the government’s demand, Twitter blocked the accounts of 500 people accused of making inflammatory remarks about Mr. Modi.
India banned TikTok, WeChat and dozens of other Chinese apps, citing national security concerns.
Though Mr. Modi’s government controls the Delhi police, it was not clear on Tuesday that the failed mission at the Twitter office had happened at its behest.
A B.J.P. spokesman did not immediately respond to a request for comment. A Twitter spokeswoman asked for questions in an email, which went unanswered.
On May 18, a B.J.P. spokesman, Sambit Patra, tweeted the picture of a document he described as plans by the Indian National Congress, the main opposition party, for making the government look bad.
Mr. Patra’s message was retweeted more than 5,000 times, including by ministers in Mr. Modi’s government and party leaders.
Harsh Vardhan, India’s health minister, used the hashtag #CongressToolkitExposed to rip into the opposition party.
“It’s deplorable on their part to attempt to spread misinformation during this global catastrophe just to swell their dwindling political fortunes at the expense of people’s suffering,” Dr. Vardhan tweeted.
In recent weeks, the Delhi police arrested people who had put up posters criticizing India’s decision to export vaccines abroad.
The posters were made by the ruling party in Delhi, another party in opposition to the B.J.P., according to a party member, Durgesh Pathak.
“In a democracy, to ask a question is not wrong,” Mr. Pathak said. “I am not abusing anybody. I am not instigating anybody for violence. I am not asking anybody to do any wrong thing. I am asking a question to the prime minister of my country.”
WASHINGTON — Florida on Monday became the first state to regulate how companies like Facebook, YouTube and Twitter moderate speech online, by imposing fines on social media companies that permanently ban political candidates for statewide office.
The new law, signed by Gov. Ron DeSantis, is a direct response to Facebook and Twitter’s ban of former President Donald J. Trump in January. In addition to the fines for banning candidates, it also makes it illegal to prevent some news outlets from posting to their platforms in response to the contents of their stories.
Mr. DeSantis said that signing the bill meant that Floridians would be “guaranteed protection against the Silicon Valley elites.”
“If Big Tech censors enforce rules inconsistently, to discriminate in favor of the dominant Silicon Valley ideology, they will now be held accountable,” he said in a statement.
Weeks earlier, Mr. DeSantis signed a law limiting the right to protest and providing immunity to drivers who strike protesters in public streets.
And the Republican push to make voting harder continues unabated after Mr. Trump’s relentless lying about the results of the 2020 election. Georgia Gov. Brian Kemp signed into law new restrictions on voting, as did Mr. DeSantis in Florida, and Texas Republicans are poised to soon pass the nation’s biggest rollback of voting rights.
The party-wide, nationwide push stems from Mr. Trump’s repeated grievances. During his failed re-election campaign, Mr. Trump repeatedly pushed to repeal Section 230 of the Communications Decency Act, which provides immunity to certain tech firms from liability for user-generated content, even as he used their platforms to spread misinformation. Twitter and Facebook eventually banned Mr. Trump after he inspired his supporters, using their platforms, to attack the Capitol on Jan. 6.
Republican lawmakers in Florida have echoed Mr. Trump’s rhetoric.
“I have had numerous constituents come to me saying that they were banned or de-platformed on social media sites,” said Representative Blaise Ingoglia during the debate over the bill.
But Democrats, libertarian groups and tech companies all say that the law violates the tech companies’ First Amendment rights to decide how to handle content on their own platforms. It also may prove impossible to bring complaints under the law because of Section 230, the legal protections for web platforms that Mr. Trump has attacked.
“It is the government telling private entities how to speak,” said Carl Szabo, the vice president at NetChoice, a trade association that includes Facebook, Google and Twitter as members. “In general, it’s a gross misreading of the First Amendment.” He said the First Amendment was designed to protect sites like Reddit from government intervention, not protect “politicians from Reddit.”
The Florida measure will likely be challenged in court, said Jeff Kosseff, a professor of cybersecurity law at the United States Naval Academy.
“I think this is the beginning of testing judges’ limits on these sorts of restrictions for social media,” he said.
When reports began to emerge on Wednesday night that the murderous leader of the Islamist terrorist group Boko Haram was dead, many Nigerians dismissed them immediately.
Over the years, the Nigerian military had announced the killing of that leader, Abubakar Shekau, several times before. And then he would show up online weeks later, taunting his supposed killers in video diatribes.
“If you have killed us, why are we still alive?” he asked in 2018, after the Nigerian military claimed to have “broken the heart and the soul” of Boko Haram, a group that has killed tens of thousands of people and displaced millions.
But this time feels different. It wasn’t the military announcing they had killed him. In fact, for hours on Wednesday night and on Thursday, the military was silent.
Mr. Shekau was most notorious for the 2014 kidnapping of the Chibok Girls, 276 schoolgirls who were abducted from their dormitories at night and whom he later vowed to “sell in the market.”
Of those girls, over 100 are missing or remain in captivity, along with many other less famous, but often even younger, victims.
One detailed account came from Bunu Bukar, secretary of the Hunters’ Association in Borno State, who has played a key role in demobilizing Boko Haram fighters and is in contact with past and present members of the group. He said that 200 heavily armed members of the Islamic State West Africa Province, or ISWAP, a rival splinter faction, descended on Mr. Shekau’s hide-out in the Sambisa forest.
“When Shekau discovered that these people are very powerful and he also realized that it’s not Nigerian army, it’s ISWAP — he just planned to use explosive devices,” Mr. Bukar said. “He wore them all and confronted them directly. When the explosion came, Shekau was in pieces. And they also lost at least 40 fighters — ISWAP fighters.”
News of the death was also reported by Ahmad Salkida, the Nigerian journalist often credited with — and sometimes criticized for — having stellar sources inside Boko Haram.
In Maiduguri, people gathered in small groups to talk about the news, but most assigned it no greater status than another rumor. Likely a false alarm.
But Mr. Shekau and his group would have an indelible effect on Mr. Hamza, who had to flee Maiduguri for two years, and his family.

“I lost a brother, a cousin and an uncle killed by Boko Haram,” he said. “Thousands of innocent people killed or displaced, especially women and children. How can God forgive such a heartless person?”

For many, particularly those connected with the country’s armed forces, if Mr. Shekau was dead, it was not necessarily a positive development overall. It could mean that ISWAP, already powerful, posed much more of a threat to Maiduguri and other garrison cities, some said.

If it really happened, “Shekau’s death is not an end to Boko Haram. It is only the beginning of another chapter in the group,” said Audu Bulama Bukarti, an expert on extremist groups in Africa at the Tony Blair Institute for Global Change.

Warfare between the factions has killed hundreds of their members previously, he said, and if that continued, they would be weakened.

“It will be two violent groups eating up themselves and that will be positive news for Nigeria,” he said. On the other hand, if the two factions teamed up, he said: “It will open an even deadlier chapter for security forces.”

It would also make it harder to win the battle of ideas, he said, as ISWAP tends to be more benign to civilians.

“Where Shekau alienated civilians with his capricious and often massive and violent seizures of cattle and grain, ISWAP has substituted a fairer, cash-based taxation of trade and agricultural production,” wrote the analyst Vincent Foucher in a recent report for the International Crisis Group.
Those who have suffered at Mr. Shekau’s hands almost hoped he had not been killed in the way it was reported on Thursday, feeling it was too easy a way out for him.
“I would have wished that he was caught alive, released to the military authorities and taken round the city of Maiduguri,” Mr. Hamza said. “We would surely have skinned him alive.”