Some of the projects that Mr. Armstrong promoted were small-time, experimental crypto ventures that eventually encountered problems. In those cases, he said, he considered himself a victim, too.
“They’re preying on the novice crypto influencer who just got popular and is trying to figure out what they should and shouldn’t be doing,” he said. “It’s hard to go from 12,000 followers to a million in one year and make all the right decisions.”
Expand Your Cryptocurrency Vocabulary
Bitcoin. A Bitcoin is a digital token that can be sent electronically from one user to another, anywhere in the world. Bitcoin is also the name of the payment network on which this form of digital currency is stored and moved.
Blockchain. A blockchain is a communally maintained database that reliably stores digital information. The original blockchain was the database on which all Bitcoin transactions were stored, but non-currency-based companies and governments are also trying to use blockchain technology to store their data.
Coinbase. The first major cryptocurrency company to list its shares on a U.S. stock exchange, Coinbase is a platform that allows people and companies to buy and sell various digital currencies, including Bitcoin, for a transaction fee.
Web3. The name “web3” is what some technologists call the idea of a new kind of internet service that is built using blockchain-based tokens, replacing centralized, corporate platforms with open protocols and decentralized, community-run networks.
DAOs. A decentralized autonomous organization, or DAO, is an organizational structure built with blockchain technology that is often described as a crypto co-op. DAOs form for a common purpose, like investing in start-ups, managing a stablecoin or buying NFTs.
Mr. Paul rose to fame as a video blogger and an occasional actor; YouTube once reprimanded him for publishing footage of a dead body he found in a Japanese forest. Over the years, he has parlayed his internet fame into an eclectic array of entrepreneurial pursuits, including a line of energy drinks.
Mr. Paul became interested in crypto last year as the market for NFTs started booming. In a recent interview, he acknowledged that he was still learning how to navigate the crypto market, even as he tried to profit from the technology. “I’m an extreme ideas person, not much of an executor,” he said.
Mr. Paul was involved in some of the initial brainstorming for the Dink Doink project. But the venture was ultimately spearheaded by one of his roommates, Jake Broido, who gave Mr. Paul 2.5 percent of the tokens that were initially issued.
In a tweet last June, Mr. Paul called it one of the “dumbest, most ridiculous” cryptocurrencies he had encountered, and circulated a video of a cartoon character singing sexually explicit lyrics. “That’s why I’m all in,” he added. He also appeared in a shaky-cam video on Telegram in which he hailed Dink Doink as possibly his favorite crypto investment.
The campaign was a flop, and Mr. Paul was pilloried by YouTube critics. The price of Dink Doink hovered well below a cent, before falling even further in value over the summer. Mr. Paul said he had never sold his tokens or profited from the project. But he said he regretted promoting the coin without disclosing his financial stake. “I definitely didn’t act as responsibly as I should have,” he said.
Many entertainment executives, tired of playing catch-up to a Silicon Valley interloper, have been waiting for the comeuppance of Netflix. But this may not have been the way they hoped it would happen.
Netflix said this week that it lost more subscribers than it signed up in the first three months of the year, reversing a decade of steady growth. The company’s shares nose-dived 35 percent on Wednesday as it shed about $50 billion in market capitalization. The pain was shared across the industry as the stock of companies like Disney, Warner Bros. Discovery and Paramount also declined.
Netflix blamed a number of issues, ranging from increased competition to its decision to drop all its subscribers in Russia because of the war in Ukraine. To entertainment executives and analysts, the moment felt decisive in the so-called streaming wars. After years of trying, they may see a chance to gain ground on their giant rival.
But Netflix’s stunning reversal also raised a number of questions that will have to be answered in the coming months as more traditional media companies race toward subscription businesses largely modeled after what Netflix created. Is there such a thing as too many streaming options? How many people are really willing to pay for them? And could this business be less profitable and far less reliable than what the industry has been doing for years?
Netflix said it planned to introduce an advertising-supported tier in the next year or two. Netflix also said it would crack down on password sharing, a practice it had previously said it had no problem with.
“We’ve been thinking about that for a couple of years, but when we were growing fast it wasn’t a high priority to work on,” Mr. Hastings said. “And now, we’re working superhard on it.”
Netflix has no advertising sales experience, while rivals like Disney, Warner Bros. Discovery and Paramount have vast advertising infrastructure. And the password crackdown led some analysts to wonder whether Netflix has already reached market saturation in the United States.
Mr. Hastings tried to reassure everyone that Netflix had been through tough times before and that it would solve its problems. He said the company was now “superfocused” on “getting back into our investors’ good graces.”
While Meta adjusts, some small businesses have begun seeking other avenues for ads. Shawn Baker, the owner of Baker SoftWash, an exterior cleaning company in Mooresville, N.C., said it previously took about $6 of Facebook ads to identify a new customer. Now it costs $27 because the ads do not find the right people, he said.
Mr. Baker has started spending $200 a month to advertise through Google’s marketing program for local businesses, which surfaces his website when people who live in the area search for cleaners. To compensate for those higher marketing costs, he has raised his prices 7 percent.
“You’re spending more money now than what you had to spend before to do the same things,” he said.
Other tech giants with first-party information are capitalizing on the change. Amazon, for example, has reams of data on its customers, including what they buy, where they reside, and what movies or TV shows they stream.
In February, Amazon disclosed the size of its advertising business — $31.2 billion in revenue in 2021 — for the first time. That makes advertising its third-largest source of sales after e-commerce and cloud computing. Amazon declined to comment.
Amber Murray, the owner of See Your Strength in St. George, Utah, which sells stickers online for people with anxiety, started experimenting with ads on Amazon after the performance of Facebook ads deteriorated. The results were remarkable, she said.
In February, she paid about $200 for Amazon to feature her products near the top of search results when customers looked up textured stickers. Sales totaled $250 a day and continued to grow, she said. When she spent $85 on a Facebook ad campaign in January, it yielded just $37.50 in sales, she said.
“I think the golden days of Facebook advertising are over,” Ms. Murray said. “On Amazon, people are looking for you, instead of you telling people what they should want.”
After the war began last month, President Volodymyr Zelensky of Ukraine turned to Mykhailo Fedorov, a vice prime minister, for a key role.
Mr. Fedorov, 31, the youngest member of Mr. Zelensky’s cabinet, immediately took charge of a parallel prong of Ukraine’s defense against Russia. He began a campaign to rally support from multinational businesses to sunder Russia from the world economy and to cut off the country from the global internet, taking aim at everything from access to new iPhones and PlayStations to Western Union money transfers and PayPal.
To achieve Russia’s isolation, Mr. Fedorov, a former tech entrepreneur, used a mix of social media, cryptocurrencies and other digital tools. On Twitter and other social media, he pressured Apple, Google, Netflix, Intel, PayPal and others to stop doing business in Russia. He helped form a group of volunteer hackers to wreak havoc on Russian websites and online services. His ministry also set up a cryptocurrency fund that has raised more than $60 million for the Ukrainian military.
The work has made Mr. Fedorov one of Mr. Zelensky’s most visible lieutenants, deploying technology and finance as modern weapons of war. In effect, Mr. Fedorov is creating a new playbook for military conflicts that shows how an outgunned country can use the internet, crypto, digital activism and frequent posts on Twitter to help undercut a foreign aggressor.
Many Western companies like McDonald’s have withdrawn from Russia, with the war’s human toll provoking horror and outrage. Economic sanctions by the United States, European Union and others have played a central role in isolating Russia.
After Mr. Zelensky was elected in 2019, he appointed Mr. Fedorov, then 28, to be minister of digital transformation, putting him in charge of digitizing Ukrainian social services. Through a government app, people could pay speeding tickets or manage their taxes. Last year, Mr. Fedorov visited Silicon Valley to meet with leaders including Tim Cook, the chief executive of Apple.
After Russia invaded Ukraine, Mr. Fedorov immediately pressured tech companies to pull out of Russia. He made the decision with Mr. Zelensky’s backing, he said, and the two men speak every day.
“I think this choice is as black and white as it ever gets,” Mr. Fedorov said. “It is time to take a side, either to take the side of peace or to take the side of terror and murder.”
On Feb. 25, he sent letters to Apple, Google and Netflix, asking them to restrict access to their services in Russia. Less than a week later, Apple stopped selling new iPhones and other products in Russia.
After Russia damaged the country’s main telecommunications infrastructure, Mr. Fedorov appealed to Elon Musk for Starlink satellite internet service. Two days after he contacted Mr. Musk, a shipment of Starlink equipment arrived in Ukraine.
Since then, Mr. Fedorov said he has periodically exchanged text messages with Mr. Musk.
Talks to revive the Iran nuclear accord were put on pause following the invasion. Russia, a signatory to the accord, has tried to use final approval of the deal as leverage to soften sanctions imposed because of the war.
But while many companies have halted business in Russia, more could be done, he said. Apple and Google should pull their app stores from Russia, he argued, and he has noted that software made by companies like SAP is still used by scores of Russian businesses.
In many instances, the Russian government is cutting itself off from the world, including blocking access to Twitter and Facebook. On Friday, Russian regulators said they would also restrict access to Instagram and called Meta an “extremist” organization.
Some civil society groups have questioned whether Mr. Fedorov’s tactics could have unintended consequences. “Shutdowns can be used in tyranny, not in democracy,” the Internet Protection Society, an internet freedom group in Russia, said in a statement earlier this week. “Any sanctions that disrupt access of Russian people to information only strengthen Putin’s regime.”
Mr. Fedorov said such measures were the only way to jolt the Russian people into action. He praised the work of Ukraine-supporting hackers who have been coordinating loosely with the Ukrainian government to hit Russian targets.
“After cruise missiles started flying over my house and over houses of many other Ukrainians, and also things started exploding, we decided to go into counter attack,” he said.
Mr. Fedorov’s work is an example of Ukraine’s whatever-it-takes attitude against a larger Russian army, said Max Chernikov, a software engineer who is supporting the volunteer group known as the IT Army of Ukraine.
“He acts like every Ukrainian — doing beyond his best,” he said.
Mr. Fedorov, who has a wife and young daughter, said he remained hopeful about the war’s outcome.
“The truth is on our side,” he added. “I’m sure we’re going to win.”
Daisuke Wakabayashi and Mike Isaac contributed reporting.
Google said on Wednesday that it was working on privacy measures meant to limit the sharing of data on smartphones running its Android software. But the company promised those changes would not be as disruptive as a similar move by Apple last year.
Apple’s changes to its iOS software on iPhones asked users for permission before allowing advertisers to track them. Apple’s permission controls — and, ultimately, the decision by users to block tracking — have had a profound impact on internet companies that built businesses on so-called targeted advertising.
Google did not provide an exact timeline for its changes, but said it would support existing technologies for at least two more years.
This month, Meta, the company founded as Facebook, said Apple’s privacy changes would cost it $10 billion this year in lost advertising revenue. The revelation weighed on Meta’s stock price and led to concerns about other companies reliant on digital advertising.
Google has already had to revamp its approach to eliminating so-called cookies, a tracking tool, on Chrome while facing resistance from privacy groups and advertisers.
Google said it was proposing some new privacy-minded approaches in Android to allow advertisers to gauge the performance of ad campaigns and show personalized ads based on past behavior or recent interests — as well as new tools to limit covert tracking through apps. Google did not offer much in terms of detail about how these new alternatives would work.
As part of the changes, Google said, it plans to phase out Advertising ID, a tracking feature within Android that helps advertisers know whether users clicked on an ad or bought a product as well as keep tabs on their interests and activities. Google said it already allowed users to opt out of personalized ads by removing the tracking identifier.
The company said it planned to eliminate identifiers used in advertising on Android for everyone — including Google. Mr. Chavez said Google’s own apps would not have special or privileged access to Android data or features, though he did not specify how that would be enforced. This echoes a pledge Google made to regulators in Britain that it would not give preferential treatment to its own products.
The company did not offer a definitive timeline for eliminating Advertising ID, but it committed to keeping the existing system in place for two years. Google said it would offer preview versions of its new proposals to advertisers, before releasing a more complete test version this year.
SAN FRANCISCO — In 2019, Facebook researchers began a new study of one of the social network’s foundational features: the Like button.
They examined what people would do if Facebook removed the distinct thumbs-up icon and other emoji reactions from posts on its photo-sharing app Instagram, according to company documents. The buttons had sometimes caused Instagram’s youngest users “stress and anxiety,” the researchers found, especially if posts didn’t get enough Likes from friends.
But the researchers discovered that when the Like button was hidden, users interacted less with posts and ads. At the same time, hiding the button did not alleviate teenagers’ social anxiety, and young users did not share more photos as the company had thought they might, leaving a mixed bag of results.
Mark Zuckerberg, Facebook’s chief executive, and other managers discussed hiding the Like button for more Instagram users, according to the documents. In the end, a larger test was rolled out in just a limited capacity to “build a positive press narrative” around Instagram.
As Facebook has confronted crisis after crisis over misinformation, privacy and hate speech, a central issue has been whether the basic way that the platform works has been at fault — essentially, the features that have made Facebook be Facebook.
Apart from the Like button, Facebook has scrutinized its share button, which lets users instantly spread content posted by other people; its groups feature, which is used to form digital communities; and other tools that define how more than 3.5 billion people behave and interact online. The research, laid out in thousands of pages of internal documents, underlines how the company has repeatedly grappled with what it has created.
What researchers found was often far from positive. Time and again, they determined that people misused key features or that those features amplified toxic content, among other effects. In an August 2019 internal memo, several researchers said it was Facebook’s “core product mechanics” — meaning the basics of how the product functioned — that had let misinformation and hate speech flourish on the site.
“The mechanics of our platform are not neutral,” they concluded.
Facebook has made some changes over the years, such as letting people hide posts they do not want to see and turning off political group recommendations to reduce the spread of misinformation.
But the core way that Facebook operates — a network where information can spread rapidly and where people can accumulate friends and followers and Likes — ultimately remains largely unchanged.
Many significant modifications to the social network were blocked in the service of growth and keeping users engaged, some current and former executives said. Facebook is valued at more than $900 billion.
“There’s a gap between the fact that you can have pretty open conversations inside of Facebook as an employee,” said Brian Boland, a Facebook vice president who left last year. “Actually getting change done can be much harder.”
The company documents are part of the Facebook Papers, a cache provided to the Securities and Exchange Commission and to Congress by a lawyer representing Frances Haugen, a former Facebook employee who has become a whistle-blower. Ms. Haugen earlier gave the documents to The Wall Street Journal. This month, a congressional staff member supplied the redacted disclosures to more than a dozen other news organizations, including The New York Times.
In a statement, Andy Stone, a Facebook spokesman, criticized articles based on the documents, saying that they were built on a “false premise.”
“Yes, we’re a business and we make profit, but the idea that we do so at the expense of people’s safety or well-being misunderstands where our own commercial interests lie,” he said. He said Facebook had invested $13 billion and hired more than 40,000 people to keep people safe, adding that the company has called “for updated regulations where democratic governments set industry standards to which we can all adhere.”
In a post this month, Mr. Zuckerberg said it was “deeply illogical” that the company would give priority to harmful content because Facebook’s advertisers don’t want to buy ads on a platform that spreads hate and misinformation.
“At the most basic level, I think most of us just don’t recognize the false picture of the company that is being painted,” he wrote.
The Foundations of Success
When Mr. Zuckerberg founded Facebook 17 years ago in his Harvard University dorm room, the site’s mission was to connect people on college campuses and bring them into digital groups with common interests and locations.
Growth exploded in 2006 when Facebook introduced the News Feed, a central stream of photos, videos and status updates posted by people’s friends. Over time, the company added more features to keep people interested in spending time on the platform.
In 2009, Facebook introduced the Like button. The tiny thumbs-up symbol, a simple indicator of people’s preferences, became one of the social network’s most important features. The company allowed other websites to adopt the Like button so users could share their interests back to their Facebook profiles.
That gave Facebook insight into people’s activities and sentiments outside of its own site, so it could better target them with advertising. Likes also signified what users wanted to see more of in their News Feeds so people would spend more time on Facebook.
Facebook also added the groups feature, where people join private communication channels to talk about specific interests, and pages, which allowed businesses and celebrities to amass large fan bases and broadcast messages to those followers.
Adam Mosseri, the head of Instagram, has said that research on users’ well-being led to investments in anti-bullying measures on Instagram.
Understand the Facebook Papers
A tech giant in trouble. The leak of internal documents by a former Facebook employee has provided an intimate look at the operations of the secretive social media company and renewed calls for better regulations of the company’s wide reach into the lives of its users.
The whistle-blower. During an interview with “60 Minutes” that aired Oct. 3, Frances Haugen, a Facebook product manager who left the company in May, revealed that she was responsible for the leak of those internal documents.
Ms. Haugen’s testimony in Congress. On Oct. 5, Ms. Haugen testified before a Senate subcommittee, saying that Facebook was willing to use hateful and harmful content on its site to keep users coming back. Facebook executives, including Mark Zuckerberg, called her accusations untrue.
The Facebook Papers. Ms. Haugen also filed a complaint with the Securities and Exchange Commission and provided the documents to Congress in redacted form. A congressional staff member then supplied the documents, known as the Facebook Papers, to several news organizations, including The New York Times.
Yet Facebook cannot simply tweak itself so that it becomes a healthier social network when so many problems trace back to core features, said Jane Lytvynenko, a senior fellow at the Harvard Kennedy Shorenstein Center, who studies social networks and misinformation.
“When we talk about the Like button, the share button, the News Feed and their power, we’re essentially talking about the infrastructure that the network is built on top of,” she said. “The crux of the problem here is the infrastructure itself.”
As Facebook’s researchers dug into how its products worked, the worrisome results piled up.
In a July 2019 study of groups, researchers traced how members in those communities could be targeted with misinformation. The starting point, the researchers said, was people known as “invite whales,” who sent invitations out to others to join a private group.
These people were effective at getting thousands to join new groups so that the communities ballooned almost overnight, the study said. Then the invite whales could spam the groups with posts promoting ethnic violence or other harmful content, according to the study.
Another 2019 report looked at how some people accrued large followings on their Facebook pages, often using posts about cute animals and other innocuous topics. But once a page had grown to tens of thousands of followers, the founders sold it. The buyers then used the pages to show followers misinformation or politically divisive content, according to the study.
As researchers studied the Like button, executives considered hiding the feature on Facebook as well, according to the documents. In September 2019, Facebook removed Likes from users’ posts in a small experiment in Australia.
The company wanted to see if the change would reduce pressure and social comparison among users. That, in turn, might encourage people to post more frequently to the network.
But people did not share more posts after the Like button was removed. Facebook chose not to roll the test out more broadly, noting, “Like counts are extremely low on the long list of problems we need to solve.”
Last year, company researchers also evaluated the share button. In a September 2020 study, a researcher wrote that the button and so-called reshare aggregation units in the News Feed, which are automatically generated clusters of posts that have already been shared by people’s friends, were “designed to attract attention and encourage engagement.”
But left unchecked, the features could “serve to amplify bad content and sources,” such as bullying and borderline nudity posts, the researcher said.
That’s because the features made people less hesitant to share posts, videos and messages with one another. In fact, users were three times more likely to share any kind of content from the reshare aggregation units, the researcher said.
One post that spread widely this way was an undated message from an account called “The Angry Patriot.” The post notified users that people protesting police brutality were “targeting a police station” in Portland, Ore. After it was shared through reshare aggregation units, hundreds of hate-filled comments flooded in. It was an example of “hate bait,” the researcher said.
A common thread in the documents was how Facebook employees argued for changes in how the social network worked and often blamed executives for standing in the way.
In an August 2020 internal post, a Facebook researcher criticized the recommendation system that suggests pages and groups for people to follow and said it can “very quickly lead users down the path to conspiracy theories and groups.”
“Out of fears over potential public and policy stakeholder responses, we are knowingly exposing users to risks of integrity harms,” the researcher wrote. “During the time that we’ve hesitated, I’ve seen folks from my hometown go further and further down the rabbit hole” of conspiracy theory movements like QAnon and anti-vaccination and Covid-19 conspiracies.
The researcher added, “It has been painful to observe.”
Reporting was contributed by Davey Alba, Sheera Frenkel, Cecilia Kang and Ryan Mac.
Facebook knew that an ad intended for a 13-year-old was likely to capture younger children who wanted to mimic their older siblings and friends, one person said. Managers told employees that Facebook did everything it could to stop underage users from joining Instagram, but that it could not be helped if they signed up anyway.
In September 2018, Kevin Systrom and Mike Krieger, Instagram’s founders, left Facebook after clashing with Mr. Zuckerberg. Mr. Mosseri, a longtime Facebook executive, was appointed to helm Instagram.
With the leadership changes, Facebook went all out to turn Instagram into a main attraction for young audiences, four former employees said. That coincided with the realization that Facebook itself, which was grappling with data privacy and other scandals, would never be a teen destination, the people said.
Instagram began concentrating on the “teen time spent” data point, three former employees said. The goal was to drive up the amount of time that teenagers were on the app with features including Instagram Live, a broadcasting tool, and Instagram TV, where people upload videos that run as long as an hour.
Instagram also increased its global marketing budget. In 2018, it allocated $67.2 million to marketing. In 2019, that increased to a planned $127.3 million, then to $186.3 million last year and $390 million this year, according to the internal documents. Most of the budgets were designated to wooing teens, the documents show. Mr. Mosseri approved the budgets, two employees said.
The money was slated for marketing categories like “establishing Instagram as the favorite place for teens to express themselves” and cultural programs for events like the Super Bowl, according to the documents.
Many of the resulting ads were digital, featuring some of the platform’s top influencers, such as Donté Colley, a Canadian dancer and creator. The marketing, when put into action, also targeted parents of teenagers and people up to the age of 34.
The changes have involved Facebook executives from its marketing, communications, policy and integrity teams. Alex Schultz, a 14-year company veteran who was named chief marketing officer last year, has also been influential in the image reshaping effort, said five people who worked with him. But at least one of the decisions was driven by Mr. Zuckerberg, and all were approved by him, three of the people said.
Joe Osborne, a Facebook spokesman, denied that the company had changed its approach.
“People deserve to know the steps we’re taking to address the different issues facing our company — and we’re going to share those steps widely,” he said in a statement.
For years, Facebook executives have chafed at how their company appeared to receive more scrutiny than Google and Twitter, current and former employees said. They attributed that attention to Facebook leaving itself more exposed, through its public apologies and the access it provided to internal data, the people said.
So in January, executives held a virtual meeting and broached the idea of a more aggressive defense, one attendee said. The group discussed using the News Feed to promote positive news about the company, as well as running ads that linked to favorable articles about Facebook. They also debated how to define a pro-Facebook story, two participants said.
That same month, the communications team discussed ways for executives to be less conciliatory when responding to crises and decided there would be less apologizing, said two people with knowledge of the plan.
Mr. Zuckerberg, who had become intertwined with policy issues including the 2020 election, also wanted to recast himself as an innovator, the people said. In January, the communications team circulated a document with a strategy for distancing Mr. Zuckerberg from scandals, partly by focusing his Facebook posts and media appearances on new products, they said.
The Information, a tech news site, previously reported on the document.
The impact was immediate. On Jan. 11, Sheryl Sandberg, Facebook’s chief operating officer — and not Mr. Zuckerberg — told Reuters that the storming of the U.S. Capitol a week earlier had little to do with Facebook. In July, when President Biden said the social network was “killing people” by spreading Covid-19 misinformation, Guy Rosen, Facebook’s vice president for integrity, disputed the characterization in a blog post and pointed out that the White House had missed its coronavirus vaccination goals.
“The internet is answering a question that it’s been wrestling with for decades, which is: How is the internet going to pay for itself?” he said.
The fallout may hurt brands that relied on targeted ads to get people to buy their goods. It may also initially hurt tech giants like Facebook — but not for long. Instead, businesses that can no longer track people but still need to advertise are likely to spend more with the largest tech platforms, which still have the most data on consumers.
David Cohen, chief executive of the Interactive Advertising Bureau, a trade group, said the changes would continue to “drive money and attention to Google, Facebook, Twitter.”
The shifts are complicated by Google’s and Apple’s opposing views on how much ad tracking should be dialed back. Apple wants its customers, who pay a premium for its iPhones, to have the right to block tracking entirely. But Google executives have suggested that Apple has turned privacy into a privilege for those who can afford its products.
For many people, that means the internet may start looking different depending on the products they use. On Apple gadgets, ads may be only somewhat relevant to a person’s interests, compared with highly targeted promotions inside Google’s web. Website creators may eventually choose sides, so some sites that work well in Google’s browser might not even load in Apple’s browser, said Brendan Eich, a founder of Brave, the private web browser.
“It will be a tale of two internets,” he said.
Businesses that do not keep up with the changes risk getting run over. Increasingly, media publishers and even apps that show the weather are charging subscription fees, in the same way that Netflix levies a monthly fee for video streaming. Some e-commerce sites are considering raising product prices to keep their revenues up.
Consider Seven Sisters Scones, a mail-order pastry shop in Johns Creek, Ga., which relies on Facebook ads to promote its items. Nate Martin, who leads the bakery’s digital marketing, said that after Apple blocked some ad tracking, its digital marketing campaigns on Facebook became less effective. Because Facebook could no longer get as much data on which customers like baked goods, it was harder for the store to find interested buyers online.
Facebook has approached academics and policy experts about forming a commission to advise it on global election-related matters, said five people with knowledge of the discussions, a move that would allow the social network to shift some of its political decision-making to an advisory body.
The proposed commission could decide on matters such as the viability of political ads and what to do about election-related misinformation, said the people, who spoke on the condition of anonymity because the discussions were confidential. Facebook is expected to announce the commission this fall in preparation for the 2022 midterm elections, they said, though the effort is preliminary and could still fall apart.
Outsourcing election matters to a panel of experts could help Facebook sidestep criticism of bias by political groups, two of the people said. The company has been blasted in recent years by conservatives, who have accused Facebook of suppressing their voices, as well as by civil rights groups and Democrats for allowing political misinformation to fester and spread online. Mark Zuckerberg, Facebook’s chief executive, does not want to be seen as the sole decision maker on political content, two of the people said.
The proposal follows Facebook's creation of the Oversight Board, a collection of journalism, legal and policy experts who adjudicate whether the company was correct to remove certain posts from its platforms. Facebook has pushed some content decisions to the Oversight Board for review, allowing it to show that it does not make determinations on its own.
Facebook appointed the Oversight Board's members and pays them through a trust.
The Oversight Board’s highest-profile decision was reviewing Facebook’s suspension of former President Donald J. Trump after the Jan. 6 storming of the U.S. Capitol. At the time, Facebook opted to ban Mr. Trump’s account indefinitely, a penalty that the Oversight Board later deemed “not appropriate” because the time frame was not based on any of the company’s rules. The board asked Facebook to try again.
In June, Facebook responded by saying that it would bar Mr. Trump from the platform for at least two years. The Oversight Board has separately weighed in on more than a dozen other content cases that it calls “highly emblematic” of broader themes that Facebook grapples with regularly, including whether certain Covid-related posts should remain up on the network and hate speech issues in Myanmar.
A spokesman for the Oversight Board declined to comment.
Facebook has had a spotty track record on election-related issues, going back to Russian manipulation of the platform’s advertising and posts in the 2016 presidential election.
Ahead of the 2020 presidential election, the company said it would bar the purchase of new political ads the week before the election, then later decided to temporarily ban all U.S. political advertising after the polls closed on Election Day, causing an uproar among candidates and ad-buying firms.
The company has struggled with how to handle lies and hate speech around elections. During his last year in office, Mr. Trump used Facebook to suggest he would use state violence against protesters in Minneapolis ahead of the 2020 election, while casting doubt on the electoral process as votes were tallied in November. Facebook initially said that what political leaders posted was newsworthy and should not be touched, before later reversing course.
The social network has also faced difficulties in elections elsewhere, including the proliferation of targeted disinformation across its WhatsApp messaging service during the Brazilian presidential election in 2018. In 2019, Facebook removed hundreds of misleading pages and accounts associated with political parties in India ahead of the country’s national elections.
Facebook has tried various methods to stem the criticisms. It established a political ads library to increase transparency around buyers of those promotions. It also has set up war rooms to monitor elections for disinformation to prevent interference.
There are several elections in the coming year in countries such as Hungary, Germany, Brazil and the Philippines where Facebook’s actions will be closely scrutinized. Voter fraud misinformation has already begun spreading ahead of German elections in September. In the Philippines, Facebook has removed networks of fake accounts that support President Rodrigo Duterte, who used the social network to gain power in 2016.
“There is already this perception that Facebook, an American social media company, is going in and tilting elections of other countries through its platform,” said Nathaniel Persily, a law professor at Stanford University. “Whatever decisions Facebook makes have global implications.”
Internal conversations around an election commission date back to at least a few months ago, said three people with knowledge of the matter.
An election commission would differ from the Oversight Board in one key way, the people said. While the Oversight Board waits for Facebook to remove a post or an account and then reviews that action, the election commission would proactively provide guidance without the company having made an earlier call, they said.
Tatenda Musapatike, who previously worked on elections at Facebook and now runs a nonprofit voter registration organization, said that many have lost faith in the company's ability to work with political campaigns. But the election commission proposal was "a good step," she said, because "they're doing something and they're not saying we alone can handle it."