On the morning of July 8, former President Donald J. Trump took to Truth Social, a social media platform he founded with people close to him, to claim that he had in fact won the 2020 presidential vote in Wisconsin, despite all evidence to the contrary.
Barely 8,000 people shared that missive on Truth Social, a far cry from the hundreds of thousands of responses his posts on Facebook and Twitter had regularly generated before those services suspended his megaphones after the deadly riot on Capitol Hill on Jan. 6, 2021.
And yet Mr. Trump’s baseless claim pulsed through the public consciousness anyway. It jumped from his app to other social media platforms, as well as podcasts, talk radio and television.
Within 48 hours of Mr. Trump’s post, more than one million people saw his claim on at least a dozen other media sites. It appeared on Facebook and Twitter, from which he has been banished, but also on YouTube, Gab, Parler and Telegram, according to an analysis by The New York Times.
Such election misinformation has gone mainstream among Republican Party members, driving state and county officials to impose new restrictions on casting ballots, often based on conspiracy theories percolating in right-wing media.
Voters must now sift through not only an ever-growing torrent of lies and falsehoods about candidates and their policies, but also information on when and where to vote. Officials appointed or elected in the name of fighting voter fraud have put themselves in the position to refuse to certify outcomes that are not to their liking.
TikTok has become a primary battleground in today’s fight against disinformation. A report last month by NewsGuard, an organization that tracks the problem online, showed that nearly 20 percent of videos presented as search results on TikTok contained false or misleading information on topics such as school shootings and Russia’s war in Ukraine.
continued to amplify “election denialism” in ways that undermined trust in the democratic system.
Another challenge is the proliferation of alternative platforms for those falsehoods and even more extreme views.
A new survey by the Pew Research Center found that 15 percent of prominent accounts on seven such alternative platforms had previously been banished from others like Twitter and Facebook.
The F.B.I. raid on Mar-a-Lago thrust his latest pronouncements into the eye of the political storm once again.
A study of Truth Social by Media Matters for America, a left-leaning media monitoring group, examined how the platform had become a home for some of the most fringe conspiracy theories. Mr. Trump, who began posting on the platform in April, has increasingly amplified content from QAnon, the online conspiracy theory.
He has shared posts from QAnon accounts more than 130 times. QAnon believers promote a vast and complex conspiracy that centers on Mr. Trump as a leader battling a cabal of Democratic Party pedophiles. Echoes of such views reverberated through Republican election campaigns across the country during this year’s primaries.
Ms. Jankowicz, the disinformation expert, said the nation’s social and political divisions had fueled the waves of disinformation.
The controversies over how best to respond to the Covid-19 pandemic deepened distrust of government and medical experts, especially among conservatives. Mr. Trump’s refusal to accept the outcome of the 2020 election led to, but did not end with, the Capitol Hill violence.
“They should have brought us together,” Ms. Jankowicz said, referring to the pandemic and the riots. “I thought perhaps they could be kind of this convening power, but they were not.”
Ye, the rapper formerly known as Kanye West, has set off one controversy after another in the last week, first at his fashion show and then on social media, prompting accusations of racism and antisemitism.
On Monday, at Paris Fashion Week, he debuted a T-shirt for his fashion line bearing the phrase “White Lives Matter.” On Friday, he suggested on Instagram that Sean Combs, the rapper known as Diddy, was being controlled by Jewish people. Ye’s account was restricted by Instagram that day.
Early on Sunday morning, he went on Twitter and lashed out against Jewish people in a series of tweets.
Ye tweeted that he would soon go “death con 3 On JEWISH PEOPLE,” an apparent reference to the United States’ defense readiness condition, known as DEFCON.
In a separate tweet, Ye accused Mark Zuckerberg, the chief executive of Meta, which owns Instagram, of removing him from Instagram.
“Who you think created cancel culture?” he added in another tweet.
In a statement, a spokeswoman for Twitter said Ye’s account was locked for violating Twitter’s policies. A spokeswoman for Meta said it places restrictions on accounts that repeatedly break its rules.
Representatives for Ye could not immediately be reached for comment.
The restrictions on Twitter and Instagram mean that Ye’s account is still active, but that the rapper cannot post for an undisclosed period.
Ye had returned to Twitter on Saturday after not posting for nearly two years.
The posts were yet another test of social media companies’ willingness to monitor content that is perceived as hateful.
called “White Lives Matter” a hateful phrase used by white supremacists.
At first, Ye appeared to relish the T-shirt controversy, writing on Instagram that “my one t-shirt took allllll the attention.”
But outrage continued to build online among several artists, including Mr. Combs, who criticized the design in a video on Instagram.
“Don’t wear the shirt. Don’t buy the shirt. Don’t play with the shirt,” Mr. Combs said. “It’s not a joke.”
On Thursday, Adidas said it would put its partnership with Yeezy “under review.” (Ye ended his partnership with Gap last month.)
On Friday, Ye posted screenshots from a text message exchange with Mr. Combs to his Instagram account, where he suggested that Mr. Combs was being controlled by Jewish people. The comments were called antisemitic by several Jewish groups.
Elon Musk, who has agreed to buy the social media company for $44 billion and could loosen its content moderation policies, replied to the tweet.
“Welcome back to Twitter, my friend!” Mr. Musk wrote.
In January 2019, Mr. Russell went public with Molly’s story. Outraged that his young daughter could view such bleak content so easily and convinced that it had played a role in her death, he sat for a TV interview with the BBC that resulted in front-page stories across British newsstands.
Mr. Russell, a television director, urged the coroner reviewing Molly’s case, Andrew Walker, to go beyond what is often a formulaic process and to explore the role of social media. Mr. Walker agreed after seeing a sample of Molly’s social media history.
That resulted in a yearslong effort to get access to Molly’s social media data. The family did not know her iPhone passcode, but the London police were able to bypass it to extract 30,000 pages of material. After a lengthy battle, Meta agreed to provide more than 16,000 pages from her Instagram, such a volume that it delayed the start of the inquest. Merry Varney, a lawyer with the Leigh Day law firm who worked on the case through a legal aid program, said it had taken more than 1,000 hours to review the content.
What they found was that Molly had lived something of a double life. While she was a regular teenager to family, friends and teachers, her existence online was much bleaker.
In the six months before Molly died, she shared, liked or saved 16,300 pieces of content on Instagram. About 2,100 of those posts, or about 12 per day, were related to suicide, self-harm and depression, according to data that Meta disclosed to her family. Many accounts she interacted with were dedicated to sharing only depressive and suicidal material, often using hashtags that linked to other explicit content.
Many posts glorified inner struggle, hiding emotional duress and telling others “I’m fine.” Molly went on binges of liking and saving graphic depictions of suicide and self-harm, once after 3 a.m., according to a timeline of her Instagram usage.
SAN FRANCISCO — Mark Zuckerberg, the founder and chief executive of the company formerly known as Facebook, called his top lieutenants for the social network to a last-minute meeting in the San Francisco Bay Area this month. On the agenda: a “work-athon” to discuss the road map for improving the main Facebook app, including a revamp that would change how users browse the service.
For weeks beforehand, Mr. Zuckerberg had sent his executives messages about the overhaul, pressing them to increase the velocity and execution of their work, people with knowledge of the matter said. Some executives — who had to read a 122-page slide deck about the changes — were beginning to sweat at the unusual level of intensity, they said.
Facebook’s leaders flew in from around the world for the summit, the people said, and Mr. Zuckerberg and the group pored over each slide. Within days, the team unveiled an update to the Facebook app to better compete with a top rival, TikTok.
Mr. Zuckerberg has trimmed perks, reshuffled his leadership team and made it clear he would cut low-performing employees. Those who are not on board are welcome to leave, he has said. Managers have sent out memos to convey the seriousness of the approach — one, which was shared with The New York Times, had the title “Operating With Increased Intensity.”
Driving that intensity is Mr. Zuckerberg’s bet on the so-called metaverse. Across Silicon Valley, he and other executives who built what many refer to as Web 2.0 — a more social, app-focused version of the internet — are rethinking and upending their original vision after their platforms were plagued by privacy stumbles, toxic content and misinformation.
The moment is reminiscent of other bet-the-company gambles, such as when Netflix killed off its DVD-mailing business last decade to focus on streaming. But Mr. Zuckerberg is making these moves as Meta’s back is against the wall. The company is staring down the barrel of a global recession. Competitors like TikTok, YouTube and Apple are bearing down.
And success is far from guaranteed. In recent months, Meta’s profits have fallen and revenue has slowed as the company has spent lavishly on the metaverse and as the economic slowdown has hurt its advertising business. Its stock has plunged.
“When Mark gets super focused on something, it becomes all hands on deck within the company,” said Katie Harbath, a former Facebook policy director and the founder of Anchor Change, a consulting firm that works on tech and democracy issues. “Teams will quickly drop other work to pivot to the issue at hand, and the pressure is intense to move fast to show progress.”
Mr. Zuckerberg promoted Andrew Bosworth, who is known as Boz, to chief technology officer, leading hardware efforts for the metaverse. He promoted other loyalists, too, including Javier Olivan, the new chief operating officer; Nick Clegg, who became president of global affairs; and Guy Rosen, who took on a new role of chief information security officer.
In June, Sheryl Sandberg, who was Mr. Zuckerberg’s No. 2 for 14 years, said she would step down this fall. While she spent more than a decade building Facebook’s advertising systems, she was less interested in doing the same for the metaverse, people familiar with her plans have said.
Mr. Zuckerberg has moved thousands of workers into different teams for the metaverse, training their focus on aspirational projects like hardware glasses, wearables and a new operating system for those devices.
“It’s an existential bet on where people over the next decade will connect, express and identify with one another,” said Matthew Ball, a longtime tech executive and the author of a book on the metaverse. “If you have the cash, the engineers, the users and the conviction to take a swing at that, then you should.”
But the efforts are far from cheap. Facebook’s Reality Labs division, which is building augmented and virtual reality products, has dragged down the company’s balance sheet; the hardware unit lost nearly $3 billion in the first quarter alone.
Meta has also grappled with privacy changes from Apple that have hampered its ability to measure the effectiveness of ads on iPhones. TikTok, the Chinese-owned video app, has stolen young audiences from Meta’s core apps like Instagram and Facebook. These challenges are coinciding with a brutal macroeconomic environment, which has pushed Apple, Google, Microsoft and Twitter to freeze or slow hiring.
In a memo last month, Chris Cox, Meta’s chief product officer, said the economic environment called for “leaner, meaner, better executing teams.”
In an employee meeting around the same time, Mr. Zuckerberg said he knew that not everyone would be on board for the changes. That was fine, he told employees.
“I think some of you might decide that this place isn’t for you, and that self-selection is OK with me,” Mr. Zuckerberg said. “Realistically, there are probably a bunch of people at the company who shouldn’t be here.”
Another memo circulated internally among workers this month was titled “Operating With Increased Intensity.” In the memo, a Meta vice president said managers should begin to “think about every person on their team and the value they are adding.”
“If a direct report is coasting or a low performer, they are not who we need; they are failing this company,” the memo said. “As a manager, you cannot allow someone to be net neutral or negative for Meta.”
investment priorities” for the company in the second half of this year.
other prototypes. Bloomberg reported earlier on the smart watch.
Mr. Zuckerberg posted an update to his Facebook profile, noting some coming changes in the app. Facebook would start pushing people into a more video-heavy feed with more suggested content, emulating how TikTok operates.
Meta has been investing heavily in video and discovery, aiming to beef up its artificial intelligence and to improve “discovery algorithms” that suggest engaging content to users without them having to work to find it.
In the past, Facebook has tested major product updates with a few English-speaking audiences to see how they perform before rolling them out more widely. But, this time, the 2.93 billion people around the world who use the social networking app will receive the update simultaneously.
It is a sign, some Meta employees said, of just how much Mr. Zuckerberg means business.
Mark Zuckerberg, Facebook’s chief executive, made securing the 2020 U.S. election a top priority. He met regularly with an election team, which included more than 300 people from across his company, to prevent misinformation from spreading on the social network. He asked civil rights leaders for advice on upholding voter rights.
The core election team at Facebook, which was renamed Meta last year, has since been dispersed. Roughly 60 people are now focused primarily on elections, while others split their time on other projects. They meet with another executive, not Mr. Zuckerberg. And the chief executive has not talked recently with civil rights groups, even as some have asked him to pay more attention to the midterm elections in November.
Safeguarding elections is no longer Mr. Zuckerberg’s top concern, said four Meta employees with knowledge of the situation. Instead, he is focused on transforming his company into a provider of the immersive world of the metaverse, which he sees as the next frontier of growth, said the people, who were not authorized to speak publicly.
Congressional hearings on the Jan. 6 Capitol riot have underlined how precarious elections can be. And dozens of political candidates are running this November on the false premise that former President Donald J. Trump was robbed of the 2020 election, with social media platforms continuing to be a key way to reach American voters.
“2000 Mules,” a film that falsely claims the 2020 election was stolen from Mr. Trump, was widely shared on Facebook and Instagram, garnering more than 430,000 interactions, according to an analysis by The New York Times. In posts about the film, commenters said they expected election fraud this year and warned against using mail-in voting and electronic voting machines.
$44 billion sale to Elon Musk, three employees with knowledge of the situation said. Mr. Musk has suggested that he wants fewer rules about what can and cannot be posted on the service.
Meta, which barred Mr. Trump from its platforms after the riot at the U.S. Capitol on Jan. 6, 2021, has worked over the years to limit political falsehoods on its sites. Tom Reynolds, a Meta spokesman, said the company had “taken a comprehensive approach to how elections play out on our platforms since before the U.S. 2020 elections and through the dozens of global elections since then.”
recently raised doubts about the country’s electoral process. Latvia, Bosnia and Slovenia are also holding elections in October.
“People in the U.S. are almost certainly getting the Rolls-Royce treatment when it comes to any integrity on any platform, especially for U.S. elections,” said Sahar Massachi, the executive director of the think tank Integrity Institute and a former Facebook employee. “And so however bad it is here, think about how much worse it is everywhere else.”
Facebook’s role in potentially distorting elections became evident after 2016, when Russian operatives used the site to spread inflammatory content and divide American voters in the U.S. presidential election. In 2018, Mr. Zuckerberg testified before Congress that election security was his top priority.
The company also cracked down on conspiracy theories, banning QAnon posts and groups in October 2020.
Around the same time, Mr. Zuckerberg and his wife, Priscilla Chan, donated $400 million to local governments to fund poll workers, pay for rental fees for polling places, provide personal protective equipment and cover other administrative costs.
The week before the November 2020 election, Meta also froze all political advertising to limit the spread of falsehoods.
But while there were successes — the company kept foreign election interference off the platform — it struggled with how to handle Mr. Trump, who used his Facebook account to amplify false claims of voter fraud. After the Jan. 6 riot, Facebook barred Mr. Trump from posting. He is eligible for reinstatement in January.
Frances Haugen, a Facebook employee turned whistle-blower, filed complaints with the Securities and Exchange Commission accusing the company of removing election safety features too soon after the 2020 election. Facebook made growth and engagement its priorities over security, she said.
Mr. Zuckerberg no longer meets weekly with those focused on election security, said the four employees, though he receives their reports. Instead, they meet with Nick Clegg, Meta’s president of global affairs.
Several civil rights groups said they had noticed Meta’s shift in priorities. Mr. Zuckerberg isn’t as involved in discussions with them as he once was, nor are other top Meta executives, they said.
“I’m concerned,” said Derrick Johnson, president of the National Association for the Advancement of Colored People, who talked with Mr. Zuckerberg and Sheryl Sandberg, Meta’s chief operating officer, ahead of the 2020 election. “It appears to be out of sight, out of mind.” (Ms. Sandberg has announced that she will leave Meta this fall.)
A coalition of advocacy groups wrote a letter to Mr. Zuckerberg and the chief executives of YouTube, Twitter, Snap and other platforms, calling on them to take down posts about the lie that Mr. Trump won the 2020 election and to slow the spread of election misinformation before the midterms.
Yosef Getachew, a director at the nonprofit public advocacy organization Common Cause, whose group studied 2020 election misinformation on social media, said the companies had not responded.
“The Big Lie is front and center in the midterms with so many candidates using it to pre-emptively declare that the 2022 election will be stolen,” he said, pointing to recent tweets from politicians in Michigan and Arizona who falsely said dead people cast votes for Democrats. “Now is not the time to stop enforcing against the Big Lie.”
While Meta adjusts, some small businesses have begun seeking other avenues for ads. Shawn Baker, the owner of Baker SoftWash, an exterior cleaning company in Mooresville, N.C., said it previously took about $6 of Facebook ads to identify a new customer. Now it costs $27 because the ads do not find the right people, he said.
Mr. Baker has started spending $200 a month to advertise through Google’s marketing program for local businesses, which surfaces his website when people who live in the area search for cleaners. To compensate for those higher marketing costs, he has raised his prices 7 percent.
“You’re spending more money now than what you had to spend before to do the same things,” he said.
Other tech giants with first-party information are capitalizing on the change. Amazon, for example, has reams of data on its customers, including what they buy, where they reside, and what movies or TV shows they stream.
In February, Amazon disclosed the size of its advertising business — $31.2 billion in revenue in 2021 — for the first time. That makes advertising its third-largest source of sales after e-commerce and cloud computing. Amazon declined to comment.
Amber Murray, the owner of See Your Strength in St. George, Utah, which sells stickers online for people with anxiety, started experimenting with ads on Amazon after the performance of Facebook ads deteriorated. The results were remarkable, she said.
In February, she paid about $200 for Amazon to feature her products near the top of search results when customers looked up textured stickers. Sales totaled $250 a day and continued to grow, she said. When she spent $85 on a Facebook ad campaign in January, it yielded just $37.50 in sales, she said.
“I think the golden days of Facebook advertising are over,” Ms. Murray said. “On Amazon, people are looking for you, instead of you telling people what they should want.”
Meta, which owns Facebook and Instagram, took an unusual step last week: It suspended some of the quality controls that ensure that posts from users in Russia, Ukraine and other Eastern European countries meet its rules.
Under the change, Meta temporarily stopped tracking whether its workers who monitor Facebook and Instagram posts from those areas were accurately enforcing its content guidelines, six people with knowledge of the situation said. That’s because the workers could not keep up with shifting rules about what kinds of posts were allowed about the war in Ukraine, they said.
Meta has made more than half a dozen content policy revisions since Russia invaded Ukraine last month. The company has permitted posts about the conflict that it would normally have taken down — including some calling for the death of President Vladimir V. Putin of Russia and violence against Russian soldiers — before changing its mind or drawing up new guidelines, the people said.
The result has been internal confusion, especially among the content moderators who patrol Facebook and Instagram for text and images with gore, hate speech and incitements to violence. Meta has sometimes shifted its rules on a daily basis, causing whiplash, said the people, who were not authorized to speak publicly.
Meta has also contended with pressure from Russian and Ukrainian authorities over the information battle about the conflict. And internally, it has dealt with discontent about its decisions, including from Russian employees concerned for their safety and Ukrainian workers who want the company to be tougher on Kremlin-affiliated organizations online, three people said.
Meta has weathered international strife before — including the genocide of a Muslim minority in Myanmar last decade and skirmishes between India and Pakistan — with varying degrees of success. Now the largest conflict on the European continent since World War II has become a litmus test of whether the company has learned to police its platforms during major global crises — and so far, it appears to remain a work in progress.
“All the ingredients of the Russia-Ukraine conflict have been around for a long time: the calls for violence, the disinformation, the propaganda from state media,” said David Kaye, a law professor at the University of California, Irvine, and a former special rapporteur to the United Nations. “What I find mystifying was that they didn’t have a game plan to deal with it.”
Dani Lever, a Meta spokeswoman, declined to directly address how the company was handling content decisions and employee concerns during the war.
After Russia invaded Ukraine, Meta said it established a round-the-clock special operations team staffed by employees who are native Russian and Ukrainian speakers. It also updated its products to aid civilians in the war, including features that direct Ukrainians toward reliable, verified information to locate housing and refugee assistance.
Mark Zuckerberg, Meta’s chief executive, and Sheryl Sandberg, the chief operating officer, have been directly involved in the response to the war, said two people with knowledge of the efforts. But as Mr. Zuckerberg focuses on transforming Meta into a company that will lead the digital worlds of the so-called metaverse, many responsibilities around the conflict have fallen — at least publicly — to Nick Clegg, the president for global affairs.
announced that Meta would restrict access within the European Union to the pages of Russia Today and Sputnik, which are Russian state-controlled media, following requests by Ukraine and other European governments. Russia retaliated by cutting off access to Facebook inside the country, claiming the company discriminated against Russian media, and then blocking Instagram.
This month, President Volodymyr Zelensky of Ukraine praised Meta for moving quickly to limit Russian war propaganda on its platforms. Meta also acted rapidly to remove an edited “deepfake” video from its platforms that falsely featured Mr. Zelensky yielding to Russian forces.
Meta also allowed a group called the Ukrainian Legion to run ads on its platforms this month to recruit “foreigners” for the Ukrainian army, a violation of international laws. It later removed the ads — which were shown to people in the United States, Ireland, Germany and elsewhere — because the group may have misrepresented ties to the Ukrainian government, according to Meta.
Internally, Meta had also started changing its content policies to deal with the fast-moving nature of posts about the war. The company has long forbidden posts that might incite violence. But on Feb. 26, two days after Russia invaded Ukraine, Meta informed its content moderators — who are typically contractors — that it would allow calls for the death of Mr. Putin and “calls for violence against Russians and Russian soldiers in the context of the Ukraine invasion,” according to the policy changes, which were reviewed by The New York Times.
Reuters reported on Meta’s shifts with a headline that suggested that posts calling for violence against all Russians would be tolerated. In response, Russian authorities labeled Meta’s activities as “extremist.”
Shortly thereafter, Meta reversed course and said it would not let its users call for the deaths of heads of state.
“Circumstances in Ukraine are fast moving,” Mr. Clegg wrote in an internal memo that was reviewed by The Times and first reported by Bloomberg. “We try to think through all the consequences, and we keep our guidance under constant review because the context is always evolving.”
Meta amended other policies. This month, it made a temporary exception to its hate speech guidelines so users could post about the “removal of Russians” and “explicit exclusion against Russians” in 12 Eastern European countries, according to internal documents. But within a week, Meta tweaked the rule to note that it should be applied only to users in Ukraine.
The constant adjustments left moderators who oversee users in Central and Eastern European countries confused, the six people with knowledge of the situation said.
The policy changes were onerous because moderators were generally given less than 90 seconds to decide whether images of dead bodies, videos of limbs being blown off, or outright calls to violence violated Meta’s rules, they said. In some instances, they added, moderators were shown posts about the war in Chechen, Kazakh or Kyrgyz, despite not knowing those languages.
Ms. Lever declined to comment on whether Meta had hired content moderators who specialize in those languages.
At a company meeting, some employees asked why Meta had waited so long to take action against Russia Today and Sputnik, said two people who attended. Russian state activity was at the center of Facebook’s failure to protect the 2016 U.S. presidential election, they said, and it didn’t make sense that those outlets had continued to operate on Meta’s platforms.
While Meta has no employees in Russia, the company held a separate meeting this month for workers with Russian connections. Those employees said they were concerned that Moscow’s actions against the company would affect them, according to an internal document.
In discussions on Meta’s internal forums, which were viewed by The Times, some Russian employees said they had erased their place of work from their online profiles. Others wondered what would happen if they worked in the company’s offices in places with extradition treaties to Russia and “what kind of risks will be associated with working at Meta not just for us but our families.”
Ms. Lever said Meta’s “hearts go out to all of our employees who are affected by the war in Ukraine, and our teams are working to make sure they and their families have the support they need.”
At a separate company meeting this month, some employees voiced unhappiness with the changes to the speech policies during the war, according to an internal poll. Some asked if the new rules were necessary, describing the changes as “a slippery slope” that was “being used as proof that Westerners hate Russians.”
Others asked about the effect on Meta’s business. “Will Russian ban affect our revenue for the quarter? Future quarters?” read one question. “What’s our recovery strategy?”
Mr. Zuckerberg has since turned to Mr. Bosworth for major initiatives. In 2012, Mr. Bosworth was given the task of building out Facebook’s mobile advertising products. After management issues at the Oculus virtual reality division, Mr. Zuckerberg dispatched Mr. Bosworth in August 2017 to take over the initiative. The virtual reality business was later rebranded Reality Labs.
In October, the company said it would create 10,000 metaverse-related jobs in the European Union over the next five years. That same month, Mr. Zuckerberg announced he was changing Facebook’s name to Meta and pledged billions of dollars to the effort.
Reality Labs is now at the forefront of the company’s shift to the metaverse, employees said. Workers in products, engineering and research have been encouraged to apply to new roles there, they said, while others have been elevated from their jobs in social networking divisions to lead the same functions with a metaverse emphasis.
Of the more than 3,000 open jobs listed on Meta’s website, more than 24 percent are now for roles in augmented or virtual reality. The jobs are in cities including Seattle, Shanghai and Zurich. One job listing for a “gameplay engineering manager” for Horizon, the company’s free virtual reality game, said the candidate’s responsibilities would include imagining new ways to experience concerts and conventions.
Internal recruitment for the metaverse ramped up late last year, three Meta engineers said, with their managers mentioning job openings on metaverse-related teams in December and January. Others who didn’t get on board with the new mission left. One former employee said he resigned after feeling like his work on Instagram would no longer be of value to the company; another said they did not think Meta was best placed for creating the metaverse and was searching for a job at a competitor.
Meta also lured away dozens of employees from companies like Microsoft and Apple, two people with knowledge of the moves said. In particular, Meta hired from those companies’ divisions that worked on augmented reality products, like Microsoft’s Hololens and Apple’s secretive augmented reality glasses project.
Representatives for Microsoft and Apple declined to comment. Bloomberg and The Wall Street Journal previously reported on some of the personnel moves.
“More people getting into social audio is good for social audio,” Maya Watson, Clubhouse’s head of global marketing, said in an interview. “We’re not bothered by it, and, if anything, it makes us feel confident in where we’re going.”
At the start of the year, Clubhouse was booming. In February, the app was downloaded 9.6 million times, Sensor Tower said. A spokeswoman for Clubhouse disputed the accuracy of Sensor Tower’s metrics, which estimate user behavior, but said the company would not provide internal figures.
The app caught the attention of audio creators like Brian McCullough, who hosts a podcast for the news aggregator Techmeme, called “Techmeme Ride Home.” “I remember having conversations that were the best social media has been in 10 years,” Mr. McCullough said of his early days on Clubhouse.
Through the app, he connected with Chris Messina, who leads West Coast business development for Republic, a platform that allows companies to raise capital and unaccredited investors to invest in start-ups. Mr. Messina made a habit of recording snippets of Mr. McCullough’s show and playing them in Clubhouse so he could respond to them, and the pair decided to start making the podcast together.
But in March, Clubhouse experienced a slump as downloads slipped to 2.7 million, and in April the app was downloaded just 917,000 times, Sensor Tower said.
At the same time, Twitter was aggressively expanding Spaces. It began testing the feature in October 2020 and granted access to a broader swath of users in the spring. At the time, the development of Spaces was the top consumer product priority at the company, said a person familiar with the company’s plans who was not permitted to speak publicly about them.
That work appeared to pay off. By May, Spaces had more than one million users, that person said. The Washington Post previously reported the figure.
SAN FRANCISCO — In 2019, Facebook researchers began a new study of one of the social network’s foundational features: the Like button.
They examined what people would do if Facebook removed the distinct thumbs-up icon and other emoji reactions from posts on its photo-sharing app Instagram, according to company documents. The buttons had sometimes caused Instagram’s youngest users “stress and anxiety,” the researchers found, especially if posts didn’t get enough Likes from friends.
But the researchers discovered that when the Like button was hidden, users interacted less with posts and ads. Hiding the button also did not alleviate teenagers’ social anxiety, and young users did not share more photos, as the company had thought they might, leaving the results a mixed bag.
Mark Zuckerberg, Facebook’s chief executive, and other managers discussed hiding the Like button for more Instagram users, according to the documents. In the end, a larger test was rolled out in just a limited capacity to “build a positive press narrative” around Instagram.
As Facebook has come under scrutiny over misinformation, privacy and hate speech, a central issue has been whether the basic way that the platform works has been at fault: essentially, the features that have made Facebook be Facebook.
Apart from the Like button, Facebook has scrutinized its share button, which lets users instantly spread content posted by other people; its groups feature, which is used to form digital communities; and other tools that define how more than 3.5 billion people behave and interact online. The research, laid out in thousands of pages of internal documents, underlines how the company has repeatedly grappled with what it has created.
What researchers found was often far from positive. Time and again, they determined that people misused key features or that those features amplified toxic content, among other effects. In an August 2019 internal memo, several researchers said it was Facebook’s “core product mechanics” — meaning the basics of how the product functioned — that had let misinformation and hate speech flourish on the site.
“The mechanics of our platform are not neutral,” they concluded.
Facebook has made some changes over the years, such as letting people hide posts they do not want to see and turning off political group recommendations to reduce the spread of misinformation.
But the core way that Facebook operates — a network where information can spread rapidly and where people can accumulate friends and followers and Likes — ultimately remains largely unchanged.
Many significant modifications to the social network were blocked in the service of growth and keeping users engaged, some current and former executives said. Facebook is valued at more than $900 billion.
“There’s a gap between the fact that you can have pretty open conversations inside of Facebook as an employee,” said Brian Boland, a Facebook vice president who left last year. “Actually getting change done can be much harder.”
The company documents are part of the Facebook Papers, a cache provided to the Securities and Exchange Commission and to Congress by a lawyer representing Frances Haugen, a former Facebook employee who has become a whistle-blower. Ms. Haugen earlier gave the documents to The Wall Street Journal. This month, a congressional staff member supplied the redacted disclosures to more than a dozen other news organizations, including The New York Times.
In a statement, Andy Stone, a Facebook spokesman, criticized articles based on the documents, saying that they were built on a “false premise.”
“Yes, we’re a business and we make profit, but the idea that we do so at the expense of people’s safety or well-being misunderstands where our own commercial interests lie,” he said. He said Facebook had invested $13 billion and hired more than 40,000 people to keep people safe, adding that the company has called “for updated regulations where democratic governments set industry standards to which we can all adhere.”
In a post this month, Mr. Zuckerberg said it was “deeply illogical” that the company would give priority to harmful content because Facebook’s advertisers don’t want to buy ads on a platform that spreads hate and misinformation.
“At the most basic level, I think most of us just don’t recognize the false picture of the company that is being painted,” he wrote.
The Foundations of Success
When Mr. Zuckerberg founded Facebook 17 years ago in his Harvard University dorm room, the site’s mission was to connect people on college campuses and bring them into digital groups with common interests and locations.
Growth exploded in 2006 when Facebook introduced the News Feed, a central stream of photos, videos and status updates posted by people’s friends. Over time, the company added more features to keep people interested in spending time on the platform.
In 2009, Facebook introduced the Like button. The tiny thumbs-up symbol, a simple indicator of people’s preferences, became one of the social network’s most important features. The company allowed other websites to adopt the Like button so users could share their interests back to their Facebook profiles.
That gave Facebook insight into people’s activities and sentiments outside of its own site, so it could better target them with advertising. Likes also signified what users wanted to see more of in their News Feeds so people would spend more time on Facebook.
Facebook also added the groups feature, where people join private communication channels to talk about specific interests, and pages, which allowed businesses and celebrities to amass large fan bases and broadcast messages to those followers.
Adam Mosseri, the head of Instagram, has said that research on users’ well-being led to investments in anti-bullying measures on Instagram.
Yet Facebook cannot simply tweak itself so that it becomes a healthier social network when so many problems trace back to core features, said Jane Lytvynenko, a senior fellow at the Harvard Kennedy Shorenstein Center, who studies social networks and misinformation.
“When we talk about the Like button, the share button, the News Feed and their power, we’re essentially talking about the infrastructure that the network is built on top of,” she said. “The crux of the problem here is the infrastructure itself.”
As Facebook’s researchers dug into how its products worked, the worrisome results piled up.
In a July 2019 study of groups, researchers traced how members in those communities could be targeted with misinformation. The starting point, the researchers said, were people known as “invite whales,” who sent invitations out to others to join a private group.
These people were effective at getting thousands to join new groups so that the communities ballooned almost overnight, the study said. Then the invite whales could spam the groups with posts promoting ethnic violence or other harmful content, according to the study.
Another 2019 report looked at how some people accrued large followings on their Facebook pages, often using posts about cute animals and other innocuous topics. But once a page had grown to tens of thousands of followers, the founders sold it. The buyers then used the pages to show followers misinformation or politically divisive content, according to the study.
As researchers studied the Like button, executives considered hiding the feature on Facebook as well, according to the documents. In September 2019, Facebook removed Likes from users’ posts in a small experiment in Australia.
The company wanted to see if the change would reduce pressure and social comparison among users. That, in turn, might encourage people to post more frequently to the network.
But people did not share more posts after the Like button was removed. Facebook chose not to roll the test out more broadly, noting, “Like counts are extremely low on the long list of problems we need to solve.”
Last year, company researchers also evaluated the share button. In a September 2020 study, a researcher wrote that the button and so-called reshare aggregation units in the News Feed, which are automatically generated clusters of posts that have already been shared by people’s friends, were “designed to attract attention and encourage engagement.”
But left unchecked, the features could “serve to amplify bad content and sources,” such as bullying and borderline nudity posts, the researcher said.
That’s because the features made people less hesitant to share posts, videos and messages with one another. In fact, users were three times more likely to share any kind of content from the reshare aggregation units, the researcher said.
One post that spread widely this way was an undated message from an account called “The Angry Patriot.” The post notified users that people protesting police brutality were “targeting a police station” in Portland, Ore. After it was shared through reshare aggregation units, hundreds of hate-filled comments flooded in. It was an example of “hate bait,” the researcher said.
A common thread in the documents was how Facebook employees argued for changes in how the social network worked and often blamed executives for standing in the way.
In an August 2020 internal post, a Facebook researcher criticized the recommendation system that suggests pages and groups for people to follow and said it can “very quickly lead users down the path to conspiracy theories and groups.”
“Out of fears over potential public and policy stakeholder responses, we are knowingly exposing users to risks of integrity harms,” the researcher wrote. “During the time that we’ve hesitated, I’ve seen folks from my hometown go further and further down the rabbit hole” of conspiracy theory movements like QAnon and anti-vaccination and Covid-19 conspiracies.
The researcher added, “It has been painful to observe.”
Reporting was contributed by Davey Alba, Sheera Frenkel, Cecilia Kang and Ryan Mac.