“I don’t care about the economics at all,” Mr. Musk said in April after sealing the deal.

He cared a little more when the subsequent plunge in the stock market meant that he was overpaying by a significant amount. Analysts estimated that Twitter was worth not $44 billion but $30 billion, or maybe even less. For a few months, Mr. Musk tried to get out of the deal.

This had the paradoxical effect of bringing the transaction down to earth for spectators. Who among us has not failed to do due diligence on a new venture — a job, a house, even a relationship — and then realized that it was going to cost so much more than we had thought? Mr. Musk’s buying Twitter, and then his refusal to buy Twitter, and then his being forced to buy Twitter after all — and everything playing out on Twitter — was weirdly relatable.

Inescapable, too. The apex, or perhaps the nadir, came this month when Mr. Musk introduced a perfume called Burnt Hair, described on its website as “the Essence of Repugnant Desire.”

“Please buy my perfume, so I can buy Twitter,” Mr. Musk tweeted on Oct. 12, garnering nearly 600,000 likes. This worked, apparently; the perfume is now marked “sold out” on its site. Did 30,000 people really pay $100 each for a bottle? Will this perfume actually be produced and sold? (It’s not supposed to be released until next year.) It’s hard to tell where the joke stops, which is perhaps the point.

“What was unique about Twitter was that no one actually controlled it,” said Richard Greenfield, a media analyst at LightShed Partners. “And now one person will own it in its entirety.”

He is relatively hopeful, however, that Mr. Musk will improve the site, somehow. That, in turn, will have its own consequences.

“If it turns into a massive home run,” Mr. Greenfield said, “you’ll see other billionaires try to do the same thing.”

Kanye West’s Antisemitic Posts Land Him in Trouble on Instagram and Twitter

Ye, the rapper formerly known as Kanye West, has set off one controversy after another in the last week, first at his fashion show and then on social media, prompting accusations of racism and antisemitism.

On Monday, at Paris Fashion Week, he debuted a T-shirt for his fashion line bearing the phrase “White Lives Matter.” On Friday, he suggested on Instagram that Sean Combs, the rapper known as Diddy, was being controlled by Jewish people. Ye’s account was restricted by Instagram that day.

Early on Sunday morning, he went on Twitter and lashed out against Jewish people in a series of tweets.

Ye tweeted that he would soon go “death con 3 On JEWISH PEOPLE,” an apparent reference to the United States’ defense readiness condition, known as DEFCON.

In a separate tweet, Ye accused Mark Zuckerberg, the chief executive of Meta, which owns Instagram, of removing him from the platform.

“Who you think created cancel culture?” he added in another tweet.

In a statement, a spokeswoman for Twitter said Ye’s account was locked for violating Twitter’s policies. A spokeswoman for Meta said it places restrictions on accounts that repeatedly break its rules.

Representatives for Ye could not immediately be reached.

The restrictions on Twitter and Instagram mean that Ye’s account is still active, but that the rapper cannot post for an undisclosed period.

Ye had returned to Twitter on Saturday after not posting for nearly two years.

The posts were yet another test of social media companies’ willingness to monitor content that is perceived as hateful.

The Anti-Defamation League has called “White Lives Matter” a hateful phrase used by white supremacists.

At first, Ye appeared to relish the T-shirt controversy, writing on Instagram that “my one t-shirt took allllll the attention.”

But outrage continued to build online, including from artists such as Mr. Combs, who criticized the design in a video on Instagram.

“Don’t wear the shirt. Don’t buy the shirt. Don’t play with the shirt,” Mr. Combs said. “It’s not a joke.”

On Thursday, Adidas said it would put its partnership with Yeezy “under review.” (Ye ended his partnership with Gap last month.)

On Friday, Ye posted screenshots from a text message exchange with Mr. Combs to his Instagram account, where he suggested that Mr. Combs was being controlled by Jewish people. The comments were called antisemitic by several Jewish groups.

Elon Musk, who has agreed to buy the social media company for $44 billion and could loosen its content moderation policies, replied to the tweet.

“Welcome back to Twitter, my friend!” Mr. Musk wrote.

Facebook Parent Meta Posts First Revenue Decline In History

By Associated Press
July 27, 2022

The company’s profit is down 36% from $10.39 billion, or $3.61 per share, in the same period a year ago.

Facebook and Instagram’s parent company Meta posted its first revenue decline in history Wednesday, dragged down by a drop in ad spending as the economy falters — and as competition from rival TikTok intensifies.

The company’s stock dropped slightly in after-hours trading following the results, suggesting Wall Street was largely expecting the weak earnings report.

The results also largely followed a broader decline in the digital advertising market that is dinging rivals such as Alphabet and Snap. Google’s parent company reported its slowest quarterly growth in two years on Tuesday.

Meta also faces some unique challenges, including the looming departure of its chief operating officer Sheryl Sandberg, the chief architect of the company’s massive advertising business.

In addition to TikTok, the decline in ad spending amid the downturn and Apple’s privacy changes, “questions about Meta’s leadership” — including Sandberg’s exit and negative sentiment about the company as a whole — also contributed to the decline, said Raj Shah, a managing partner at digital consultancy Publicis Sapient.

Meta earned profits of $6.69 billion, or $2.46 per share, in the April-June period. That’s down 36% from $10.39 billion, or $3.61 per share, in the same period a year ago.

Revenue was $28.82 billion, down 1% from $29.08 billion a year earlier.

Analysts, on average, were expecting earnings of $2.54 per share on revenue of $28.91 billion, according to a poll by FactSet.

“The year-over-year drop in quarterly revenue signifies just how quickly Meta’s business has deteriorated,” said Insider Intelligence analyst Debra Aho Williamson in an email. “Prior to these results, we had forecasted that Meta’s worldwide ad revenue would increase 12.4% this year, to nearly $130 billion. Now, it’s unlikely to reach that figure.”

She added that the good news — if it could be called that — is that Meta’s competitors are also experiencing slowdowns.

Meta is in the midst of a corporate transformation that will take years to complete. It wants to evolve from social media to the “metaverse” — a risky bet that’s still in its nascent stage. The metaverse is sort of the internet brought to life, or at least rendered in 3D. CEO Mark Zuckerberg has described it as a “virtual environment” in which you can immerse yourself instead of just staring at a screen. The company is investing billions in metaverse plans that will likely take years to pay off — and as part of that plan, it renamed itself Meta last fall.

“Expect Meta’s decline to continue until Meta can monetize the metaverse, and begin another Meta-reverse,” Shah said.

Meta forecast revenue of $26 billion to $28.5 billion for the current quarter, below Wall Street’s expectations.

“This outlook reflects a continuation of the weak advertising demand environment we experienced throughout the second quarter, which we believe is being driven by broader macroeconomic uncertainty,” finance chief David Wehner said in a statement. Meta said Wehner is being promoted to chief strategy officer, where he will oversee the company’s strategy and corporate development. Susan Li, currently vice president of finance, will replace him as CFO.

Shares of Meta Platforms Inc. fell 58 cents to $169 in after-hours trading.

Additional reporting by The Associated Press.

Source: newsy.com

How Mark Zuckerberg Is Leading Meta Into Its Next Phase

SAN FRANCISCO — Mark Zuckerberg, the founder and chief executive of the company formerly known as Facebook, called his top lieutenants for the social network to a last-minute meeting in the San Francisco Bay Area this month. On the agenda: a “work-athon” to discuss the road map for improving the main Facebook app, including a revamp that would change how users browse the service.

For weeks beforehand, Mr. Zuckerberg had sent his executives messages about the overhaul, pressing them to increase the velocity and execution of their work, people with knowledge of the matter said. Some executives — who had to read a 122-page slide deck about the changes — were beginning to sweat at the unusual level of intensity, they said.

Facebook’s leaders flew in from around the world for the summit, the people said, and Mr. Zuckerberg and the group pored over each slide. Within days, the team unveiled an update to the Facebook app to better compete with a top rival, TikTok.

Mr. Zuckerberg has trimmed perks, reshuffled his leadership team and made it clear he would cut low-performing employees. Those who are not on board are welcome to leave, he has said. Managers have sent out memos to convey the seriousness of the approach — one, which was shared with The New York Times, had the title “Operating With Increased Intensity.”

The urgency stems in part from Mr. Zuckerberg’s bet on the so-called metaverse. Across Silicon Valley, he and other executives who built what many refer to as Web 2.0 — a more social, app-focused version of the internet — are rethinking and upending their original vision after their platforms were plagued by privacy stumbles, toxic content and misinformation.

The moment is reminiscent of other bet-the-company gambles, such as when Netflix killed off its DVD-mailing business last decade to focus on streaming. But Mr. Zuckerberg is making these moves as Meta’s back is against the wall. The company is staring down the barrel of a global recession. Competitors like TikTok, YouTube and Apple are bearing down.

And success is far from guaranteed. In recent months, Meta’s profits have fallen and revenue has slowed as the company has spent lavishly on the metaverse and as the economic slowdown has hurt its advertising business. Its stock has plunged.

“When Mark gets super focused on something, it becomes all hands on deck within the company,” said Katie Harbath, a former Facebook policy director and the founder of Anchor Change, a consulting firm that works on tech and democracy issues. “Teams will quickly drop other work to pivot to the issue at hand, and the pressure is intense to move fast to show progress.”

Mr. Zuckerberg elevated Andrew Bosworth, who is known as Boz, to chief technology officer, leading hardware efforts for the metaverse. He promoted other loyalists, too, including Javier Olivan, the new chief operating officer; Nick Clegg, who became president of global affairs; and Guy Rosen, who took on a new role of chief information security officer.

In June, Sheryl Sandberg, who was Mr. Zuckerberg’s No. 2 for 14 years, said she would step down this fall. While she spent more than a decade building Facebook’s advertising systems, she was less interested in doing the same for the metaverse, people familiar with her plans have said.

Mr. Zuckerberg has moved thousands of workers into different teams for the metaverse, training their focus on aspirational projects like hardware glasses, wearables and a new operating system for those devices.

“It’s an existential bet on where people over the next decade will connect, express and identify with one another,” said Matthew Ball, a longtime tech executive and the author of a book on the metaverse. “If you have the cash, the engineers, the users and the conviction to take a swing at that, then you should.”

But the efforts are far from cheap. Facebook’s Reality Labs division, which is building augmented and virtual reality products, has dragged down the company’s balance sheet; the hardware unit lost nearly $3 billion in the first quarter alone.

Meta’s advertising business is also contending with privacy changes from Apple that have hampered its ability to measure the effectiveness of ads on iPhones. TikTok, the Chinese-owned video app, has stolen young audiences from Meta’s core apps like Instagram and Facebook. These challenges are coinciding with a brutal macroeconomic environment, which has pushed Apple, Google, Microsoft and Twitter to freeze or slow hiring.

In a memo last month, Chris Cox, Meta’s chief product officer, said the economic environment called for “leaner, meaner, better executing teams.”

In an employee meeting around the same time, Mr. Zuckerberg said he knew that not everyone would be on board for the changes. That was fine, he told employees.

“I think some of you might decide that this place isn’t for you, and that self-selection is OK with me,” Mr. Zuckerberg said. “Realistically, there are probably a bunch of people at the company who shouldn’t be here.”

Another memo circulated internally among workers this month was titled “Operating With Increased Intensity.” In the memo, a Meta vice president said managers should begin to “think about every person on their team and the value they are adding.”

“If a direct report is coasting or a low performer, they are not who we need; they are failing this company,” the memo said. “As a manager, you cannot allow someone to be net neutral or negative for Meta.”

Mr. Zuckerberg has also laid out “investment priorities” for the company in the second half of this year.

Meta has also paused or scrapped some hardware projects, including a smart watch and other prototypes. Bloomberg reported earlier on the smart watch.

Mr. Zuckerberg posted an update to his Facebook profile, noting some coming changes in the app. Facebook would start pushing people into a more video-heavy feed with more suggested content, emulating how TikTok operates.

Meta has been investing heavily in video and discovery, aiming to beef up its artificial intelligence and to improve “discovery algorithms” that suggest engaging content to users without them having to work to find it.

In the past, Facebook has tested major product updates with a few English-speaking audiences to see how they perform before rolling them out more widely. But, this time, the 2.93 billion people around the world who use the social networking app will receive the update simultaneously.

It is a sign, some Meta employees said, of just how much Mr. Zuckerberg means business.

As Midterms Loom, Mark Zuckerberg Shifts Focus Away From Elections

Mark Zuckerberg, Facebook’s chief executive, made securing the 2020 U.S. election a top priority. He met regularly with an election team, which included more than 300 people from across his company, to prevent misinformation from spreading on the social network. He asked civil rights leaders for advice on upholding voter rights.

The core election team at Facebook, which was renamed Meta last year, has since been dispersed. Roughly 60 people are now focused primarily on elections, while others split their time on other projects. They meet with another executive, not Mr. Zuckerberg. And the chief executive has not talked recently with civil rights groups, even as some have asked him to pay more attention to the midterm elections in November.

Safeguarding elections is no longer Mr. Zuckerberg’s top concern, said four Meta employees with knowledge of the situation. Instead, he is focused on transforming his company into a provider of the immersive world of the metaverse, which he sees as the next frontier of growth, said the people, who were not authorized to speak publicly.

Congressional hearings on the Jan. 6 Capitol riot have underlined how precarious elections can be. And dozens of political candidates are running this November on the false premise that former President Donald J. Trump was robbed of the 2020 election, with social media platforms continuing to be a key way to reach American voters.

“2000 Mules,” a film that falsely claims the 2020 election was stolen from Mr. Trump, was widely shared on Facebook and Instagram, garnering more than 430,000 interactions, according to an analysis by The New York Times. In posts about the film, commenters said they expected election fraud this year and warned against using mail-in voting and electronic voting machines.

Some at Meta are also apprehensive about Twitter’s pending $44 billion sale to Elon Musk, three employees with knowledge of the situation said. Mr. Musk has suggested that he wants fewer rules about what can and cannot be posted on the service.

Meta, which barred Mr. Trump from its platforms after the riot at the U.S. Capitol on Jan. 6, 2021, has worked over the years to limit political falsehoods on its sites. Tom Reynolds, a Meta spokesman, said the company had “taken a comprehensive approach to how elections play out on our platforms since before the U.S. 2020 elections and through the dozens of global elections since then.”

Brazil will hold a presidential election in October, and its president, Jair Bolsonaro, has recently raised doubts about the country’s electoral process. Latvia, Bosnia and Slovenia are also holding elections in October.

“People in the U.S. are almost certainly getting the Rolls-Royce treatment when it comes to any integrity on any platform, especially for U.S. elections,” said Sahar Massachi, the executive director of the think tank Integrity Institute and a former Facebook employee. “And so however bad it is here, think about how much worse it is everywhere else.”

Facebook’s role in potentially distorting elections became evident after 2016, when Russian operatives used the site to spread inflammatory content and divide American voters in the U.S. presidential election. In 2018, Mr. Zuckerberg testified before Congress that election security was his top priority.

Facebook tightened its rules in other ways, banning QAnon conspiracy theory posts and groups in October 2020.

Around the same time, Mr. Zuckerberg and his wife, Priscilla Chan, donated $400 million to local governments to fund poll workers, pay for rental fees for polling places, provide personal protective equipment and cover other administrative costs.

The week before the November 2020 election, Meta also froze all political advertising to limit the spread of falsehoods.

But while there were successes — the company kept foreign election interference off the platform — it struggled with how to handle Mr. Trump, who used his Facebook account to amplify false claims of voter fraud. After the Jan. 6 riot, Facebook barred Mr. Trump from posting. He is eligible for reinstatement in January.

Frances Haugen, a Facebook employee turned whistle-blower, filed complaints with the Securities and Exchange Commission accusing the company of removing election safety features too soon after the 2020 election. Facebook made growth and engagement its priorities over security, she said.

The metaverse is a fully realized digital world that exists beyond the one in which we live. The term was coined by Neal Stephenson in his 1992 novel “Snow Crash,” and the concept was further explored by Ernest Cline in his novel “Ready Player One.”

Mr. Zuckerberg no longer meets weekly with those focused on election security, said the four employees, though he receives their reports. Instead, they meet with Nick Clegg, Meta’s president of global affairs.

Several civil rights groups said they had noticed Meta’s shift in priorities. Mr. Zuckerberg isn’t involved in discussions with them as he once was, nor are other top Meta executives, they said.

“I’m concerned,” said Derrick Johnson, president of the National Association for the Advancement of Colored People, who talked with Mr. Zuckerberg and Sheryl Sandberg, Meta’s chief operating officer, ahead of the 2020 election. “It appears to be out of sight, out of mind.” (Ms. Sandberg has announced that she will leave Meta this fall.)

A coalition of civil rights and voting rights groups recently wrote a letter to Mr. Zuckerberg and the chief executives of YouTube, Twitter, Snap and other platforms. They called on the companies to take down posts promoting the lie that Mr. Trump won the 2020 election and to slow the spread of election misinformation before the midterms.

Yosef Getachew, a director at the nonprofit public advocacy organization Common Cause, whose group studied 2020 election misinformation on social media, said the companies had not responded.

“The Big Lie is front and center in the midterms with so many candidates using it to pre-emptively declare that the 2022 election will be stolen,” he said, pointing to recent tweets from politicians in Michigan and Arizona who falsely said dead people cast votes for Democrats. “Now is not the time to stop enforcing against the Big Lie.”

The Game Awards Returns With Glitz and an Industry Asserting Its Muscle

LOS ANGELES — Wearing blazers and bedazzled dresses, downing cocktails, swapping industry gossip, and hobnobbing with some of Hollywood’s biggest names, the stars of America’s video game industry assembled on Thursday night for a long-delayed reunion at the Game Awards.

The lavish event was a victory lap of sorts for the video game community. While the movie industry has fretted over ticket sales and cannibalization by streaming services like Netflix, the video game industry has enjoyed tremendous growth during the pandemic. An estimated 2.9 billion people — more than one out of every three people on the planet — have played a video game this year, according to the video game analytics firm Newzoo.

Thursday’s awards were also a welcome opportunity for the industry to gather under the same roof, since last year’s event was held online because of the pandemic. Gaming luminaries arrived on the red carpet at the vast Microsoft Theater in downtown Los Angeles, joined by celebrities better known for their work in other entertainment industries.

Sting, the rock music icon, was backed by an orchestra as he opened the show with a performance of the haunting song “What Could Have Been” from the Netflix series “Arcane,” which is based on the video game hit “League of Legends.” The hit band Imagine Dragons performed “Enemy,” another song featured in “Arcane.”

If you guessed that the ceremony otherwise resembled other entertainment award events, you would also have been on the nose.

At the center of the gaming industry’s answer to the Oscars was Geoff Keighley, the video game and television personality who created and hosts the annual event and who tried, with seemingly endless reserves of energy and enthusiasm, to steer an increasingly antsy audience through more than three hours of awards presentations and trailers for upcoming games, interspersed with music from the orchestra.

The show began in 2014 and has attracted millions more eyeballs each year on YouTube and Twitch. Last year’s fully remote version garnered 83 million live streams, according to organizers, and Mr. Keighley said after Thursday’s show that he expected more people to have watched live this year, though preliminary numbers were not yet available.

“We’ve been in quarantine for so long, but it’s really nice to actually get to hang out with everyone again and see each other after two years,” a streamer known as BNans said in the crowded lobby after the show.

More than two dozen awards were handed out in categories like best action game and best art direction. The most prestigious title, game of the year, went to “It Takes Two,” a two-player puzzle adventure game developed by Hazelight Studios about a married couple navigating a divorce and journeying through a fantastical world.

Microsoft’s gaming division brought home a number of awards, with “Age of Empires IV” winning best strategy game, “Halo Infinite” winning a fan award called players’ voice, and “Forza Horizon 5,” a car-racing game, taking home three honors. “Deathloop,” a first-person shooting game developed by Arkane Studios, also won multiple awards.

The winners were determined by a vote of industry insiders and the general public.

For many watching, though, the awards were just a sideshow. The Game Awards is also used by the industry to introduce new game announcements and debut trailers for upcoming titles. If audience reaction is any indication, the fantasy game “Elden Ring” continues to be one of next year’s most hotly anticipated titles.

This year, the game platform Roblox debuted on the stock exchange, topping a $45 billion valuation on its first day of trading.

The increased mainstream interest in online worlds has also been a validation for industry insiders and gamers who were using the term “metaverse” years before Mark Zuckerberg decided that Facebook was going to change its name to Meta. Even Mr. Carrey, appearing at the awards show in a prerecorded video, joked about it.

“I’m sorry I couldn’t be there with you, but I look forward to meeting all of your avatars in the metaverse, where we can really get to know each other,” he said.

As the industry has grown, it has faced increasing challenges, none more pressing on Thursday night than the treatment of its employees. A shadow was cast over the event by the scandal trailing Activision Blizzard — the game publisher that has been under fire for months following a lawsuit from California accusing it of fostering a workplace environment in which mistreatment and harassment of women were commonplace.

A handful of protesters stood with signs supporting Activision employees outside the theater Thursday evening, and Mr. Keighley faced pressure in the lead-up to the event to condemn the company.

He tweeted last week that Activision would not be a part of the awards show, and he opened the event by saying that “game creators need to be supported by the companies that employ them.”

“We should not, and will not, tolerate any abuse, harassment and predatory practices,” Mr. Keighley said, though he did not mention Activision by name. Rob Kostich, the president of Activision, is on the board of advisers for the Game Awards.

Before the event, Mr. Keighley said in an interview that he wanted to strike a balance between using his platform for good and maintaining the upbeat vibe of an awards show.

“Are we going to use our platform to take companies to task publicly inside the show? It’s always something worth thinking about,” he said, “but it’s not a referendum on the industry.”

Facebook Debates What to Do With Its Like and Share Buttons

SAN FRANCISCO — In 2019, Facebook researchers began a new study of one of the social network’s foundational features: the Like button.

They examined what people would do if Facebook removed the distinct thumbs-up icon and other emoji reactions from posts on its photo-sharing app Instagram, according to company documents. The buttons had sometimes caused Instagram’s youngest users “stress and anxiety,” the researchers found, especially if posts didn’t get enough Likes from friends.

But the researchers discovered that when the Like button was hidden, users interacted less with posts and ads. At the same time, it did not alleviate teenagers’ social anxiety and young users did not share more photos, as the company thought they might, leading to a mixed bag of results.

Mark Zuckerberg, Facebook’s chief executive, and other managers discussed hiding the Like button for more Instagram users, according to the documents. In the end, a larger test was rolled out in just a limited capacity to “build a positive press narrative” around Instagram.

As Facebook has come under fire over misinformation, privacy and hate speech, a central issue has been whether the basic way that the platform works has been at fault — essentially, the features that have made Facebook be Facebook.

Apart from the Like button, Facebook has scrutinized its share button, which lets users instantly spread content posted by other people; its groups feature, which is used to form digital communities; and other tools that define how more than 3.5 billion people behave and interact online. The research, laid out in thousands of pages of internal documents, underlines how the company has repeatedly grappled with what it has created.

What researchers found was often far from positive. Time and again, they determined that people misused key features or that those features amplified toxic content, among other effects. In an August 2019 internal memo, several researchers said it was Facebook’s “core product mechanics” — meaning the basics of how the product functioned — that had let misinformation and hate speech flourish on the site.

“The mechanics of our platform are not neutral,” they concluded.

Facebook has made some changes in response, letting people hide posts they do not want to see and turning off political group recommendations to reduce the spread of misinformation.

But the core way that Facebook operates — a network where information can spread rapidly and where people can accumulate friends and followers and Likes — ultimately remains largely unchanged.

Many significant modifications to the social network were blocked in the service of growth and keeping users engaged, some current and former executives said. Facebook is valued at more than $900 billion.

“There’s a gap between the fact that you can have pretty open conversations inside of Facebook as an employee,” said Brian Boland, a Facebook vice president who left last year. “Actually getting change done can be much harder.”

The company documents are part of the Facebook Papers, a cache provided to the Securities and Exchange Commission and to Congress by a lawyer representing Frances Haugen, a former Facebook employee who has become a whistle-blower. Ms. Haugen earlier gave the documents to The Wall Street Journal. This month, a congressional staff member supplied the redacted disclosures to more than a dozen other news organizations, including The New York Times.

In a statement, Andy Stone, a Facebook spokesman, criticized articles based on the documents, saying that they were built on a “false premise.”

“Yes, we’re a business and we make profit, but the idea that we do so at the expense of people’s safety or well-being misunderstands where our own commercial interests lie,” he said. He said Facebook had invested $13 billion and hired more than 40,000 people to keep people safe, adding that the company has called “for updated regulations where democratic governments set industry standards to which we can all adhere.”

In a post this month, Mr. Zuckerberg said it was “deeply illogical” that the company would give priority to harmful content because Facebook’s advertisers don’t want to buy ads on a platform that spreads hate and misinformation.

“At the most basic level, I think most of us just don’t recognize the false picture of the company that is being painted,” he wrote.

When Mr. Zuckerberg founded Facebook 17 years ago in his Harvard University dorm room, the site’s mission was to connect people on college campuses and bring them into digital groups with common interests and locations.

Growth exploded in 2006 when Facebook introduced the News Feed, a central stream of photos, videos and status updates posted by people’s friends. Over time, the company added more features to keep people interested in spending time on the platform.

In 2009, Facebook introduced the Like button. The tiny thumbs-up symbol, a simple indicator of people’s preferences, became one of the social network’s most important features. The company allowed other websites to adopt the Like button so users could share their interests back to their Facebook profiles.

That gave Facebook insight into people’s activities and sentiments outside of its own site, so it could better target them with advertising. Likes also signified what users wanted to see more of in their News Feeds so people would spend more time on Facebook.

Facebook also added the groups feature, where people join private communication channels to talk about specific interests, and pages, which allowed businesses and celebrities to amass large fan bases and broadcast messages to those followers.

Adam Mosseri, the head of Instagram, has said that research on users’ well-being led to investments in anti-bullying measures on Instagram.

Yet Facebook cannot simply tweak itself so that it becomes a healthier social network when so many problems trace back to core features, said Jane Lytvynenko, a senior fellow at the Harvard Kennedy Shorenstein Center, who studies social networks and misinformation.

“When we talk about the Like button, the share button, the News Feed and their power, we’re essentially talking about the infrastructure that the network is built on top of,” she said. “The crux of the problem here is the infrastructure itself.”

As Facebook’s researchers dug into how its products worked, the worrisome results piled up.

In a July 2019 study of groups, researchers traced how members in those communities could be targeted with misinformation. The starting point, the researchers said, was people known as “invite whales,” who sent invitations out to others to join a private group.

These people were effective at getting thousands to join new groups so that the communities ballooned almost overnight, the study said. Then the invite whales could spam the groups with posts promoting ethnic violence or other harmful content, according to the study.

Another 2019 report looked at how some people accrued large followings on their Facebook pages, often using posts about cute animals and other innocuous topics. But once a page had grown to tens of thousands of followers, the founders sold it. The buyers then used the pages to show followers misinformation or politically divisive content, according to the study.

As researchers studied the Like button, executives considered hiding the feature on Facebook as well, according to the documents. In September 2019, it removed Likes from users’ Facebook posts in a small experiment in Australia.

The company wanted to see if the change would reduce pressure and social comparison among users. That, in turn, might encourage people to post more frequently to the network.

But people did not share more posts after the Like button was removed. Facebook chose not to roll the test out more broadly, noting, “Like counts are extremely low on the long list of problems we need to solve.”

Last year, company researchers also evaluated the share button. In a September 2020 study, a researcher wrote that the button and so-called reshare aggregation units in the News Feed, which are automatically generated clusters of posts that have already been shared by people’s friends, were “designed to attract attention and encourage engagement.”

But left unchecked, the features could “serve to amplify bad content and sources,” such as bullying and borderline nudity posts, the researcher said.

That’s because the features made people less hesitant to share posts, videos and messages with one another. In fact, users were three times more likely to share any kind of content from the reshare aggregation units, the researcher said.

One post that spread widely this way was an undated message from an account called “The Angry Patriot.” The post notified users that people protesting police brutality were “targeting a police station” in Portland, Ore. After it was shared through reshare aggregation units, hundreds of hate-filled comments flooded in. It was an example of “hate bait,” the researcher said.

A common thread in the documents was how Facebook employees argued for changes in how the social network worked and often blamed executives for standing in the way.

In an August 2020 internal post, a Facebook researcher criticized the recommendation system that suggests pages and groups for people to follow and said it can “very quickly lead users down the path to conspiracy theories and groups.”

“Out of fears over potential public and policy stakeholder responses, we are knowingly exposing users to risks of integrity harms,” the researcher wrote. “During the time that we’ve hesitated, I’ve seen folks from my hometown go further and further down the rabbit hole” of conspiracy theory movements like QAnon and anti-vaccination and Covid-19 conspiracies.

The researcher added, “It has been painful to observe.”

Reporting was contributed by Davey Alba, Sheera Frenkel, Cecilia Kang and Ryan Mac.

In India, Facebook Struggles to Combat Misinformation and Hate Speech

On Feb. 4, 2019, a Facebook researcher created a new user account to see what it was like to experience the social media site as a person living in Kerala, India.

For the next three weeks, the account operated by a simple rule: Follow all the recommendations generated by Facebook’s algorithms to join groups, watch videos and explore new pages on the site.

The result was an inundation of hate speech, misinformation and celebrations of violence, which were documented in an internal Facebook report published later that month.

Internal documents show how bots and fake accounts tied to the country’s ruling party and opposition figures were wreaking havoc on national elections. They also detail how a plan championed by Mark Zuckerberg, Facebook’s chief executive, to focus on “meaningful social interactions,” or exchanges between friends and family, was leading to more misinformation in India, particularly during the pandemic.

In Myanmar, hate speech and misinformation on the platform preceded a violent coup in the country. Facebook said that after the coup, it implemented a special policy to remove praise and support of violence in the country, and later banned the Myanmar military from Facebook and Instagram.

In Sri Lanka, people were able to automatically add hundreds of thousands of users to Facebook groups, exposing them to violence-inducing and hateful content. In Ethiopia, a nationalist youth militia group successfully coordinated calls for violence on Facebook and posted other inflammatory content.

Facebook has invested significantly in technology to find hate speech in various languages, including Hindi and Bengali, two of the most widely used languages, Mr. Stone said. He added that Facebook reduced the amount of hate speech that people see globally by half this year.

On Feb. 14, a suicide bombing in the disputed border region of Kashmir set off a round of violence and a spike in accusations, misinformation and conspiracy theories between Indian and Pakistani nationals.

After the attack, anti-Pakistan content began to circulate in the Facebook-recommended groups that the researcher had joined. Many of the groups, she noted, had tens of thousands of users. A different report by Facebook, published in December 2019, found Indian Facebook users tended to join large groups, with the country’s median group size at 140,000 members.

Graphic posts, including a meme showing the beheading of a Pakistani national and dead bodies wrapped in white sheets on the ground, circulated in the groups she joined.

After the researcher shared her case study with co-workers, her colleagues commented on the posted report that they were concerned about misinformation about the upcoming elections in India.

Two months later, after India’s national elections had begun, Facebook put in place a series of steps to stem the flow of misinformation and hate speech in the country, according to an internal document called Indian Election Case Study.

The case study painted an optimistic picture of Facebook’s efforts, including adding more fact-checking partners — the third-party network of outlets with which Facebook works to outsource fact-checking — and increasing the amount of misinformation it removed. It also noted how Facebook had created a “political white list to limit P.R. risk,” essentially a list of politicians who received a special exemption from fact-checking.

The study did not note the immense problem the company faced with bots in India, nor issues like voter suppression. During the election, Facebook saw a spike in bots — or fake accounts — linked to various political groups, as well as efforts to spread misinformation that could have affected people’s understanding of the voting process.

In a separate report produced after the elections, Facebook found that over 40 percent of top views, or impressions, in the Indian state of West Bengal were “fake/inauthentic.” One inauthentic account had amassed more than 30 million impressions.

A report published in March 2021 showed that many of the problems cited during the 2019 elections persisted.

In the internal document, called Adversarial Harmful Networks: India Case Study, Facebook researchers wrote that there were groups and pages “replete with inflammatory and misleading anti-Muslim content” on Facebook.

The report said there were a number of dehumanizing posts comparing Muslims to “pigs” and “dogs,” and misinformation claiming that the Quran, the holy book of Islam, calls for men to rape their female family members.

Much of the material circulated around Facebook groups promoting Rashtriya Swayamsevak Sangh, an Indian right-wing and nationalist group with close ties to India’s ruling Bharatiya Janata Party, or B.J.P. The groups took issue with an expanding Muslim minority population in West Bengal and near the Pakistani border, and published posts on Facebook calling for the ouster of Muslim populations from India and promoting a Muslim population control law.

Facebook knew that such harmful posts proliferated on its platform, the report indicated, and it needed to improve its “classifiers,” which are automated systems that can detect and remove posts containing violent and inciting language. Facebook also hesitated to designate R.S.S. as a dangerous organization because of “political sensitivities” that could affect the social network’s operation in the country.

Of India’s 22 officially recognized languages, Facebook said it has trained its A.I. systems on five. (It said it had human reviewers for some others.) But in Hindi and Bengali, it still did not have enough data to adequately police the content, and much of the content targeting Muslims “is never flagged or actioned,” the Facebook report said.

Five months ago, Facebook was still struggling to efficiently remove hate speech against Muslims. Another company report detailed efforts by Bajrang Dal, an extremist group linked with the B.J.P., to publish posts containing anti-Muslim narratives on the platform.

Facebook is considering designating the group as a dangerous organization because it is “inciting religious violence” on the platform, the document showed. But it has not yet done so.

“Join the group and help to run the group; increase the number of members of the group, friends,” said one post seeking recruits on Facebook to spread Bajrang Dal’s messages. “Fight for truth and justice until the unjust are destroyed.”

Ryan Mac, Cecilia Kang and Mike Isaac contributed reporting.

Whistle-Blower Says Facebook ‘Chooses Profits Over Safety’

John Tye, the founder of Whistleblower Aid, a legal nonprofit that represents people seeking to expose potential lawbreaking, was contacted this spring through a mutual connection by a woman who claimed to have worked at Facebook.

The woman told Mr. Tye and his team something intriguing: She had access to tens of thousands of pages of internal documents from the world’s largest social network. In a series of calls, she asked for legal protection and a path to releasing the confidential information. Mr. Tye, who said he understood the gravity of what the woman brought “within a few minutes,” agreed to represent her and call her by the alias “Sean.”

She “is a very courageous person and is taking a personal risk to hold a trillion-dollar company accountable,” he said.

On Sunday, Frances Haugen revealed herself to be “Sean,” the whistle-blower against Facebook. A product manager who worked for nearly two years on the civic misinformation team at the social network before leaving in May, Ms. Haugen has used the documents she amassed to expose how much Facebook knew about the harms that it was causing and provided the evidence to lawmakers, regulators and the news media.

The revelations — including that Facebook knew Instagram was worsening body image issues among teenagers and that it had a two-tier justice system — have spurred criticism from lawmakers, regulators and the public.

Ms. Haugen has also filed a whistle-blower complaint with the Securities and Exchange Commission, accusing Facebook of misleading investors with public statements that did not match its internal actions. And she has talked with lawmakers such as Senator Richard Blumenthal, a Democrat of Connecticut, and Senator Marsha Blackburn, a Republican of Tennessee, and shared subsets of the documents with them.

The spotlight on Ms. Haugen is set to grow brighter. On Tuesday, she is scheduled to testify in Congress about Facebook’s impact on young users.

Ms. Haugen’s disclosures follow years of leaks from Facebook employees concerned about the company’s handling of misinformation and hate speech.

In 2018, Christopher Wylie, a disgruntled former employee of the consulting firm Cambridge Analytica, set the stage for those leaks. Mr. Wylie spoke with The New York Times, The Observer of London and The Guardian to reveal that Cambridge Analytica had improperly harvested Facebook data to build voter profiles without users’ consent.

In the aftermath, more of Facebook’s own employees started speaking up. Later that same year, Facebook workers provided executive memos and planning documents to news outlets including The Times and BuzzFeed News. In mid-2020, employees who disagreed with Facebook’s decision to leave up a controversial post from President Donald J. Trump staged a virtual walkout and sent more internal information to news outlets.

“I think over the last year, there’ve been more leaks than I think all of us would have wanted,” Mark Zuckerberg, Facebook’s chief executive, said in a meeting with employees in June 2020.

Facebook tried to preemptively push back against Ms. Haugen. On Friday, Nick Clegg, Facebook’s vice president for policy and global affairs, sent employees a 1,500-word memo laying out what the whistle-blower was likely to say on “60 Minutes” and calling the accusations “misleading.” On Sunday, Mr. Clegg appeared on CNN to defend the company, saying the platform reflected “the good, the bad and ugly of humanity” and that it was trying to “mitigate the bad, reduce it and amplify the good.”

Ms. Haugen’s emergence was accompanied by the launch of a personal website. On the website, Ms. Haugen was described as “an advocate for public oversight of social media.”

A native of Iowa City, Iowa, Ms. Haugen studied electrical and computer engineering at Olin College and got an M.B.A. from Harvard, the website said. She then worked on algorithms at Google, Pinterest and Yelp. In June 2019, she joined Facebook. There, she handled democracy and misinformation issues, as well as working on counterespionage, according to the website.

The Federal Trade Commission has filed an antitrust suit against Facebook. In a video posted by Whistleblower Aid on Sunday, Ms. Haugen said she did not believe breaking up Facebook would solve the problems inherent at the company.

“The path forward is about transparency and governance,” she said in the video. “It’s not about breaking up Facebook.”

Ms. Haugen has also spoken to lawmakers in France and Britain, as well as a member of the European Parliament. This month, she is scheduled to appear before a British parliamentary committee. That will be followed by stops at Web Summit, a technology conference in Lisbon, and in Brussels to meet with European policymakers in November, Mr. Tye said.

On Sunday, a GoFundMe page that Whistleblower Aid created for Ms. Haugen also went live. Noting that Facebook had “limitless resources and an army of lawyers,” the group set a goal of raising $10,000. Within 30 minutes, 18 donors had given $1,195. Shortly afterward, the fund-raising goal was increased to $50,000.

Facebook Said to Consider Forming an Election Commission

Facebook has approached academics and policy experts about forming a commission to advise it on global election-related matters, said five people with knowledge of the discussions, a move that would allow the social network to shift some of its political decision-making to an advisory body.

The proposed commission could decide on matters such as the viability of political ads and what to do about election-related misinformation, said the people, who spoke on the condition of anonymity because the discussions were confidential. Facebook is expected to announce the commission this fall in preparation for the 2022 midterm elections, they said, though the effort is preliminary and could still fall apart.

Outsourcing election matters to a panel of experts could help Facebook sidestep criticism of bias by political groups, two of the people said. The company has been blasted in recent years by conservatives, who have accused Facebook of suppressing their voices, as well as by civil rights groups and Democrats for allowing political misinformation to fester and spread online. Mark Zuckerberg, Facebook’s chief executive, does not want to be seen as the sole decision maker on political content, two of the people said.

The idea builds on the precedent of Facebook’s Oversight Board, a collection of journalism, legal and policy experts who adjudicate whether the company was correct to remove certain posts from its platforms. Facebook has pushed some content decisions to the Oversight Board for review, allowing it to show that it does not make determinations on its own.

The Oversight Board’s members are presented as independent of Facebook, though the company pays them through a trust.

The Oversight Board’s highest-profile decision was reviewing Facebook’s suspension of former President Donald J. Trump after the Jan. 6 storming of the U.S. Capitol. At the time, Facebook opted to ban Mr. Trump’s account indefinitely, a penalty that the Oversight Board later deemed “not appropriate” because the time frame was not based on any of the company’s rules. The board asked Facebook to try again.

In June, Facebook responded by saying that it would bar Mr. Trump from the platform for at least two years. The Oversight Board has separately weighed in on more than a dozen other content cases that it calls “highly emblematic” of broader themes that Facebook grapples with regularly, including whether certain Covid-related posts should remain up on the network and hate speech issues in Myanmar.

A spokesman for the Oversight Board declined to comment.

Facebook has had a spotty track record on election-related issues, going back to Russian manipulation of the platform’s advertising and posts in the 2016 presidential election.

Before the 2020 election, Facebook said it would bar the purchase of new political ads the week before the election, then later decided to temporarily ban all U.S. political advertising after the polls closed on Election Day, causing an uproar among candidates and ad-buying firms.

The company has struggled with how to handle lies and hate speech around elections. During his last year in office, Mr. Trump used Facebook to suggest he would use state violence against protesters in Minneapolis ahead of the 2020 election, while casting doubt on the electoral process as votes were tallied in November. Facebook initially said that what political leaders posted was newsworthy and should not be touched, before later reversing course.

The social network has also faced difficulties in elections elsewhere, including the proliferation of targeted disinformation across its WhatsApp messaging service during the Brazilian presidential election in 2018. In 2019, Facebook removed hundreds of misleading pages and accounts associated with political parties in India ahead of the country’s national elections.

Facebook has tried various methods to stem the criticisms. It established a political ads library to increase transparency around buyers of those promotions. It also has set up war rooms to monitor elections for disinformation to prevent interference.

There are several elections in the coming year in countries such as Hungary, Germany, Brazil and the Philippines where Facebook’s actions will be closely scrutinized. Voter fraud misinformation has already begun spreading ahead of German elections in September. In the Philippines, Facebook has removed networks of fake accounts that support President Rodrigo Duterte, who used the social network to gain power in 2016.

“There is already this perception that Facebook, an American social media company, is going in and tilting elections of other countries through its platform,” said Nathaniel Persily, a law professor at Stanford University. “Whatever decisions Facebook makes have global implications.”

Internal conversations about an election commission began at least a few months ago, said three people with knowledge of the matter.

An election commission would differ from the Oversight Board in one key way, the people said. While the Oversight Board waits for Facebook to remove a post or an account and then reviews that action, the election commission would proactively provide guidance without the company having made an earlier call, they said.

Tatenda Musapatike, who previously worked on elections at Facebook and now runs a nonprofit voter registration organization, said that many have lost faith in the company’s abilities to work with political campaigns. But the election commission proposal was “a good step,” she said, because “they’re doing something and they’re not saying we alone can handle it.”
