Facebook Debates What to Do With Its Like and Share Buttons

SAN FRANCISCO — In 2019, Facebook researchers began a new study of one of the social network’s foundational features: the Like button.

They examined what people would do if Facebook removed the distinct thumbs-up icon and other emoji reactions from posts on its photo-sharing app Instagram, according to company documents. The buttons had sometimes caused Instagram’s youngest users “stress and anxiety,” the researchers found, especially if posts didn’t get enough Likes from friends.

But the researchers discovered that when the Like button was hidden, users interacted less with posts and ads. At the same time, hiding the button did not alleviate teenagers’ social anxiety, and young users did not share more photos as the company thought they might. The results were mixed.

Mark Zuckerberg, Facebook’s chief executive, and other managers discussed hiding the Like button for more Instagram users, according to the documents. In the end, a larger test was deployed in only a limited capacity to “build a positive press narrative” around Instagram.

As Facebook has confronted crisis after crisis over misinformation, privacy and hate speech, a central issue has been whether the basic way that the platform works has been at fault — essentially, the features that have made Facebook be Facebook.

Apart from the Like button, Facebook has scrutinized its share button, which lets users instantly spread content posted by other people; its groups feature, which is used to form digital communities; and other tools that define how more than 3.5 billion people behave and interact online. The research, laid out in thousands of pages of internal documents, underlines how the company has repeatedly grappled with what it has created.

What researchers found was often far from positive. Time and again, they determined that people misused key features or that those features amplified toxic content, among other effects. In an August 2019 internal memo, several researchers said it was Facebook’s “core product mechanics” — meaning the basics of how the product functioned — that had let misinformation and hate speech flourish on the site.

“The mechanics of our platform are not neutral,” they concluded.

Facebook has made some changes over the years, such as letting people hide posts they do not want to see and turning off political group recommendations to reduce the spread of misinformation.

But the core way that Facebook operates — a network where information can spread rapidly and where people can accumulate friends and followers and Likes — ultimately remains largely unchanged.

Many significant modifications to the social network were blocked in the service of growth and keeping users engaged, some current and former executives said. Facebook is valued at more than $900 billion.

“There’s a gap between the fact that you can have pretty open conversations inside of Facebook as an employee,” said Brian Boland, a Facebook vice president who left last year. “Actually getting change done can be much harder.”

The company documents are part of the Facebook Papers, a cache provided to the Securities and Exchange Commission and to Congress by a lawyer representing Frances Haugen, a former Facebook employee who has become a whistle-blower. Ms. Haugen earlier gave the documents to The Wall Street Journal. This month, a congressional staff member supplied the redacted disclosures to more than a dozen other news organizations, including The New York Times.

In a statement, Andy Stone, a Facebook spokesman, criticized articles based on the documents, saying that they were built on a “false premise.”

“Yes, we’re a business and we make profit, but the idea that we do so at the expense of people’s safety or well-being misunderstands where our own commercial interests lie,” he said. He said Facebook had invested $13 billion and hired more than 40,000 people to keep people safe, adding that the company has called “for updated regulations where democratic governments set industry standards to which we can all adhere.”

In a post this month, Mr. Zuckerberg said it was “deeply illogical” that the company would give priority to harmful content because Facebook’s advertisers don’t want to buy ads on a platform that spreads hate and misinformation.

“At the most basic level, I think most of us just don’t recognize the false picture of the company that is being painted,” he wrote.

When Mr. Zuckerberg founded Facebook 17 years ago in his Harvard University dorm room, the site’s mission was to connect people on college campuses and bring them into digital groups with common interests and locations.

Growth exploded in 2006 when Facebook introduced the News Feed, a central stream of photos, videos and status updates posted by people’s friends. Over time, the company added more features to keep people interested in spending time on the platform.

In 2009, Facebook introduced the Like button. The tiny thumbs-up symbol, a simple indicator of people’s preferences, became one of the social network’s most important features. The company allowed other websites to adopt the Like button so users could share their interests back to their Facebook profiles.

That gave Facebook insight into people’s activities and sentiments outside of its own site, so it could better target them with advertising. Likes also signified what users wanted to see more of in their News Feeds so people would spend more time on Facebook.

Facebook also added the groups feature, which let people join private communication channels to talk about specific interests, and pages, which allowed businesses and celebrities to amass large fan bases and broadcast messages to those followers.

Adam Mosseri, the head of Instagram, has said that research on users’ well-being led to investments in anti-bullying measures on Instagram.

Yet Facebook cannot simply tweak itself so that it becomes a healthier social network when so many problems trace back to core features, said Jane Lytvynenko, a senior fellow at the Harvard Kennedy Shorenstein Center, who studies social networks and misinformation.

“When we talk about the Like button, the share button, the News Feed and their power, we’re essentially talking about the infrastructure that the network is built on top of,” she said. “The crux of the problem here is the infrastructure itself.”

As Facebook’s researchers dug into how its products worked, the worrisome results piled up.

In a July 2019 study of groups, researchers traced how members in those communities could be targeted with misinformation. The starting point, the researchers said, was people known as “invite whales,” who sent out invitations to others to join a private group.

These people were effective at getting thousands to join new groups so that the communities ballooned almost overnight, the study said. Then the invite whales could spam the groups with posts promoting ethnic violence or other harmful content, according to the study.

Another 2019 report looked at how some people accrued large followings on their Facebook pages, often using posts about cute animals and other innocuous topics. But once a page had grown to tens of thousands of followers, the founders sold it. The buyers then used the pages to show followers misinformation or politically divisive content, according to the study.

As researchers studied the Like button, executives considered hiding the feature on Facebook as well, according to the documents. In September 2019, the company removed Likes from users’ Facebook posts in a small experiment in Australia.

The company wanted to see if the change would reduce pressure and social comparison among users. That, in turn, might encourage people to post more frequently to the network.

But people did not share more posts after the Like button was removed. Facebook chose not to roll the test out more broadly, noting, “Like counts are extremely low on the long list of problems we need to solve.”

Last year, company researchers also evaluated the share button. In a September 2020 study, a researcher wrote that the button and so-called reshare aggregation units in the News Feed, which are automatically generated clusters of posts that have already been shared by people’s friends, were “designed to attract attention and encourage engagement.”

But left unchecked, the features could “serve to amplify bad content and sources,” such as bullying and borderline nudity posts, the researcher said.

That’s because the features made people less hesitant to share posts, videos and messages with one another. In fact, users were three times more likely to share any kind of content from the reshare aggregation units, the researcher said.

One post that spread widely this way was an undated message from an account called “The Angry Patriot.” The post notified users that people protesting police brutality were “targeting a police station” in Portland, Ore. After it was shared through reshare aggregation units, hundreds of hate-filled comments flooded in. It was an example of “hate bait,” the researcher said.

A common thread in the documents was how Facebook employees argued for changes in how the social network worked and often blamed executives for standing in the way.

In an August 2020 internal post, a Facebook researcher criticized the recommendation system that suggests pages and groups for people to follow and said it can “very quickly lead users down the path to conspiracy theories and groups.”

“Out of fears over potential public and policy stakeholder responses, we are knowingly exposing users to risks of integrity harms,” the researcher wrote. “During the time that we’ve hesitated, I’ve seen folks from my hometown go further and further down the rabbit hole” of conspiracy theory movements like QAnon and anti-vaccination and Covid-19 conspiracies.

The researcher added, “It has been painful to observe.”

Reporting was contributed by Davey Alba, Sheera Frenkel, Cecilia Kang and Ryan Mac.

India and Israel Inflame Facebook’s Fights With Its Own Employees

SAN FRANCISCO — When India’s government ordered Facebook and other tech companies to take down posts critical of its handling of the coronavirus pandemic in April, the social network complied on some posts.

But once it did, its employees flocked to online chat rooms to ask why Facebook had helped Prime Minister Narendra Modi of India stifle dissent. In one internal post, which was reviewed by The New York Times, an employee with family in India accused Facebook of “being afraid” that Mr. Modi would ban the company from doing business in the country. “We can’t act or make decisions out of fear,” he wrote.

Weeks later, when clashes broke out in Israel between Israelis and Palestinians, Facebook removed posts from prominent Palestinian activists and briefly banned hashtags related to the violence. Facebook employees again took to the message boards to ask why their company now appeared to be censoring pro-Palestinian content.

“It just feels like, once again, we are erring on the side of a populist government and making decisions due to politics, not policies,” one worker wrote in an internal message that was reviewed by The Times.

For years, much of the internal dissent at Facebook centered on how the company handled inflammatory posts from former President Donald J. Trump. But since Mr. Trump left office in January, attention has shifted to Facebook’s global policies and what employees said was the company’s acquiescence to governments so that it could continue profiting in those countries.

“There’s a feeling among people at Facebook that this is a systematic approach, one which favors strong government leaders over the principles of doing what is right and correct,” said Ashraf Zeitoon, Facebook’s former head of policy for the Middle East and North Africa region, who left in 2017.

Facebook is increasingly caught in a vise. In India, Russia and elsewhere, governments are pressuring it to remove content as they try to corral the platform’s power over online speech. But when Facebook has complied with the takedown orders, it has upset its own employees, who say the social network has helped authoritarian leaders and repressive regimes quash activists and silence marginalized communities.

BuzzFeed News and the Financial Times earlier reported on some of the employee dissatisfaction at Facebook over Israeli and Palestinian content.

A divide between Facebook’s employees and the global policy team, which is composed of roughly 1,000 employees, has existed for years, current and former workers said. The policy team reports to Sheryl Sandberg, the chief operating officer.

The policy team has navigated many tricky international situations over the years, including in Russia, Vietnam and Myanmar, where Facebook has had to consider whether it would be shut down if it did not work with governments. That has led to the employee dissent, which has begun spilling into public view.

That became evident with India. In April, as Covid-19 cases soared in the country, Mr. Modi’s government called for roughly 100 social media posts on Facebook, Instagram and Twitter to be pulled down. Many of the posts included critiques of the government from opposition politicians and calls for Mr. Modi’s resignation.

Facebook removed some of the posts and briefly blocked a hashtag, #ResignModi. The company later said the hashtag had been banned by mistake and was not part of a government request.

But internally, the damage was done. In online chat rooms dedicated to human rights issues and global policy, employees described how disappointed they were with Facebook’s actions. Some shared stories of family members in India who were worried they were being censored.

Last month, when violence broke out between Israelis and Palestinians, reports surfaced that Facebook had erased content from Palestinian activists. Facebook’s Instagram app also briefly banned the #AlAqsa hashtag, a reference to Al Aqsa Mosque, one of Islam’s holiest sites. Facebook later explained that it had confused the #AlAqsa hashtag with a Palestinian militant group called Al Aqsa Martyrs Brigade.

Employees bristled. “We are responding to people’s protests about censoring with more censoring?” one wrote in an internal message, which was reviewed by The Times.

At a companywide meeting, one employee asked Nick Clegg, who leads public affairs, to explain the company’s role in removing content tied to the Israeli-Palestinian conflict, according to attendees. The employee called the situation in Israel “fraught” and asked how Facebook was going “to get it right” with content moderation.

Mr. Clegg ran through a list of policy rules and plans going forward, and assured staff that moderation would be treated with fairness and responsibility, two people familiar with the meeting said. The discussion was cordial, one of the people said, and comments in the chat box beside Mr. Clegg’s response were largely positive.

But some employees were dissatisfied, the people said. As Mr. Clegg spoke, they broke off into private chats and workplace groups, known as Tribes, to discuss what to do.

Dozens of employees later formed a group to flag Palestinian content that they said had been suppressed and report it to internal content moderation teams, two employees said. The goal was to have the posts reinstated online, they said.

Members of Facebook’s policy team have tried calming the tensions. In an internal memo in mid-May, which was reviewed by The Times, two policy team members wrote to other employees that they hoped “that Facebook’s internal community will resist succumbing to the division and demonization of the other side that is so brutally playing itself out offline and online.”

One of them was Muslim, and the other was Jewish, they said.

“We don’t always agree,” they wrote. “However, we do some of our best work when we assume good intent and recognize that we are on the same side trying to serve our community in the best possible way.”

Clubhouse App Creates Space for Open Talk in Middle East

Clubhouse policy bans users from recording conversations without participants’ consent, but the company says it temporarily records audio for investigating reports of policy violations. It has not specified who can listen to such recordings, or when.

A Clubhouse spokeswoman declined to comment.

Yet something about the spontaneous, intimate nature of the conversations — open to everyone regardless of fame or follower count — keeps lassoing people in. Away from government propaganda, Clubhouse gives Qataris unfettered access to their Saudi neighbors after years of bitter feuding between their countries, and gives Egyptians access to defenders of the Muslim Brotherhood.

“People have been longing for this kind of communication for a long time, but I don’t think they realized it until they started using Clubhouse,” said Tharwat Abaza, 28, an Egyptian dentist who said he had listened to rooms discussing sexual harassment, feminism, the need for sex education in Arab countries and mental health. “At this point, it’s one of the freest platforms, and it’s giving us room for important discussions that we should be having without fear of witch hunting.”

There are, of course, many less charged Clubhouse rooms in the Middle East, discussing the cuteness of penguins, entrepreneurship, recipes, breakups and music. During the holy month of Ramadan, users in some countries are offering live recitations of the Quran and communal prayer rooms.

But if Clubhouse can function as group therapy, talk show, house party or seminar, it stands out for its political potential.

In Iran, despite predictions of low turnout ahead of its June 18 presidential election, election-focused Clubhouse rooms are among the most popular. Thousands participate daily at a time when in-person campaigning is limited by the pandemic.

Pinterest Is Said to Be in Talks to Acquire the Photo App VSCO

SAN FRANCISCO — Pinterest has held talks to buy VSCO, a photography app that spawned a teenage social media craze, according to two people with knowledge of the matter.

The discussions are ongoing, said the people, who declined to be identified because they were not authorized to speak publicly. A deal price couldn’t be learned; Pinterest has a market capitalization of about $49 billion, while VSCO has raised $90 million in funding and was last valued at $550 million. An acquisition may not materialize, the people cautioned.

Representatives from Pinterest and VSCO (pronounced “vis-coe”) declined to comment on deal talks.

Julie Inouye, a spokeswoman for VSCO, said the company was focused on expanding its business. “We’re always meeting with different companies across the creative space at any given time and do not discuss rumors or speculation,” she said.

Pinterest and VSCO, which stands for Visual Supply Company, are part of a group of tech companies that are highly focused on digital images and visual editing and that rely less on social networking features. Pinterest, a digital pin board site that went public in 2019, lets its users discover and save images to inspire creative projects or to plan important aspects of their lives, including home renovations, weddings and meals.

VSCO is an app for editing and sharing images and videos. In 2019, it became popular with a Generation Z group that came to be known as “VSCO girls,” who were known for wearing Crocs and carrying Hydro Flasks. The idea of VSCO girls went viral, inspiring social media imitation, mockery, memes and Halloween costumes.

For Pinterest, buying a once-buzzy start-up that was popular with younger audiences and that has expertise in photo- and video-editing technologies could bolster its core service, the people said.

Since Pinterest went public, its revenue has grown, though analysts have said they don’t expect Pinterest to become regularly profitable until 2022. It has also expanded internationally.

During the pandemic, the company experienced a surge of interest as people were locked down and turned to more digital activities. Pinterest added 100 million monthly active users last year and now has a total of 450 million monthly active users.

The San Francisco company also faced internal turmoil last year. In December, it agreed to pay $22.5 million to settle a gender discrimination and retaliation lawsuit from its former chief operating officer, one of the largest publicly announced individual settlements for gender discrimination. Two female employees of color who quit last year also publicly discussed their experiences with racist and sexist comments, pay inequities and retaliation at the company.

Founded in 2011, VSCO became known among younger users as a kind of anti-social network. The app does not have likes, comments or follower counts, so it appeared to put less pressure on users to build up a fan base. VSCO also eschews advertising, instead earning money by charging people for extra features. Of its 100 million registered users, more than two million are paying subscribers.

When VSCO girls became a cultural phenomenon in late 2019, investor interest in the start-up swelled. But the fad has since cooled off. When the pandemic hit, VSCO laid off 30 percent of its employees. In December, it acquired Trash, a mobile app for video editing, and said it planned to continue acquiring companies in 2021.
