In a post this month, Mr. Zuckerberg said it was “deeply illogical” that the company would give priority to harmful content, because Facebook’s advertisers don’t want to buy ads on a platform that spreads hate and misinformation.

“At the most basic level, I think most of us just don’t recognize the false picture of the company that is being painted,” he wrote.

When Mr. Zuckerberg founded Facebook 17 years ago in his Harvard University dorm room, the site’s mission was to connect people on college campuses and bring them into digital groups with common interests and locations.

Growth exploded in 2006 when Facebook introduced the News Feed, a central stream of photos, videos and status updates posted by people’s friends. Over time, the company added more features to keep people interested in spending time on the platform.

In 2009, Facebook introduced the Like button. The tiny thumbs-up symbol, a simple indicator of people’s preferences, became one of the social network’s most important features. The company allowed other websites to adopt the Like button so users could share their interests back to their Facebook profiles.

That gave Facebook insight into people’s activities and sentiments outside of its own site, so it could better target them with advertising. Likes also signified what users wanted to see more of in their News Feeds so people would spend more time on Facebook.

Facebook also added the groups feature, which let people join private communication channels to talk about specific interests, and pages, which let businesses and celebrities amass large fan bases and broadcast messages to those followers.

Adam Mosseri, the head of Instagram, has said that research on users’ well-being led to investments in anti-bullying measures on Instagram.

Yet Facebook cannot simply tweak itself so that it becomes a healthier social network when so many problems trace back to core features, said Jane Lytvynenko, a senior fellow at the Harvard Kennedy Shorenstein Center, who studies social networks and misinformation.

“When we talk about the Like button, the share button, the News Feed and their power, we’re essentially talking about the infrastructure that the network is built on top of,” she said. “The crux of the problem here is the infrastructure itself.”

As Facebook’s researchers dug into how its products worked, the worrisome results piled up.

In a July 2019 study of groups, researchers traced how members of those communities could be targeted with misinformation. The starting point, the researchers said, was people known as “invite whales,” who sent out invitations to others to join a private group.

These people were effective at getting thousands to join new groups so that the communities ballooned almost overnight, the study said. Then the invite whales could spam the groups with posts promoting ethnic violence or other harmful content, according to the study.

Another 2019 report looked at how some people accrued large followings on their Facebook pages, often using posts about cute animals and other innocuous topics. But once a page had grown to tens of thousands of followers, the founders sold it. The buyers then used the pages to show followers misinformation or politically divisive content, according to the study.

As researchers studied the Like button, executives considered hiding the feature on Facebook as well, according to the documents. In September 2019, the company removed Likes from users’ Facebook posts in a small experiment in Australia.

The company wanted to see if the change would reduce pressure and social comparison among users. That, in turn, might encourage people to post more frequently to the network.

But people did not share more posts after the Like button was removed. Facebook chose not to roll the test out more broadly, noting, “Like counts are extremely low on the long list of problems we need to solve.”

Last year, company researchers also evaluated the share button. In a September 2020 study, a researcher wrote that the button and so-called reshare aggregation units in the News Feed, which are automatically generated clusters of posts that have already been shared by people’s friends, were “designed to attract attention and encourage engagement.”

But left unchecked, the features could “serve to amplify bad content and sources,” such as bullying and borderline nudity posts, the researcher said.

That’s because the features made people less hesitant to share posts, videos and messages with one another. In fact, users were three times more likely to share any kind of content from the reshare aggregation units, the researcher said.

One post that spread widely this way was an undated message from an account called “The Angry Patriot.” The post notified users that people protesting police brutality were “targeting a police station” in Portland, Ore. After it was shared through reshare aggregation units, hundreds of hate-filled comments flooded in. It was an example of “hate bait,” the researcher said.
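The documents do not describe how the feature is built, but the behavior the researcher described can be pictured with a minimal sketch. The snippet below groups friends’ reshares of the same original post into a single feed unit and surfaces the most-reshared posts first; the function name, data shapes and ranking rule are illustrative assumptions, not Facebook’s actual system.

```python
from collections import defaultdict

def build_reshare_units(friend_reshares):
    """Group friends' reshares of the same original post into feed units.

    friend_reshares: list of (original_post_id, friend_name) pairs,
    one per reshare event observed in a user's network. All names and
    the ranking rule are invented for illustration.
    """
    units = defaultdict(list)
    for post_id, friend in friend_reshares:
        units[post_id].append(friend)
    # Rank clusters by reshare count: heavily reshared posts, good or bad,
    # rise to the top, which is how "hate bait" can get amplified.
    return sorted(units.items(), key=lambda kv: len(kv[1]), reverse=True)

reshares = [
    ("post_cute_dog", "alice"),
    ("post_hate_bait", "bob"),
    ("post_hate_bait", "carol"),
    ("post_hate_bait", "dave"),
]
for post_id, sharers in build_reshare_units(reshares):
    print(f"{post_id}: reshared by {len(sharers)} friends {sharers}")
```

Under this sketch, a post reshared by three friends outranks one reshared by a single friend regardless of what the post says, which is the attention-over-quality dynamic the study flagged.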

A common thread in the documents was how Facebook employees argued for changes in how the social network worked and often blamed executives for standing in the way.

In an August 2020 internal post, a Facebook researcher criticized the recommendation system that suggests pages and groups for people to follow and said it can “very quickly lead users down the path to conspiracy theories and groups.”

“Out of fears over potential public and policy stakeholder responses, we are knowingly exposing users to risks of integrity harms,” the researcher wrote. “During the time that we’ve hesitated, I’ve seen folks from my hometown go further and further down the rabbit hole” of conspiracy theory movements like QAnon and anti-vaccination and Covid-19 conspiracies.

The researcher added, “It has been painful to observe.”

Reporting was contributed by Davey Alba, Sheera Frenkel, Cecilia Kang and Ryan Mac.

What Happened When Facebook Employees Warned About Election Misinformation

WHAT HAPPENED

1. From Wednesday through Saturday there was a lot of content circulating which implied fraud in the election, at around 10% of all civic content and 1-2% of all US VPVs [viewport views]. There was also a fringe of incitement to violence.

2. There were dozens of employees monitoring this, and FB launched ~15 measures prior to the election, and another ~15 in the days afterwards. Most of the measures made existing processes more aggressive: e.g. by lowering thresholds, by making penalties more severe, or expanding eligibility for existing measures. Some measures were qualitative: reclassifying certain types of content as violating, which had not been before.

3. I would guess these measures reduced prevalence of violating content by at least 2X. However they had collateral damage (removing and demoting non-violating content), and the episode caused noticeable resentment by Republican Facebook users who feel they are being unfairly targeted.
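The memo’s “lowering thresholds” measure can be illustrated with a small sketch. Assuming a hypothetical classifier that scores each post from 0 to 1 for likelihood of violating policy, the snippet below shows how dropping the enforcement threshold catches more violating content while also demoting borderline posts, the “collateral damage” the author notes. The scores, threshold values and post texts are all invented for illustration.

```python
# Hypothetical posts with classifier scores (probability of violating policy).
# Scores and thresholds are invented; this only illustrates the mechanism.
posts = [
    ("claims ballots were shredded", 0.95),   # clearly violating
    ("questions vote-count timeline", 0.75),  # borderline
    ("recipe for apple pie", 0.10),           # benign
]

def demoted(posts, threshold):
    """Return posts whose score meets the enforcement threshold."""
    return [text for text, score in posts if score >= threshold]

print("normal threshold 0.9:", demoted(posts, 0.9))
# Lowering the threshold makes the existing process more aggressive:
# it now also demotes the borderline post (possible collateral damage).
print("break-glass threshold 0.7:", demoted(posts, 0.7))
```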

Instagram Struggles With Fears of Losing Its ‘Pipeline’: Young Users

Facebook knew that an ad intended for a 13-year-old was likely to capture younger children who wanted to mimic their older siblings and friends, one person said. Managers told employees that Facebook did everything it could to stop underage users from joining Instagram, but that it could not be helped if they signed up anyway.

In September 2018, Kevin Systrom and Mike Krieger, Instagram’s founders, left Facebook after clashing with Mr. Zuckerberg. Mr. Mosseri, a longtime Facebook executive, was appointed to helm Instagram.

With the leadership changes, Facebook went all out to turn Instagram into a main attraction for young audiences, four former employees said. That coincided with the realization that Facebook itself, which was grappling with data privacy and other scandals, would never be a teen destination, the people said.

Instagram began concentrating on the “teen time spent” data point, three former employees said. The goal was to drive up the amount of time that teenagers were on the app with features including Instagram Live, a broadcasting tool, and Instagram TV, where people upload videos that run as long as an hour.

Instagram also increased its global marketing budget. In 2018, it allocated $67.2 million to marketing. In 2019, that increased to a planned $127.3 million, then to $186.3 million last year and $390 million this year, according to the internal documents. Most of the budgets were designated for wooing teens, the documents show. Mr. Mosseri approved the budgets, two employees said.

The money was slated for marketing categories like “establishing Instagram as the favorite place for teens to express themselves” and cultural programs for events like the Super Bowl, according to the documents.

Many of the resulting ads were digital, featuring some of the platform’s top influencers, such as Donté Colley, a Canadian dancer and creator. The marketing, when put into action, also targeted parents of teenagers and people up to the age of 34.

Whistle-Blower Says Facebook ‘Chooses Profits Over Safety’

John Tye, the founder of Whistleblower Aid, a legal nonprofit that represents people seeking to expose potential lawbreaking, was contacted this spring through a mutual connection by a woman who claimed to have worked at Facebook.

The woman told Mr. Tye and his team something intriguing: She had access to tens of thousands of pages of internal documents from the world’s largest social network. In a series of calls, she asked for legal protection and a path to releasing the confidential information. Mr. Tye, who said he understood the gravity of what the woman brought “within a few minutes,” agreed to represent her and call her by the alias “Sean.”

She “is a very courageous person and is taking a personal risk to hold a trillion-dollar company accountable,” he said.

On Sunday, Frances Haugen revealed herself to be “Sean,” the whistle-blower against Facebook. A product manager who worked for nearly two years on the civic misinformation team at the social network before leaving in May, Ms. Haugen has used the documents she amassed to expose how much Facebook knew about the harms that it was causing and provided the evidence to lawmakers, regulators and the news media.

The revelations, including that Facebook knew Instagram was worsening body image issues among teenagers and that it had a two-tier justice system, have spurred criticism from lawmakers, regulators and the public.

Ms. Haugen has also filed a whistle-blower complaint with the Securities and Exchange Commission, accusing Facebook of misleading investors with public statements that did not match its internal actions. And she has talked with lawmakers such as Senator Richard Blumenthal, a Democrat of Connecticut, and Senator Marsha Blackburn, a Republican of Tennessee, and shared subsets of the documents with them.

The spotlight on Ms. Haugen is set to grow brighter. On Tuesday, she is scheduled to testify in Congress about Facebook’s impact on young users.

Ms. Haugen’s disclosures followed years of leaks by Facebook employees alarmed over the company’s handling of misinformation and hate speech.

In 2018, Christopher Wylie, a disgruntled former employee of the consulting firm Cambridge Analytica, set the stage for those leaks. Mr. Wylie spoke with The New York Times, The Observer of London and The Guardian to reveal that Cambridge Analytica had improperly harvested Facebook data to build voter profiles without users’ consent.

In the aftermath, more of Facebook’s own employees started speaking up. Later that same year, Facebook workers provided executive memos and planning documents to news outlets including The Times and BuzzFeed News. In mid-2020, employees who disagreed with Facebook’s decision to leave up a controversial post from President Donald J. Trump staged a virtual walkout and sent more internal information to news outlets.

“I think over the last year, there’ve been more leaks than I think all of us would have wanted,” Mark Zuckerberg, Facebook’s chief executive, said in a meeting with employees in June 2020.

Facebook tried to preemptively push back against Ms. Haugen. On Friday, Nick Clegg, Facebook’s vice president for policy and global affairs, sent employees a 1,500-word memo laying out what the whistle-blower was likely to say on “60 Minutes” and calling the accusations “misleading.” On Sunday, Mr. Clegg appeared on CNN to defend the company, saying the platform reflected “the good, the bad and ugly of humanity” and that it was trying to “mitigate the bad, reduce it and amplify the good.”

Ms. Haugen also introduced herself on Sunday through a personal website. On the website, she was described as “an advocate for public oversight of social media.”

A native of Iowa City, Iowa, Ms. Haugen studied electrical and computer engineering at Olin College and got an M.B.A. from Harvard, the website said. She then worked on algorithms at Google, Pinterest and Yelp. In June 2019, she joined Facebook. There, she handled democracy and misinformation issues, as well as working on counterespionage, according to the website.

Some lawmakers and regulators have pushed to break up the social network, and the Federal Trade Commission has filed an antitrust suit against Facebook. But in a video posted by Whistleblower Aid on Sunday, Ms. Haugen said she did not believe breaking up Facebook would solve the problems inherent at the company.

“The path forward is about transparency and governance,” she said in the video. “It’s not about breaking up Facebook.”

Ms. Haugen has also spoken to lawmakers in France and Britain, as well as a member of the European Parliament. This month, she is scheduled to appear before a British parliamentary committee. That will be followed in November by a stop at Web Summit, a technology conference in Lisbon, and a visit to Brussels to meet with European policymakers, Mr. Tye said.

On Sunday, a GoFundMe page that Whistleblower Aid created for Ms. Haugen also went live. Noting that Facebook had “limitless resources and an army of lawyers,” the group set a goal of raising $10,000. Within 30 minutes, 18 donors had given $1,195. Shortly afterward, the fund-raising goal was increased to $50,000.

Inside Facebook’s Push to Defend Its Image

The changes have involved Facebook executives from its marketing, communications, policy and integrity teams. Alex Schultz, a 14-year company veteran who was named chief marketing officer last year, has also been influential in the image reshaping effort, said five people who worked with him. But at least one of the decisions was driven by Mr. Zuckerberg, and all were approved by him, three of the people said.

Joe Osborne, a Facebook spokesman, denied that the company had changed its approach.

“People deserve to know the steps we’re taking to address the different issues facing our company — and we’re going to share those steps widely,” he said in a statement.

For years, Facebook executives have chafed at how their company appeared to receive more scrutiny than Google and Twitter, current and former employees said. They attributed that attention to Facebook’s habit of leaving itself more exposed, through public apologies and by providing access to internal data, the people said.

So in January, executives held a virtual meeting and broached the idea of a more aggressive defense, one attendee said. The group discussed using the News Feed to promote positive news about the company, as well as running ads that linked to favorable articles about Facebook. They also debated how to define a pro-Facebook story, two participants said.

That same month, the communications team discussed ways for executives to be less conciliatory when responding to crises and decided there would be less apologizing, said two people with knowledge of the plan.

Mr. Zuckerberg, who had become intertwined with policy issues including the 2020 election, also wanted to recast himself as an innovator, the people said. In January, the communications team circulated a document with a strategy for distancing Mr. Zuckerberg from scandals, partly by focusing his Facebook posts and media appearances on new products, they said.

The Information, a tech news site, previously reported on the document.

The impact was immediate. On Jan. 11, Sheryl Sandberg, Facebook’s chief operating officer — and not Mr. Zuckerberg — told Reuters that the storming of the U.S. Capitol a week earlier had little to do with Facebook. In July, when President Biden said the social network was “killing people” by spreading Covid-19 misinformation, Guy Rosen, Facebook’s vice president for integrity, disputed the characterization in a blog post and pointed out that the White House had missed its coronavirus vaccination goals.
